You aren't understanding the purpose behind setting the gain. When you set the gain, you are setting the input sensitivity of the amp (ideally) so that it operates at maximum rail voltage without exceeding that voltage. You aren't setting the gain based on the impedance of the load; that's simply one means of figuring out where to set the input sensitivity control.

Almost every amplifier has a fixed amount of gain. If you drive the amplifier with 1V of input and it outputs 20V, then the amplifier has a 20:1 gain ratio. If the amplifier has a maximum operating rail voltage of 40V (400w @ 4ohm), then it needs 2V of input signal to achieve that level of output. But almost every amplifier is also designed to be capable of operating at its full power output with a wide range of input signals... hence, the gain knob. Setting this knob to a certain position manipulates the level of the input signal so that the amplifier will operate at its maximum rail voltage without trying to exceed that voltage level. So if you have a 4V headunit outputting the full 4V and you set the gain control to the "4V" level, the amplifier is essentially attenuating that 4V input down to 2V, so that after the 20:1 gain the output voltage will be 40V, right where it needs to be.

But wait... if the knob doesn't have a "4V" mark on its dial, how do I know where the 4V setting is? Well, that's where the chart comes in. Since we normally aren't given the rail voltage of the amplifier, we're forced to calculate it back out of the information we are given. If we know the amplifier is designed to output 400w @ 4ohm, then we know it's going to output 40V (V = sqrt(400 x 4)). So the target voltage to achieve for a "proper" setting of the input sensitivity is 40V. We put on a test tone and turn the gain knob until we measure 40V at the output. Make sense?

Now, I'm not saying this is the ideal method of setting the gain.
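If it helps, the arithmetic above can be sketched in a few lines of Python. This is just my own illustration of the math in the post (the function names are mine, and the 400w/4ohm/20:1 figures are the example numbers from above):

```python
import math

def target_voltage(rated_power_w, impedance_ohm):
    """Back out the amp's target output (rail) voltage from its rating: V = sqrt(P * R)."""
    return math.sqrt(rated_power_w * impedance_ohm)

def required_input_voltage(target_v, gain_ratio):
    """Input signal level needed to drive the amp to its rail voltage at a fixed gain ratio."""
    return target_v / gain_ratio

# The 400w @ 4ohm amplifier from the example:
v_target = target_voltage(400, 4)             # sqrt(1600) = 40.0 V
v_in = required_input_voltage(v_target, 20)   # 40 / 20 = 2.0 V with the 20:1 gain ratio
print(v_target, v_in)
```

So the gain knob's whole job is bridging the gap between that required 2V and whatever your headunit actually puts out.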
For starters, it assumes the amplifier is capable of cleanly outputting exactly its rated power. On an underrated or overrated amplifier, the gain setting may not be accurate. There is also an issue with the actual level of the input signal: using a 0dB test tone will keep the amplifier from clipping (since the HU output will be at its maximum), but music is almost never at that level, so we end up reducing the actual average power we'll receive from the amplifier. Etc, etc... not worth rehashing right now, but you get the point.

The gain setting is really independent of the actual load or impedance rise. If the amplifier is capable of 40V, that's not going to change with impedance (for amplifiers with non-regulated outputs)*. Hence why amplifiers are normally rated at, for example, 400w @ 4ohm, 800w @ 2ohm, 1600w @ 1ohm. If you run the numbers or look at the chart, you'll see that the voltage for each is the same: 40V. If your load rises to 8ohm, you can't try to set it for 400w @ 8ohm, because you'd exceed the rail voltage of the amplifier. You would set the gain for 200w @ 8ohm, maintaining that 40V rail voltage.

*EDIT: I should mention that it's possible, and not atypical, for an amplifier's output not to hold true to the "double each time impedance is halved" rule, for several reasons. A weak power supply, internal losses, the current capabilities of the outputs, etc. will limit the ability of the amplifier to double its output as impedance decreases. As a result, the voltage at lower impedances may decrease. Sundown 1200D as an example: sqrt(360*4) = 37.95V, sqrt(720*2) = 37.95V, sqrt(1200*1) = 34.64V.