Impedance changes with frequency, enclosure, voice coil temperature, and so on. But the goal of the DMM method isn't to set the gain based on the impedance of your subwoofer. What you're actually doing with the DMM method is setting the gain based on the output voltage of the amplifier at a known power output into a known impedance, namely the impedance the amplifier's output power was rated at. If the amplifier is rated at 100W @ 4 ohms and you're setting the gain, in theory it doesn't matter whether your subwoofer is 4 ohms, 3 ohms, 2 ohms, 1.3 ohms, or 3.6 ohms. What you need to know is what output voltage the amplifier should be producing, and then you adjust the gain control until the amplifier reaches that voltage.

Ideally, the output voltage of the amplifier would be independent of impedance. If the amplifier is rated at 100W @ 4 ohms, that corresponds to 20V RMS at its outputs (V = sqrt(P x R) = sqrt(100 x 4) = 20V). In a perfect world, that 20V would stay the same no matter what impedance load was connected on the other side. The actual power output would still depend on the load, since the load determines the current draw and ultimately the power delivered, but the output voltage itself wouldn't change. So, in a perfect world, all we need to know is where the output voltage of the amplifier should be, and we determine that from the rated power (a specific power output into a specific impedance load), not from the impedance of the load that's actually being connected.

In practice it does matter, because design features and losses within an amplifier affect its ability to hold that voltage at higher power/higher current output. In many amplifiers the output voltage sags as impedance decreases, which reduces power output (usually due to current limits, power supply constraints, and the like). But, again, we can use the rated power to approximate where the output voltage should be for the approximate nominal impedance of the load we're connecting. It won't be perfect, but it doesn't need to be; small differences won't be noticeable in either direction.

Another issue is how rated power compares to actual power output. The amp might be capable of more or less power than it's rated for into the specific impedance it was rated at. In all but the most extreme circumstances the two will differ, but not by an excessively large amount. That means the voltage target could end up a little high or a little low, but again, small differences won't matter.

The larger issue for me is the level of the test tone. If you use a 0dB test tone (which most people use/recommend), 90% of people will end up unhappy with the result and think they need to buy a more powerful amplifier. Since music hardly spends any time at 0dB, you'll get a very low average power output from the amplifier. With a -6dB or -10dB test tone (depending on circumstances), the DMM method is good enough to get the gain set to a level that's relatively safe and leaves most people happy. The problem is, you could probably do just as well setting it by ear, so why bother? There are other issues as well that, taken together, make setting the gain by ear the easiest and best solution a majority of the time.
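If you want to run the numbers yourself, here's a minimal sketch of the arithmetic behind the method. The function name and structure are mine for illustration, not from any particular tool, and it assumes you're measuring an unclipped sine tone as AC volts on the meter:

```python
import math

def target_voltage(rated_power_w: float, rated_impedance_ohm: float) -> float:
    """Target AC (RMS) voltage at the amp's speaker terminals: V = sqrt(P * R).
    Based on the power/impedance the amp was *rated* at, not the actual
    impedance of the subwoofer being connected."""
    return math.sqrt(rated_power_w * rated_impedance_ohm)

# The example from the post: an amp rated 100W @ 4 ohms.
print(f"Target: {target_voltage(100, 4):.1f} V RMS")  # 20.0 V

# Using a reduced test tone while still adjusting to the same target
# voltage builds in "gain overlap": the amp reaches rated output on
# material below full scale, which is where music actually sits.
# The overlap equals the magnitude of the tone level.
for tone_db in (0, -6, -10):
    print(f"{tone_db:>4} dB tone -> {-tone_db} dB of gain overlap")
```

A -6dB tone is roughly half the voltage of a 0dB tone (10^(-6/20) ≈ 0.5), so hitting the same 20V target with it means a true 0dB signal would drive the amp 6dB past rated output and clip; the bet is that music almost never gets there, which is exactly the tradeoff described above.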