Clamping the DC side of the amp would tell you how much power the amplifier is drawing from the electrical system, not how much it is delivering to the speaker. When someone "clamps" an amp, they are measuring the output side of the amp, which is AC. They normally keep a DMM on the power input side as well, but only to monitor how far the input voltage is dropping; that is not where the power output figures come from.

Anyway, many people fall under the delusion that they are performing something meaningful by "clamping" their amplifiers. As Crazy said, they connect the amplifier to a load (normally just a subwoofer in most cases) and play a test tone. They use a DMM to measure the voltage output from the amplifier and a clamp meter to measure the current on the output side while the tone is playing, then apply Ohm's law and the basic power formula: voltage × current = power, and voltage ÷ current = load impedance. As I previously mentioned, most people also keep a DMM on the power input side of the amplifier to watch the voltage drop the electrical supply is experiencing.

The problem is, 99% of the time people are simply wasting their time. It's a mostly meaningless endeavor that has been perpetually (and incorrectly) promoted on the internet as having actual relevance. Setting aside questions about the accuracy of the measurements themselves (the accuracy of the meters, the type of measurement being taken, the varying impedance of the load, the varying stability of the supply, etc.), many people try to compare these "clamp test" results to the manufacturer's rated power. The problem is, the manufacturer's rated power is specified at a certain distortion level, and nobody performing these clamp tests is measuring distortion. With distortion completely disregarded, you could make almost any amplifier appear highly underrated in one of these "clamp" tests.
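To make the arithmetic concrete, here is a minimal sketch of what a clamp test boils down to. The voltage and current values are made-up numbers purely for illustration, not real measurements, and the script has the same blind spot the test does: nothing in it accounts for distortion.

```python
# Rough sketch of the "clamp test" arithmetic. The measured values below
# are hypothetical, just to show how the power and impedance figures
# people quote are derived.

v_out = 28.3   # RMS voltage across the speaker terminals, read off a DMM (volts)
i_out = 14.9   # RMS current in the speaker lead, read off a clamp meter (amps)

power = v_out * i_out        # volts * amps = watts
impedance = v_out / i_out    # volts / amps = ohms (load impedance during the tone)

print(f"Calculated power:     {power:.0f} W")
print(f"Calculated impedance: {impedance:.2f} ohms")

# What's missing: distortion. An amplifier driven into clipping still puts
# out voltage and current, so this math happily reports a big wattage number
# that cannot be compared to a manufacturer rating taken at, say, 1% THD.
```

The numbers it spits out look impressive, which is exactly why the method keeps getting passed around, but without a distortion measurement they don't tell you what the amp is cleanly capable of.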