I have been reading around on some build logs and people are saying that they have an amplifier that is "clamped". What does this mean? I have been googling around with no luck (so far).


When someone "clamps" an amp, I think they mean they used a clamp meter, which measures AC current (since the amp outputs AC current to the sub), and used that current reading to figure out the total output of the amp, i.e. the actual power the amp is putting out.

Does that make sense? Here, take a look at this video and notice the clamp he is using on the positive output wire going to the sub. (The videos are downloadable so you can see what is going on.)

Uniform Amp Clamp Test - SSA Car Audio Forum
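
For a rough idea of the arithmetic involved, here's a minimal sketch (the clamp reading and the 2 ohm load are hypothetical, just to show the calculation):

```python
# Rough clamp-reading math (hypothetical numbers, not from any particular amp).
current_rms = 20.0        # amps RMS read off the AC clamp on the speaker wire
nominal_impedance = 2.0   # ohms, the sub's nominal rating (real impedance varies)

power_watts = current_rms ** 2 * nominal_impedance   # P = I^2 * R
print(f"Estimated output: {power_watts:.0f} W")      # ~800 W with these numbers
```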


They're not clamping the AC; they're clamping the DC amperage that the amp is consuming on the power wires. They make special DC clamps, though most people are familiar with the AC ones. The DC amperage along with the voltage can tell you the output of the amp. Other things have to be considered too.
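
If you did clamp the DC side, the math would look something like this (made-up readings, and the efficiency figure is purely an assumption; actual efficiency depends on the amp, the load, and the drive level):

```python
# DC-side estimate: what the amp draws from the electrical system (hypothetical numbers).
supply_voltage = 14.4    # volts measured at the amp's power terminals
dc_current = 100.0       # amps read off a DC clamp around the power wire

input_power = supply_voltage * dc_current        # ~1440 W drawn from the charging system
assumed_efficiency = 0.80                        # assumed value, varies by amp/load/drive level
estimated_output = input_power * assumed_efficiency
print(f"Draw: {input_power:.0f} W, rough output estimate: {estimated_output:.0f} W")
```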


Clamping the DC side of the amp would tell you how much power the amplifier is drawing from the electrical system, not outputting to the speaker. When someone "clamps" an amp they are measuring the output side of the amp, which is AC. They normally keep a DMM on the power input side to monitor what the input voltage is dropping to, but that's not how they are deriving their power output figures.

Anyway, many people fall under the delusion that they are performing some meaningful act by "clamping" their amplifiers. As Crazy said, they connect their amplifier to a load (normally just a subwoofer in most cases) and play a test tone. They then use a DMM to measure the voltage output from the amplifier and a clamp meter to measure the current on the output side of the amplifier while playing the test tone. They then use basic Ohm's law to calculate power (Voltage * Amps = Power) and the impedance of the load (Voltage / Amps = Resistance). As I previously mentioned, most people will also use a DMM on the power input side of the amplifier to monitor the voltage drop the electrical supply is experiencing. The problem is, 99% of the time people are simply wasting their time. It's a mostly meaningless endeavor that has been perpetually (and incorrectly) promoted on the internet as having actual relevance.
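
Put into a few lines, the procedure being described works out like this (illustrative readings only):

```python
# Clamp-test arithmetic as described above (hypothetical readings).
output_voltage_rms = 40.0   # volts, DMM across the speaker terminals during the test tone
output_current_rms = 20.0   # amps, AC clamp around one speaker lead

power = output_voltage_rms * output_current_rms        # V * A = 800 W
impedance = output_voltage_rms / output_current_rms    # V / A = 2.0 ohms
print(f"Calculated: {power:.0f} W into {impedance:.1f} ohms")
```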

There are several problems with this method. Setting aside the questionable accuracy of the measurements themselves (accuracy of the devices, the type of measurement being conducted, the varying impedance of the load, the varying stability of the supply, etc.), many people try to compare these "clamp test" results to the manufacturer's rated power. The problem is, the manufacturer's rated power is specified at a certain distortion level. Nobody performing these clamp tests is measuring distortion. One could very easily make any amplifier appear highly underrated in one of these "clamp" tests, since distortion is totally disregarded.
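
To see how much the distortion issue matters, here's a quick sketch (assuming 40 V peak into a fixed 2 ohm resistive load) comparing a clean sine to the same tone driven into hard clipping. The clipped signal is grossly distorted, yet the clamp-test math reports roughly 50% more power:

```python
import numpy as np

# Clean sine vs. hard-clipped sine at the same peak voltage (hypothetical 40 V / 2 ohm case).
t = np.linspace(0, 1, 100_000, endpoint=False)
peak, load = 40.0, 2.0

clean = peak * np.sin(2 * np.pi * 50 * t)       # undistorted test tone
clipped = np.clip(1.8 * clean, -peak, peak)     # driven well into clipping, same peak voltage

for name, signal in (("clean", clean), ("clipped", clipped)):
    v_rms = np.sqrt(np.mean(signal ** 2))
    print(f"{name}: {v_rms:.1f} V RMS -> {v_rms ** 2 / load:.0f} W")
# Roughly 400 W clean vs. roughly 600 W clipped -- the extra "power" is mostly distortion.
```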


Wow, that actually makes a lot of sense.


Sweet. Thanks


Question: you say distortion level... what do you mean by this?

I know distortion means change.

So how do you measure distortion, and how do you get a specific distortion level?

Are you referring to the input side? As in, hook the amp up to a battery and hook the battery up to a charger at the same time, so the input voltage stays nominal?

And/or

the output side... like as the sub heats up, the resistance (impedance) will change, and depending on the head unit / tone generator used, this will affect the amp's performance?

I assume the answer is obvious to you, but I'm not 100% sure I follow what you're saying.

Thanks in advance.


