
How to determine amplifier output


I was told that for every 1 V drop you lose around 10% of power

Depends entirely on the amp. Some won't change power output at all with a drop in input voltage; they will simply draw more current at the lower voltage to make the same power.

As far as CEA specs go, IMO they're worthless for bandwidth-limited amps, since CEA specs use 1 kHz as the rating frequency. Most Class D sub amps won't play above 250-300 Hz, so a 1 kHz spec isn't really going to tell you anything there, now is it?
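To put rough numbers on the voltage question: for an amp with an unregulated supply, rail voltage tracks input voltage, so maximum power scales roughly with voltage squared; a regulated supply holds power constant and draws more current instead. A minimal sketch of both cases (the 500 W rating and 80% efficiency are illustrative assumptions, not figures from any spec):

```python
def unregulated_power(rated_watts: float, rated_volts: float, actual_volts: float) -> float:
    """Approximate max power of an unregulated amp at a given supply voltage.

    Max output swing tracks the supply rails, and power goes as voltage squared.
    """
    return rated_watts * (actual_volts / rated_volts) ** 2

def regulated_current(power_watts: float, efficiency: float, supply_volts: float) -> float:
    """Supply current a regulated amp must draw to hold its rated power."""
    return power_watts / (efficiency * supply_volts)

if __name__ == "__main__":
    # A 1 V drop from 14.4 V costs an unregulated amp roughly 13%:
    p = unregulated_power(500.0, 14.4, 13.4)
    print(f"500 W amp at 13.4 V: ~{p:.0f} W ({100 * (1 - p / 500):.0f}% loss)")
    # A regulated amp making 500 W at 80% efficiency just draws more current:
    print(f"Draw at 14.4 V: {regulated_current(500, 0.8, 14.4):.1f} A")
    print(f"Draw at 13.4 V: {regulated_current(500, 0.8, 13.4):.1f} A")
```

So the "10% per volt" rule of thumb is in the right ballpark for unregulated supplies and simply wrong for regulated ones.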


I never knew CEA used such a high frequency... I thought the standard was put in place to prevent exactly that kind of unrealistic rating practice.


The real problem with standards is that they aren't simple to measure or to agree upon. I don't really feel the CEA regulations did much of anything other than provide a label that manufacturers can point to to say they're compliant. Either way, specs have been and will continue to be very loose in their definitions.


And that's sad. That's also the reason I want to test the amps. As soon as I can get the equipment together I'm going to try it out.

To me it should be easy to set standards for Class D sub amps. They are generally used for 100 Hz and below, so the rating frequency should drop from 1 kHz to 100 Hz. The test voltage should be something realistic, like 14.4 V. Really, all they should do is ask, "What is this going to be used for?" (subs) and "What is a realistic standard for that use?" and then go off that. By establishing a standard that isn't much, if any, better than those used by individual amp manufacturers anyway, it's just another way to mislead people.


Voltage is not the only thing you have to measure if you want to make a standard.


Frequency and voltage are only a tiny part of the whole equation, though. You have to set an allowable limit for distortion, you have to define the test load, and you really need to give the frequency range over which the power rating is valid, with a tolerance. CEA is a step in the right direction, but it's still a far cry from the answer.
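For anyone who does want to test an amp themselves, the core measurement behind any power rating is simple: drive a sine at the test frequency into a known resistive load, raise the level until you hit your chosen distortion limit, and compute power from the measured RMS voltage. A minimal sketch (the 4-ohm load and the voltage figure here are illustrative assumptions):

```python
import math

def output_power(vrms: float, load_ohms: float) -> float:
    """Continuous power into a resistive test load from measured RMS voltage."""
    return vrms ** 2 / load_ohms

def vrms_of_sine(vpeak: float) -> float:
    """RMS of an unclipped sine wave, given its peak voltage."""
    return vpeak / math.sqrt(2)

if __name__ == "__main__":
    # Example: 100 Hz sine into a 4-ohm dummy load, 63.2 V peak just below
    # the distortion limit -> ~44.7 Vrms -> roughly 500 W continuous.
    vrms = vrms_of_sine(63.2)
    print(f"{output_power(vrms, 4.0):.0f} W into 4 ohms")
```

Which is exactly why the standard has to pin down the load, the frequency, the supply voltage, and the distortion limit: change any one of them and the same amp produces a different number.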

Power output is not the only thing that needs a standard with amps, IMO. A standard way to measure S/N that actually means something is important too. For example, the best S/N I remember Ken Pohlman ever getting on an amp he tested for CSR back in the day was the 80 dB he got with the first-gen Soundstream REF500. The current tester for CA&E (Cogent Labs) routinely gets results well in excess of 100 dB. Are the amps that much better? No. The answer lies in the basis of the test.

Testing for CSR was referenced to 1 W with the gain at maximum. This is the worst-case scenario for an amp, but it is a VERY telling test: if noise is below audibility here, it means you can actually use the full range of the gain control without worrying about line hiss and noise coming from the preamp. CA&E references S/N to full 4-ohm power with minimum gain. Let's make it a joke of a test, why don't we?!? That number tells you that the amp is functional, nothing more, and the more powerful the amp is, the better chance it has to put out a really high number here, since the noise voltage is constant and the signal voltage is higher than with a lower-powered amp. Using minimum gain also doesn't tell you anything about the quality of the preamp stage, making the test doubly useless.
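The gap between those two kinds of numbers is mostly arithmetic: referencing S/N to full power instead of 1 W adds 10·log10(P_ref / 1 W) of "free" dB, assuming the absolute noise floor doesn't change. A quick sketch (the 500 W reference power is an illustrative assumption):

```python
import math

def snr_reference_shift_db(ref_watts: float, base_watts: float = 1.0) -> float:
    """dB added to an S/N spec just by referencing it to a higher power,
    assuming the absolute noise floor stays the same."""
    return 10.0 * math.log10(ref_watts / base_watts)

if __name__ == "__main__":
    # An amp measuring 80 dB S/N referenced to 1 W would spec roughly
    # 80 + 27 = 107 dB referenced to 500 W -- same noise, bigger number.
    print(f"{snr_reference_shift_db(500.0):.1f} dB of 'free' S/N")
```

That's the whole trick: the more powerful the amp, the bigger the inflation, without the amp being one bit quieter.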

