steveo50

CEA-2006 compliant


Soooo, I've been on Sonic looking at CEA amps. Kicker and most Fosgate amps are CEA compliant; most JL amps are not, and I've always heard JL puts out rated power and has the best SQ. Am I missing something? Is Sonic wrong, or am I not informed correctly?


CEA-2006 compliance really doesn't mean anything...


OK, I've heard that a few times. I was assuming it means the amp does rated power, or close to it.


On May 28, 2003, the Consumer Electronics Association published standard CEA-2006, "Testing & Measurement Methods for Mobile Audio Amplifiers." This "voluntary" standard advocates a uniform method for determining an amplifier's RMS power and signal-to-noise ratio. Using 14.4 volts, RMS watts are measured into a 4-ohm impedance load at 1 percent Total Harmonic Distortion (THD) plus noise, at a frequency range (for general purpose amplifiers) of 20 Hz to 20,000 Hz. Signal-to-Noise ratio is measured in weighted absolute decibels (dBA) at a reference of 1 watt into 4 ohms. This applies to both external amplifiers and the amplifiers within in-dash receivers.

So... do you expect to see 14.4 V at all times when running an amplifier? Also, 1% THD is inaudible, especially at sub-bass frequencies. I don't recall how far it has to go up before you can hear it, but it's a lot.
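If it helps, here's a minimal sketch (Python, purely illustrative and not the official test procedure) of the power math behind a rating taken under the conditions quoted above: continuous RMS power into the 4-ohm load at the point where THD+N reaches 1%, which is just P = V^2 / R for the measured output voltage. The 44.7 V figure is a made-up example.

    LOAD_OHMS = 4.0  # CEA-2006 reference load

    def rated_power_watts(v_out_rms):
        """Continuous RMS power into the load for a measured output voltage: P = V^2 / R."""
        return v_out_rms ** 2 / LOAD_OHMS

    # e.g. an amp that swings about 44.7 V RMS into 4 ohms before reaching 1% THD+N
    # would carry a rating of roughly 500 W under these conditions.
    print(round(rated_power_watts(44.7)))  # -> 500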


Hmm, OK, the THD was my next question. My car runs at about 13 to 14 V, never lower. But you're saying the voltage drop will be inaudible, even if it starts at, say, 1200 W and then drops to, I don't know, 900 W? I'm just asking; I don't know if the watts drop that much or whatever.

Edit: not saying I get 1200 W, I know I don't. Just as a reference, I figure my sub is getting about 500 to 650-ish, maybe 700 W.
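For reference, here's a quick sketch (Python; the wattages are just the example figures from this post) of the level change for a given power drop, using dB = 10 * log10(P2/P1):

    import math

    def level_change_db(p_new, p_old):
        """Change in output level, in dB, for a change in power."""
        return 10 * math.log10(p_new / p_old)

    print(level_change_db(900, 1200))  # about -1.25 dB
    print(level_change_db(500, 650))   # about -1.1 dB

So even a 1200 W to 900 W sag is only about a 1.25 dB drop, right around the smallest level change most listeners can reliably pick out.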


On May 28, 2003, the Consumer Electronics Association published standard CEA-2006, "Testing & Measurement Methods for Mobile Audio Amplifiers." This "voluntary" standard advocates a uniform method for determining an amplifier's RMS power and signal-to-noise ratio. Using 14.4 volts, RMS watts are measured into a 4-ohm impedance load at 1 percent Total Harmonic Distortion (THD) plus noise, at a frequency range (for general purpose amplifiers) of 20 Hz to 20,000 Hz. Signal-to-Noise ratio is measured in weighted absolute decibels (dBA) at a reference of 1 watt into 4 ohms. This applies to both external amplifiers and the amplifiers within in-dash receivers.

So... do you expect to see 14.4 V at all times when running an amplifier? Also, 1% THD is inaudible, especially at sub-bass frequencies. I don't recall how far it has to go up before you can hear it, but it's a lot.

There are a plethora of other reasons why it's a joke. Any reference to a test setup without reference to the uncertainty in the testing methods is not valid. I also don't see THD spelled out very well, nor input excitation, noise measurements, and so on. The test had reasonable intentions, but the reality is that it is still fairly easy to cheat on.


In order to gain a noticeable increase in output you need to gain at least 3 dB. 3 dB is the smallest change in output a human can notice.

We can detect changes in amplitude much smaller than 3 dB. Under the most commonly cited test conditions we can detect a change in amplitude of 1 dB.

That said, I'm not disagreeing with your premise: small changes in amplifier power are not worth worrying about.
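For a sense of what those dB numbers mean in amplifier terms, here's a tiny sketch (Python) of the power ratio behind a given level change, assuming nothing else in the system changes (ratio = 10 ** (dB / 10)):

    # Power ratio needed for a given change in output level.
    for db in (1, 3, 10):
        print(db, "dB ->", round(10 ** (db / 10), 2), "x power")
    # 1 dB -> 1.26 x power, 3 dB -> 2.0 x power, 10 dB -> 10.0 x power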

On May 28, 2003, the Consumer Electronics Association published standard CEA-2006, "Testing & Measurement Methods for Mobile Audio Amplifiers." This "voluntary" standard advocates a uniform method for determining an amplifier's RMS power and signal-to-noise ratio. Using 14.4 volts, RMS watts are measured into a 4-ohm impedance load at 1 percent Total Harmonic Distortion (THD) plus noise, at a frequency range (for general purpose amplifiers) of 20 Hz to 20,000 Hz. Signal-to-Noise ratio is measured in weighted absolute decibels (dBA) at a reference of 1 watt into 4 ohms. This applies to both external amplifiers and the amplifiers within in-dash receivers.

So... do you expect to see 14.4 V at all times when running an amplifier? Also, 1% THD is inaudible, especially at sub-bass frequencies. I don't recall how far it has to go up before you can hear it, but it's a lot.

There are a plethora of other reasons why it's a joke. Any reference to a test setup without reference to the uncertainty in the testing methods is not valid. I also don't see THD spelled out very well, nor input excitation, noise measurements, and so on. The test had reasonable intentions, but the reality is that it is still fairly easy to cheat on.

I believe the test methodologies are spelled out in more exacting detail in the actual publication, but you have to purchase said publication for $58 to obtain that information. What's posted there is just a summary of the publication.

CEA-2006


So... do you expect to see 14.4 V at all times when running an amplifier? Also, 1% THD is inaudible, especially at sub-bass frequencies. I don't recall how far it has to go up before you can hear it, but it's a lot.

CEA compliance is marketing hype.

I wouldn't call it marketing hype. And I think most people miss the original intent of the standard (see Julian's comments above).

It was meant to provide a standard by which amplifier measurements were to be conducted and the results/performance reported and advertised, in order to provide an equal basis of comparison. It wasn't meant to "prove" what the delivered power in your vehicle would be (i.e., with less than 14.4 V), it wasn't meant to indicate what level of distortion was audible, etc. But when you had some amps rated at 12.5 V @ 1% THD and other amps rated at 17 V @ 5% THD, or listing "max" power figures only, it was more difficult and confusing for customers to make an informed decision, because all they would look at was the wattage figure and not how that wattage was determined. The CEA-2006 standard provided a standard basis of comparison by identifying a specific set of test conditions by which compliant amplifiers are measured and rated. The only reason 14.4 V was chosen was because that was the only way head unit manufacturers would agree to use the standard.

Is it perfect? No. But it might be better than no testing standard at all.

Is it going to be precisely indicative of the performance in your car? Well no, but non-CEA-2006 rating methods don't necessarily provide that information either.

Is it possible to cheat? Sure, but who says the non-CEA-2006 guys aren't lying and/or cheating too?

Is it "marketing hype"? Not really, it provides and equal basis for comparing two amplifiers which is something the industry didn't have previously. Logically speaking it has nothing to do with marketing as it's simply a test methodology that provides an equal basis for comparison. At some point we could call all amplifier ratings "marketing hype" in some manor.

That said, I do understand why some manufacturers chose not to participate. But I think most consumers don't actually understand the intent of the standard. It was never about marketing or the performance itself; it was about providing a level playing field by standardizing testing methods. Like I said, it's not perfect (really, what standards are?) and I do understand the issues with it, but at least understand what its intentions were and were not.
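As a rough illustration of why a fixed test voltage matters for comparisons, here's a sketch (Python). The big assumption, flagged in the code, is an unregulated supply whose maximum output power scales with the square of the supply voltage; regulated designs won't behave this way, and it ignores the different THD limits, so treat it as ballpark only.

    # ASSUMPTION: unregulated supply, so clipping power scales roughly with
    # the square of the supply voltage. Many amps regulate and won't do this.
    def rescale_rating(rated_watts, rated_volts, target_volts=14.4):
        """Re-state an advertised rating on a common 14.4 V basis."""
        return rated_watts * (target_volts / rated_volts) ** 2

    print(round(rescale_rating(1000, 17.0)))   # a "1000 W @ 17 V" claim is ~718 W at 14.4 V
    print(round(rescale_rating(1000, 12.5)))   # a "1000 W @ 12.5 V" rating is ~1327 W at 14.4 V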


The problem with the comments in this thread is that most of them are from people too young to remember when a Sony amp had 2400 watts screen-printed on the face of it. Then you would get uninformed customers asking why the 1000-watt JL was $1000 and the Sony was $239. CEA-2006 stopped that type of manufacturer marketing.


So... do you expect to see 14.4 V at all times when running an amplifier? Also, 1% THD is inaudible, especially at sub-bass frequencies. I don't recall how far it has to go up before you can hear it, but it's a lot.

CEA compliance is marketing hype.

I wouldn't call it marketing hype. And I think most people miss the original intent of the standard (see Julian's comments above).

It was meant to provide a standard by which amplifier measurements were to be conducted and the results/performance reported and advertised, in order to provide an equal basis of comparison. It wasn't meant to "prove" what the delivered power in your vehicle would be (i.e., with less than 14.4 V), it wasn't meant to indicate what level of distortion was audible, etc. But when you had some amps rated at 12.5 V @ 1% THD and other amps rated at 17 V @ 5% THD, or listing "max" power figures only, it was more difficult and confusing for customers to make an informed decision, because all they would look at was the wattage figure and not how that wattage was determined. The CEA-2006 standard provided a standard basis of comparison by identifying a specific set of test conditions by which compliant amplifiers are measured and rated. The only reason 14.4 V was chosen was because that was the only way head unit manufacturers would agree to use the standard.

Is it perfect? No. But it might be better than no testing standard at all.

Is it going to be precisely indicative of the performance in your car? Well no, but non-CEA-2006 rating methods don't necessarily provide that information either.

Is it possible to cheat? Sure, but who says the non-CEA-2006 guys aren't lying and/or cheating too?

Is it "marketing hype"? Not really, it provides and equal basis for comparing two amplifiers which is something the industry didn't have previously. Logically speaking it has nothing to do with marketing as it's simply a test methodology that provides an equal basis for comparison. At some point we could call all amplifier ratings "marketing hype" in some manor.

That said, I do understand why some manufacturers chose not to participate. But I think most consumers don't actually understand the intent of the standard. It was never about marketing or the performance itself; it was about providing a level playing field by standardizing testing methods. Like I said, it's not perfect (really, what standards are?) and I do understand the issues with it, but at least understand what its intentions were and were not.

Very informative! We should sticky this as it will be beneficial to all.

Thanks for the insight. I always heard it was "Marketing Hype" from so many different people. Didn't know what to think. :rofl2:

:drink40:


In order to gain a noticeable increase in output you need to gain at least 3 dB. 3 dB is the smallest change in output a human can notice.

We can detect changes in amplitude much smaller than 3 dB. Under the most commonly cited test conditions we can detect a change in amplitude of 1 dB.

That said, I'm not disagreeing with your premise: small changes in amplifier power are not worth worrying about.

Is that over all frequencies? Because (personally) I could not tell the difference between 137 dB and 140 dB in the low end range.


In order to gain a noticeable increase in output you need to gain at least 3 dB. 3 dB is the smallest change in output a human can notice.

We can detect changes in amplitude much smaller than 3 dB. Under the most commonly cited test conditions we can detect a change in amplitude of 1 dB.

That said, I'm not disagreeing with your premise: small changes in amplifier power are not worth worrying about.

Is that over all frequencies? Because (personally) I could not tell the difference between 137 dB and 140 dB in the low end range.

He used the word changes, not absolute. ;)


In order to gain a noticeable increase in output you need to gain at least 3 dB. 3 dB is the smallest change in output a human can notice.

We can detect changes in amplitude much smaller than 3 dB. Under the most commonly cited test conditions we can detect a change in amplitude of 1 dB.

That said, I'm not disagreeing with your premise: small changes in amplifier power are not worth worrying about.

Is that over all frequencies? Because (personally) I could not tell the difference between 137 dB and 140 dB in the low end range.

He used the word changes, not absolute. ;)

Interesting, I thought I was going to get eaten up for my math up there :P


I know I can tell a difference of 1-2 dB in high-output bass systems.

The "best" SQ amps are not CEA rated, nor are the "best" SPL amps. Just saying. lol


Alright, we'll do a little math here.

Say we have a consistent current (amperage) draw of 100 amps, and let's say you were running at 13 V.

So to find wattage we multiply volts × amps:

13 V × 100 A = 1300 W × 0.80 (for a Class D amplifier efficiency rating), which comes out to 1040 W.

On the opposite side, let's say you drop down to 12 V:

12 V × 100 A = 1200 W × 0.80 = 960 W

In order to gain a noticeable increase in output you need to gain at least 3 dB. 3 dB is the smallest change in output a human can notice.

And in order to reach that 3 dB threshold you must, in theory, double your wattage. So is that tiny amount of extra wattage going to matter in a daily driving scenario?

The short answer is no, it would not matter whatsoever.

So pick an amplifier that fits your budget, is aesthetically pleasing, and is in your power range.

Granted, what I've said isn't 100% perfect, but I hope you get the general idea behind it all.
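Running those numbers through the same dB math (a quick Python sketch, using the 80% efficiency and 100 A figures assumed above) shows just how small the difference is:

    import math

    EFFICIENCY = 0.80   # Class D efficiency assumed above
    CURRENT_A = 100     # constant current draw assumed above

    def output_watts(volts):
        return volts * CURRENT_A * EFFICIENCY

    p13, p12 = output_watts(13), output_watts(12)   # 1040 W and 960 W
    print(10 * math.log10(p13 / p12))               # ~0.35 dB difference

That's roughly a third of the ~1 dB change discussed earlier as the smallest we can reliably hear.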

excellent breakdown, props!

