Class A is the least efficient amplifier design. Class A amplifiers are theoretically limited to a maximum efficiency of 25% (remember, that's the maximum). They are also never "off", which means they are pulling full current from the electrical system even at idle, with no signal input.

Class A/B is more efficient than Class A. At full power, a Class A/B amplifier can be in the 60% efficiency range. However, its efficiency worsens as power output decreases: at 1/3 power it is only about 30-35% efficient. And guess where the amplifier spends most of its time? Here's a hint: it's not toward the full-power end of the spectrum.

Class D is typically more efficient than Class A/B. Class A/B and Class D efficiency can be similar at full power output, but Class D amplifiers don't suffer the same loss in efficiency at lower power levels that Class A/B amplifiers do.

As to the real question you posed: an amplifier should be designed to function properly within the temperature range it will be operating in. If an amplifier shuts down from overheating after a short time when used within its intended operating range and given adequate ventilation, then its heat dissipation was given poor consideration by the designer, and that qualifies as a poorly designed amplifier.
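To put some rough numbers on why efficiency matters for heat, here's a minimal sketch. It only uses the basic relationship that input power equals output power divided by efficiency, and everything not delivered to the speakers becomes heat. The efficiency figures are the approximate ones quoted above (and an assumed ~85% for Class D at moderate power, which is not from any particular amplifier's datasheet):

```python
# Rough illustration: heat an amplifier must dissipate for a given output power
# and efficiency. Input power = output / efficiency; the difference becomes heat.

def heat_dissipated(output_watts: float, efficiency: float) -> float:
    """Watts converted to heat for a given output power and efficiency (0-1)."""
    input_watts = output_watts / efficiency
    return input_watts - output_watts

# Hypothetical example: 100 W of audio output (roughly 1/3 power on a 300 W amp),
# using the approximate efficiency figures discussed above.
output = 100.0  # watts of audio output
for label, eff in [("Class A (~25%)", 0.25),
                   ("Class A/B at 1/3 power (~33%)", 0.33),
                   ("Class D (assumed ~85%)", 0.85)]:
    print(f"{label:32s} -> {heat_dissipated(output, eff):6.1f} W of heat")
```

Under those assumptions, the Class A amp is throwing off roughly 300 W of heat, the Class A/B amp around 200 W, and the Class D amp under 20 W for the same audio output, which is why a well-designed Class D amplifier can get by with much less heatsinking.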