I noticed that different manufacturers rate the output of their amps at different THD levels (for example, 1% THD for a Carvin R600, and <0.1% for an Eden or SWR). How much does this affect what the maximum clean output of the amp really is?

Well, it's hard to say unless you can do comparative measurements. Every company has its own way of measuring, even when the specs seem comparable on paper (same test frequency, same THD). You can only "hope" that the measurements hold up in everyday use. As a rule of thumb, the ratings of "cheaper" brands and models tend to be more optimistic than those of higher-end amps and makes, which are usually pretty honest.

I've mentioned my views on Carvin's power ratings in quite a few past posts. You also need to consider that some of their ratings are done at 1 kHz, which is not the same as a broadband test over the whole range. I don't know if they drive both channels simultaneously when doing the measurement, either. IMO, Carvin's ratings are on the liberal side. I generally regard their *clean*, full-spectrum output as maybe 80% of the claimed wattage figures. They still give plenty of watts per dollar, which is one of several reasons I like their amps. - Mike

Besides, you are never gonna use the amp with a continuous 1 kHz sine wave at full power, so why bother looking at those measurement methods at all? They only specify max power in a lab. Play through it, see how loud it is and whether it stays clean. That's the only way, ultimately. Or just believe their measured specs and accept it when they're 10% off in your real-life situation. A 10% power loss is only about half a dB.
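The half-a-dB claim is easy to check with the standard power-ratio formula, dB = 10·log10(P2/P1). A minimal Python sketch (the function name is mine):

```python
import math

def power_ratio_db(claimed_watts, actual_watts):
    """Decibel difference between claimed and actually delivered power."""
    return 10 * math.log10(actual_watts / claimed_watts)

# A 10% shortfall: an amp rated 100 W that really delivers 90 W
print(round(power_ratio_db(100, 90), 2))  # -0.46 dB, i.e. roughly half a dB
```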

OK, let's say I have two amps. Both put out 95 watts at less than 0.01% total harmonic distortion (THD). When pushed up to 100 watts, amp A rises to 1% THD while amp B rises only to 0.1% THD. Then maybe amp A puts out 110 watts at 5% THD while amp B rises to only 0.5% THD at 110 watts, hitting 1% THD at 120 watts, etc. So both companies advertise the amp as 100 watts at their respective THD figures.

What will this SOUND like? It means that when driven to maximum power, amp A will be pushed into clipping a bit sooner and its distortion will be more noticeable. Run them both at moderate levels where the peaks seldom require the full 100 watts and you may not hear any difference at all. Run them flat out all the time and amp A might sound a bit dirtier, but perhaps not. Here's why: amp B can put out 120 watts before distorting as much as amp A does at 100 watts. That's a difference of only about 0.8 dB!!! Essentially inaudible, so amp B is not going to sound any louder or cleaner when driven into clipping.

Some companies do play games with ratings. 100 watts into 8 ohms at 1 kHz at the onset of clipping is not the same as 100 watts into 8 ohms from 20-20,000 Hz +/- 1 dB with less than 0.1% THD. Ashdown has some amps with wattage ratings like 305 and 585 watts!!! Why not just say 300 or 600? The difference between 300 and 305 watts is only 0.07 dB (!!!), and 585 to 600 watts only 0.1 dB. Even today people argue about 35-year-old Fender Bassman heads: 40 watts, 45 watts, or 50 watts? It depends on how much distortion you're willing to accept and at what frequencies you're measuring. Some (not many) amps are very conservatively rated. Guitar Player magazine once bench-tested a Walter Woods MI-100-8 head rated 100 watts into 8 ohms. At the onset of clipping, they measured 150 watts into 8 ohms and 200 watts into a 4-ohm load!
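The dB figures above all come from the same power-ratio formula. A quick Python sketch checking the numbers in this post (the function name is mine):

```python
import math

def db_diff(p_low_watts, p_high_watts):
    """dB difference between two power levels: 10*log10(P2/P1)."""
    return 10 * math.log10(p_high_watts / p_low_watts)

print(round(db_diff(100, 120), 2))  # 0.79 dB: amp B's extra power before matching amp A's clipping
print(round(db_diff(300, 305), 2))  # 0.07 dB: a "305 W" rating vs a plain 300 W
print(round(db_diff(585, 600), 2))  # 0.11 dB: 585 W vs 600 W
```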

Brian makes some really good points. Yes - in terms of what is a material difference in power, it is a valid point. You're really getting at what is a "significant figure" here. It would probably be better to rate amps' power output in terms of dB relative to 1 watt. For example, a 600-watt amp would be 27.8 dBW; a 500-watt amp would be 27.0 dBW. A 50-watt amp would be 17.0 dBW - only 10 dB quieter! That would give everyone a much better feel for the amps' audible differences. And it would also help people understand headroom better, I would think. - Mike
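The dBW conversion Mike describes is just 10·log10 of the wattage. A minimal Python sketch reproducing his figures:

```python
import math

def watts_to_dbw(watts):
    """Power expressed in dB relative to 1 watt (dBW)."""
    return 10 * math.log10(watts)

for w in (600, 500, 50):
    print(f"{w} W = {watts_to_dbw(w):.1f} dBW")  # 27.8, 27.0, 17.0 dBW respectively
```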

No!!! Decibels are bad, hard to grasp. Besides, the numbers are much lower than the watt numbers. Not good for advertising! Example: a good while back I was discussing a speaker that had a 289 dB sensitivity. An amp gets it up to 100 dB at 100 watts. So a 200 watt amp will be 200 dB?

The THD quoted in a power spec is just an arbitrary definition of the onset of clipping. Clipping starts at the peaks of the signal waveform, so it's a gradual thing and not easily defined except by picking a point at which the distortion of a sine wave reaches a certain level. Therefore, if a manufacturer decides that the onset of clipping is 0.5% THD, they'll increase the sine wave level until the waveform distorts enough to reach 0.5%. They then measure the RMS voltage and calculate the power into the load. A manufacturer could choose something much lower, like 0.0001% THD, to define the onset of clipping. But they'd need a sine wave generator much cleaner than that, plus a distortion analyzer with that sort of resolution and a noise floor low enough not to interfere with the measurements. That's an extremely tall order; the resulting tests would not be as reliable. Alternately, a manufacturer could choose a ridiculously high distortion point, like 10% THD. That's just playing specmanship with the power numbers.
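To make the measurement step concrete, here's a minimal Python sketch of how the bench readings turn into a spec: power from the RMS voltage across the load, and THD as the RMS sum of the harmonics over the fundamental. The function names and the example voltage and harmonic levels are hypothetical, not from any real test report:

```python
import math

def power_from_vrms(v_rms, load_ohms):
    """Convert measured RMS voltage across the load into watts: P = Vrms^2 / R."""
    return v_rms ** 2 / load_ohms

def thd_percent(fundamental_rms, harmonic_rms):
    """THD as a percentage: RMS sum of the harmonic amplitudes over the fundamental."""
    return 100 * math.sqrt(sum(h ** 2 for h in harmonic_rms)) / fundamental_rms

# Hypothetical bench numbers: 28.3 V RMS across an 8-ohm load
print(round(power_from_vrms(28.3, 8), 1))  # 100.1 W

# Hypothetical harmonic levels (2nd, 3rd, 4th) against a 1.0 V fundamental
print(round(thd_percent(1.0, [0.004, 0.002, 0.001]), 3))  # 0.458 %
```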

Actually, I think they're easy to grasp once someone understands SPL. And we don't *have* to use decibels. How about millibels? So a 50-watt amp, referenced to 1 watt, would be a 1700 mB amp!!!! Damn, that has to represent incredible directivity! And no doubt from a converging-diverging nozzle like they use on the space shuttle engines. It must be an acoustic lens with a focal length of one meter. I'm saying just forget about watts from the user's perspective. From now on everything is expressed in decibels, centibels, millibels, microbels, nanobels, picobels, femtobels, ringdembels.... - Mike

Yes, but just barely. This makes me think I'm being "kind" in my estimates. But I don't have the proper test equipment right now to confirm anything, so... - Mike