Let me start by saying that I'm not an expert; anything and everything in this post should be considered potentially incorrect. I'm asking about something I recently heard, have been thinking about, and have tried to research, but I don't have the background to say there's something there or to shoot it down. What I heard was that the peak power rating on a class D amp is more meaningful as a rating than it is on a class AB amp, because a class D amp can ramp up power faster. I don't have the engineering background to understand fully and intimately what the ratings mean. Without that, at face value it maybe makes sense: if my (well designed) class D amp can reliably hit peak power, then maybe that becomes a better rating for the amp than RMS? I ran into this looking at the rating for the Trickfish Bullhead 1K, which isn't listed as RMS, and I'm trying to figure out what the real rating for the amp is. And then, what do I really mean by "real rating"? Nobody from Trickfish made the claim of peak vs RMS; I don't want to start that rumor.
The user manual states it's peak. We can only guess, but it's entirely possible that the RMS rating is 1/4 of the peak power.
Many of the class D amps I've seen tested made roughly the same RMS as peak power. This is in contrast to linear supply AB amps that can have significant burst power headroom (e.g. NAD). However designs and implementations vary, so I'm not sure if this is generally true.
RMS versus peak power has nothing to do with the class of the amp and everything to do with the choices of the marketing department.
Yeah, I need to research and really understand what they mean. I kinda know, but that's where the danger lies. Heck, I just recently understood balanced versus unbalanced, what 1/4" balanced means, and why XLR to 1/4" adapters might be what you need on a particular gig but why they're not the best choice long term. Sometimes Youtube is worthwhile.
Granted my info could be outdated, but... there was a time when amplifiers were subject to the FTC "amplifier rule," which defined RMS, not as a law of physics or mathematics, but as a method of measuring amplifier power. Using the "RMS" terminology implied that your amplifier would satisfy its power rating when tested according to that method. The amplifier rule also implied that the power rating should be accompanied by a rating for harmonic distortion.

Naturally, RMS also has a technical meaning, and it works out that the average power of a perfect sine wave source driving a resistive load will be half of the instantaneous power at the peaks of the waveform. That's where the 1/2 comes from. Same test, same waveform.

What "RMS" means to me, if cited by a reputable amp maker, is that they're willing to adhere to a standard that they didn't come up with on a whim after discovering that the amp doesn't meet the marketing specs. If an amp rating doesn't say "RMS", then it's whatever you want it to be.

I've done my own amp testing, using both a dummy load and an oscilloscope, on Class-AB and Class-D. I haven't seen a reason to expect those amps to behave differently in ways that would justify the use of different rating methods. If you're pushing an amp hard enough that its ratings matter, which I typically don't, of greater importance is how the amp maker has tailored the design to handle overload signals, which is up to the skill and taste of the designer regardless of output class.
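To make the "half of the instantaneous peak power" point concrete, here's a small numerical sketch (the 28.3 V peak and 8 Ω load are made-up illustration values, not any particular amp's spec):

```python
import math

# For an ideal sine wave into a resistive load, the average ("RMS")
# power works out to exactly half the instantaneous power at the
# peaks of the waveform. Verify that numerically.

V_PEAK = 28.3   # peak voltage, volts (hypothetical)
R_LOAD = 8.0    # load resistance, ohms

# Sample one full cycle of the sine wave
N = 100_000
samples = [V_PEAK * math.sin(2 * math.pi * n / N) for n in range(N)]

# RMS voltage: root of the mean of the squares
v_rms = math.sqrt(sum(v * v for v in samples) / N)

p_avg = v_rms ** 2 / R_LOAD     # average (continuous) power
p_peak = V_PEAK ** 2 / R_LOAD   # instantaneous power at the peak

print(f"V_rms = {v_rms:.2f} V (= V_peak / sqrt(2))")
print(f"P_avg = {p_avg:.1f} W, P_peak = {p_peak:.1f} W")
print(f"ratio = {p_peak / p_avg:.3f}")   # ~2.0
```

The 2:1 power ratio falls straight out of V_rms being V_peak divided by the square root of 2.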
That is not my understanding, but hopefully someone with an electrical background will weigh in. If you're correct, I'll stand corrected. The following is a spec quote about the old Acoustic Control Corp: "For the Model 361 bass amplifier, Acoustic opened the specifications with '440 Watts Peak, 200 Watts RMS' - a ratio that sounds more like it. By the time they brought out the Model 371, the brochure claimed '730 Watts Peak, 365 Watts RMS' - back to confusing numbers again. RMS ratings are never exactly half of peak-to-peak, so should we blame the marketing department?" So, in this case, the peak power rating appears just a bit more than double the RMS rating. Sounds like marketing had a hand in the ratings, but I don't see evidence of peak being 4 times as high as the RMS rating.
What I’ve noticed is a different ratio of output ratings vs. ohm loads. I.e., if my Class AB amp was rated for 600 watts at 4 ohms, the 8 ohm rating was 2/3 of that: 400 watts. With Class D, I see a very consistent relationship between impedance and output: when impedance doubles, power is halved. Every 800 watt Class D amp I’ve owned was rated for 400 watts at 8 ohms.
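The "double the impedance, half the power" pattern is what you'd expect from an ideal voltage-limited amp, where the rails fix the maximum output swing and power is V²/R. A minimal sketch of that idea (the 40 V figure is a made-up example, not any real amp's spec):

```python
# Ideal voltage-limited amp: max output swing is fixed by the power
# supply rails, so P = V_rms^2 / R and doubling R halves P.

V_RMS_MAX = 40.0   # max undistorted RMS output voltage (hypothetical)

for load in (2.0, 4.0, 8.0):
    power = V_RMS_MAX ** 2 / load
    print(f"{load:>3.0f} ohms -> {power:6.0f} W")
# Real amps are also current-limited, which is why the low-impedance
# ratings often come in below this ideal doubling.
```

With those numbers the 4 Ω rating lands at exactly twice the 8 Ω rating (800 W vs 400 W), matching the Class D pattern described above.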
For amps, peak is 2x the RMS number. This is purely a math conversion: two different metrics for exactly the same waveform. Some manufacturers take liberties and combine burst power with peak power to come up with larger numbers. Program/peak power combinations are different; they apply to speakers, not amps. The program number is based on the definition of using a crest factor of 6 dB to represent a sine wave at a 50% power duty cycle. Peak in this context refers to twice the program power number. The constant 2x difference is due to using a PWM-regulated power supply; this applies to any amp using this type of power supply, not exclusively a class D thing.
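My reading of the rating arithmetic described above, as a sketch (these are the conventions as stated in the post, not an official standard):

```python
# Amp ratings: peak W = 2 x RMS W for the same sine wave.
# Speaker ratings: program = 2 x continuous (6 dB crest factor),
# and peak = 2 x program, i.e. 4 x continuous.

def amp_peak_from_rms(p_rms):
    """Peak power of the same sine wave whose average power is p_rms."""
    return 2.0 * p_rms

def speaker_ratings(p_continuous):
    """Continuous / program / peak trio as used for speakers."""
    program = 2.0 * p_continuous
    peak = 2.0 * program
    return p_continuous, program, peak

print(amp_peak_from_rms(500))   # 1000.0
print(speaker_ratings(250))     # (250, 500.0, 1000.0)
```

Note how a speaker's peak number ends up 4x its continuous number, while an amp's peak number is only 2x its RMS number: two different conventions that are easy to mix up in a spec sheet.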
Not true for me. Every solid state amp I’ve ever had, the power has never halved:
200 W @ 4 Ω = 150 W @ 8 Ω
500 W @ 4 Ω = 350 W @ 8 Ω
450 W @ 4 Ω = 260 W @ 8 Ω
800 W @ 4 Ω = 450 W @ 8 Ω
1000 W @ 4 Ω = 650 W @ 8 Ω
And power ratings usually decrease when pushing 2 ohm loads.
This is sometimes but not always true. For example, in the entire Subway series (which use solid state power amps), the 4 ohm rating is 2X the 8 ohm rating. In the case of the 800 watt models, the 2 ohm rating is the same as the 4 ohm rating. In all other amps I have designed that support 2 ohms, the 2 ohm rating is always greater than the 4 ohm rating.
Not to quibble (which always means the opposite of course), but we're talking a fraction of a dB here, compared to an exact 2:1 ratio. A much bigger source of variability is hidden in the distortion level chosen for making the RMS measurement.
Some of the early amplifier test systems measured all the output from an amplifier, measuring voltage and current to derive power. Problem was that Class-D has a higher-than-human-hearing carrier wave that can leak through to the output. Test systems were not filtering this HF out, and some Class-D amps got falsely high ratings. Everybody now knows to filter the super HF out. The "Class-D aren't as loud" phrase may go back to the days when some Class-D amps were measured wrong. The internet keeps echoing it. It's not true.

If you measure voltage at your 120 V outlet, a meter may say 170 V - that's because the meter is reading peak voltage. If you use a "True RMS" meter, the voltage reads 120 V. Using Ohm's law (P = V²/R) into an 8 Ω load:

RMS: 120 V @ 8 Ω = 1800 W
Peak: 170 V @ 8 Ω ≈ 3612 W

This is why people say "watts peak" is 2x "watts RMS". "Music Power" and other ratings are just made up to publish a higher number to look good in an ad.
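The outlet arithmetic above can be redone with the exact peak value, 120 times the square root of 2 (about 169.7 V, usually rounded to 170 V), which makes the 2:1 power ratio come out exactly. The 8 Ω load is just an example figure:

```python
# Ohm's law power check: P = V^2 / R.
R = 8.0                     # example load, ohms
v_rms = 120.0               # what a true-RMS meter reads at the outlet
v_peak = v_rms * 2 ** 0.5   # ~169.7 V, usually quoted as "170 V"

p_rms = v_rms ** 2 / R      # 1800 W
p_peak = v_peak ** 2 / R    # 3600 W, exactly 2 x p_rms

print(p_rms, p_peak, p_peak / p_rms)
```

Using the rounded 170 V gives the 3612 W figure quoted above; the exact peak voltage gives 3600 W, precisely double the RMS power.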
The only amp I owned/own that has been bench tested for power is my Aragon 2004 (Class A/B solid state amp). An old friend of mine is an engineer who has designed and built many amps over the decades (Great American Sound, Harman Kardon/JBL, Precision Innovative Electronics, and others); we tested the 2004 at his place. The Aragon is rated at 100 watts RMS @ 8Ω and 200 watts RMS @ 4Ω. IIRC, we measured about 110 watts RMS @ 8Ω, a bit over 200 watts RMS @ 4Ω, and the bench fuse popped at around 300 watts RMS @ 2Ω, all with both channels driven.
LOL, the switching hash from an early Class-D amplifier smoked the audio input on my laptop computer. I thought I was being careful by turning everything down, and was just going to sneak up the volume a tiny bit to measure the response curve. I powered up the amp, and actually saw smoke coming out one of the speaker grills of the computer. Amazingly the whole computer except for its audio hardware still worked fine, and I always use an outboard audio adapter today when I'm doing testing. I also don't test power amps any more.