Back in my early playing days ('60s-'70s), the general consensus was that a 50 watt Marshall head was louder than a 100 watt Fender Showman. When I asked my guitarist friends why, they would say that the Marshall was rated in British watts, which were different from standard US measures. Was there any truth to that?
The watt was named after James Watt, a Scotsman. Some people say that amps operated on a 50Hz line voltage sound better than ones operated at 60Hz. There could be something in that.
Considering that all amps rectify line power to DC for internal use, that's likely just another 'gaseous emission' as well . . .
I have an early silverface Showman Reverb. It's basically an 85W amp, as were the earlier blackface Showman amps. I also have a JMP Lead 2203 running 6550s and a JCM 800 Superlead 1959 running EL34s. I believe Marshalls will run about 90W clean with EL34s and a bit more with 6550s. I haven't compared the volume of these amps directly, but all three will take your head off. A couple of observations that may be relevant. First, I don't believe the speakers Fender used were as efficient as the typical Marshall 4x12. Second, the input gains are set up to come on really fast on the Marshalls. It's pretty much impossible to dial them in to play at low volume, but this has nothing to do with how much power they make or how loud they play; it has to do with the calibration of the controls. In contrast, the Fender is fairly well behaved at low volume, but wank on the controls and it gets plenty loud, especially driving a pair of JBLs.
You may be missing something subtle. Different transformer taps are typically used because 60Hz mains are typically 120V while 50Hz mains are 220-240V or 100V. Another factor is that transformers are more efficient at 60Hz than at 50Hz, so the voltages will run a bit high on 60Hz. I am not certain, but I suspect the supply may be a bit stiffer on 60Hz as a result. So amps designed to run on 50Hz may actually sound better, or at least different, on 50Hz. The power supply voltages of my Matamp GT120 and JMI-era Vox AC30 run considerably over spec on US voltage. One explanation I have read is that the engineers calculated the turns ratio properly, but did not consider the extra efficiency of the transformers running on 60Hz versus 50Hz.
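For what it's worth, here's a back-of-the-envelope sketch of why a transformer wound for one line voltage runs hot on another: the turns ratio is fixed, so the B+ scales linearly with whatever the wall actually delivers. All the numbers below are made up for illustration, not measurements from any of these amps:

```python
# Rough sketch: a fixed turns ratio scales mains variation straight into
# the B+ supply. All values are hypothetical, for illustration only.

design_mains = 230.0   # volts the transformer was wound for (assumed)
actual_mains = 245.0   # a plausibly high modern wall voltage (assumed)
design_bplus = 450.0   # B+ the circuit was designed around (assumed)

# The turns ratio is fixed, so the secondary (and thus B+) scales
# linearly with the primary voltage, before regulation/efficiency effects.
scaled_bplus = design_bplus * (actual_mains / design_mains)
print(f"B+ runs roughly {scaled_bplus:.0f} V instead of {design_bplus:.0f} V")
```

And that's before the 60Hz efficiency bump the post mentions, which would push it a bit higher still.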
Pure DC alone wouldn't tell the whole story for a power amp stage. Some AC ripple is necessary to charge the filter caps. On a 50Hz line supply, a full-bridge rectifier gives 100 charging cycles per second, versus 120 per second on 60Hz line AC.
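To put some toy numbers on that: full-wave rectification doubles the line frequency, and the standard rough estimate for peak-to-peak ripple is V_pp ≈ I_load / (f_ripple × C). The load current and cap value below are assumptions picked just to show the shape of the math:

```python
# Full-wave rectification doubles the line frequency, so the filter caps
# get topped up 100 times/sec on 50 Hz mains vs 120 times/sec on 60 Hz.
# Load current and capacitance are hypothetical illustration values.

def ripple_pp(i_load, line_freq, capacitance):
    """Peak-to-peak ripple estimate for a full-wave rectifier:
    V_pp ~= I / (f_ripple * C), where f_ripple = 2 * line_freq."""
    f_ripple = 2 * line_freq
    return i_load / (f_ripple * capacitance)

i_load = 0.2   # amps drawn by the amp (assumed)
c = 100e-6     # first filter cap, 100 uF (assumed)

for f in (50, 60):
    print(f"{f} Hz mains: ripple freq {2 * f} Hz, "
          f"~{ripple_pp(i_load, f, c):.1f} Vpp ripple")
```

Same amp, same load: the 60Hz supply simply gets recharged more often, so the first node sags a little less between pulses.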
A Showman driving a single 2x15 cabinet versus a Marshall driving two 4x12 cabinets: the extra cone area comes into play as well.
Largely irrelevant . . . it's more the *current* that is cyclical, since the filters (if correctly sized) negate the ripple, and subsequent filter caps typically leave it unmeasurable *in a well designed power supply*. Were you to replace the supply with a battery of the same voltage, the amp would run just fine . . .
Without voltage ripple there would be no current through the caps, and without current there would be no energy/power transferred along the "DC" supply. The basic math clearly states that the current through a cap follows the expression I = C*(dV/dt). Edit: supply caps are meant to charge and also discharge power/energy. That's the reason why some voltage ripple across the caps is very important for proper working of the caps.
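A quick numeric check of I = C*(dV/dt), using made-up values for the cap and the ripple (these are illustration numbers, not measurements):

```python
# Numeric check of the capacitor relation I = C * dV/dt.
# All component values are hypothetical, for illustration only.

C = 100e-6      # farads, an assumed first filter cap
dV = 20.0       # volts the cap discharges between charging peaks (assumed)
dt = 1.0 / 100  # seconds between pulses on 50 Hz full-wave rectification

# Average current the cap supplies to the load while discharging:
i_avg = C * dV / dt
print(f"I = C*dV/dt = {i_avg * 1000:.0f} mA")
```

So a dead-flat supply voltage (dV = 0) would indeed mean the cap passes no current at all, which is the point being made: the ripple is how the caps do their job.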