I've read a lot of posts about, just as an example, old Ampegs (SVT or V-4) with no Master Volume versus newer ones that do have a Master Volume. And I've read a lot of posts about "tube breakup" and "grit" and "dirt" and "tubeyness" and "clean". I THINK I understand what all this means, and then every now and then I'll read a post that doesn't make sense to me and calls everything I thought into question. So, here's how I think all this stuff works. Can somebody please tell me what parts are wrong?

Some amps have an input gain adjustment and a master volume (like a new SVT CL), and some only have a channel volume (like an original SVT). On the ones with only a channel volume, that knob is actually adjusting the input gain; the "master volume" is fixed inside the amp and effectively runs wide open.

When people talk about "dirt" or "grit", technically they're talking about distortion. And in an all-tube amp (or any amp, really), distortion can come from two places: the preamp and the power amp. So, on a non-Master Volume amp, you have no independent control over how much distortion you get. When you set your volume level (i.e. the amount of input gain), you get however much distortion the preamp and power amp happen to produce at that setting. If you want more or less, your only options are to use a hotter or less-hot bass, or cabs with a different sensitivity.

In contrast, a Gain/Master amp gives you some control over the amount of distortion at a given volume: if you want more distortion, you turn up the Gain and turn down the Master. But this only adds distortion from the preamp. Power amp distortion is still only going to increase when you turn up the actual output volume (not just a volume knob).
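The gain-then-master picture above can be sketched numerically. This is a toy model, not a circuit simulation: I'm assuming each tube stage behaves roughly like a soft clipper (tanh) with some headroom, and the function names and headroom numbers are made up for illustration.

```python
import numpy as np

def soft_clip(x, headroom):
    """Toy tube stage: near-linear below its headroom, compressed above it."""
    return headroom * np.tanh(x / headroom)

def amp(signal, gain, master, pre_headroom=1.0, power_headroom=10.0):
    pre_out = soft_clip(signal * gain, pre_headroom)    # preamp stage
    return soft_clip(pre_out * master, power_headroom)  # power amp stage

t = np.linspace(0, 1, 1000, endpoint=False)
note = 0.5 * np.sin(2 * np.pi * 5 * t)  # stand-in for a bass note

# Two settings chosen to land at roughly the same output level:
clean = amp(note, gain=1.0, master=2.0)   # low gain, higher master
dirty = amp(note, gain=8.0, master=0.92)  # high gain, lower master

def deviation(out, inp):
    """How far the output is from a scaled (i.e. undistorted) copy of the input."""
    scale = np.max(np.abs(out)) / np.max(np.abs(inp))
    return np.max(np.abs(out - scale * inp))

print(np.max(np.abs(clean)), np.max(np.abs(dirty)))  # similar output levels
print(deviation(clean, note))  # small: both stages stay near-linear
print(deviation(dirty, note))  # much larger: the preamp is clipping hard
```

Same output level either way, but the high-gain/low-master setting is heavily distorted by the preamp, while the power stage stays clean in both cases, which is exactly the trade-off described above.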
If you turn up the volume knob but turn down the gain knob to maintain a certain volume, you will decrease the preamp distortion, but you will not increase the power amp distortion. The output power is directly related to the actual output volume, so if you keep the volume the same, the output power is the same, and so the power amp distortion is the same.

If an amp has different input sensitivities, then you can use those to increase or decrease distortion, to a degree. The Ampeg V-4 has a 3-way Input Sensitivity switch: 0dB, -6dB, or -9dB. If you have it set to -9dB with the volume maxed and you want more distortion, you could get it by changing the input sensitivity to -6 or 0. But the utility there is limited because, if you then want a lower volume level, the increased sensitivity will make everything louder, so you'll have to turn the volume knob down some and end up with about the same distortion as before.

Does all that make sense? If so, here are some questions that I hope somebody can enlighten me on:

- Preamp distortion generally sounds a bit different than power amp distortion, right?
- When people talk about messing around with their settings to get more or less distortion, they're generally just getting more or less preamp distortion, right? Because power amp distortion only comes from running the amp wide open or nearly so; anything less and the power amp runs clean, with no clipping (i.e. distortion), right?
- When people talk about tube "breakup", is that generally referring to power amp distortion, or is it a general term that also applies to preamp distortion?
- Can you characterize the difference in the sounds of preamp distortion versus power amp distortion?
- If power amp distortion is particularly desirable, why don't (more) tube amps come with a switch that lets you control how much power they're putting out? Either by controlling how many power tubes are in use or, like the old MusicMan HDs did it (I think), by reducing the plate voltage to the power amp tubes? That way, if you want lower volume but also power amp distortion, you could switch down to a lower power setting and get what you want.

And if you read all that... thank you! Double thanks if you can shed some light on the subject for me.
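P.S. A couple of the points above come down to dB arithmetic, so here's a quick sanity check. This is just the standard dB conventions (20·log10 for voltage, 10·log10 for power), nothing amp-specific; the rule of thumb that a -10 dB drop sounds roughly "half as loud" is a common psychoacoustic approximation, not a hard law.

```python
import math

def voltage_ratio(db):
    """Linear voltage scaling for a level change in dB (20*log10 convention)."""
    return 10 ** (db / 20)

def power_change_db(ratio):
    """Level change in dB for a given output-power ratio (10*log10 convention)."""
    return 10 * math.log10(ratio)

# The V-4's input sensitivity pads mentioned above:
for pad in (0, -6, -9):
    print(f"{pad:>3} dB pad scales the signal by {voltage_ratio(pad):.2f}")
# -> 1.00, 0.50, 0.35: going from the -9 dB pad to 0 dB nearly triples
#    the signal hitting the first tube, hence more preamp drive.

# Why half-power switches help less than you'd hope for volume:
print(power_change_db(0.5))   # half power    -> about -3 dB
print(power_change_db(0.25))  # quarter power -> about -6 dB
# A drop of roughly -10 dB (one tenth the power) is what's commonly said
# to sound "half as loud", so halving the power only buys about 3 dB --
# but it does let the power tubes start clipping at a lower volume.
```

That last point is probably part of the answer to the power-switch question: a half-power switch barely changes perceived loudness, so its real value is reaching power amp distortion sooner, not playing quietly.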