This isn't going to be all that political. I've thought about it long and hard as part of a major corporation, Wal*Mart; a small part, but a part nonetheless. "The customer is always right" is destroying America. Back in the day, if a customer was wrong you could say "You, sir, are wrong" and all would be fine. You might lose a customer, but hopefully they'd learn they were wrong.

In my job at Wal*Mart, which you may have read about in my vent thread, I push carts and do other odd jobs. Customers leave carts everywhere! The reasoning behind our designated cart areas is to protect the cars you love so dearly, since a tiny scratch would just destroy their value, blah blah blah. People still don't put carts in those areas or inside, and then bitch when a cart hits their car. How could this be prevented? Scold customers who put carts where they don't belong. Say, "HEY, that doesn't belong there."

But this isn't just for my personal benefit. One reason America is becoming so fat and lazy is that so many things are handed to people, and no matter what they do, they are never wrong. If you don't get it your way, cry like a baby and you'll get it. Leave a mess; someone else will clean it up, I'm sure they have nothing better to do. I'm not just saying this as a disgruntled employee; the concept of anyone always being right is disgusting. I feel that if this business method had never come about, it wouldn't be such a big deal to tell John Q. Customer he's wrong, and he might not even realize he's wrong. Any thoughts on the matter?