I was channel surfing today, and on one of the cable news channels these folks were debating whether the country should start laying down some laws and moving toward organic foods, lifestyles, farms, etc. Now I only caught the tail-end, but I thought it was pretty cool. Anyway, what do you all think? Organic or no?

I worked, for a brief time, at a Whole Foods Market. Before that I had been seeing a holistic doctor (an herbologist) and was already pretty into the all-natural/organic lifestyle. However, it was the knowledge I picked up working at WFM that led me to believe going organic would be quite a bit better, not only for us humans, but also for our environment. So, that said, I am for it.

Mind you, organic DOES NOT mean you can't still have all your goodies. It's actually quite surprising how many healthy sweets (if anyone gets that) are out there.

Discuss.