I heard something strange the other day as I was doing some of my produce shopping in the organic section of my local supermarket... "No, not those, they're organic."
Twice within a five-minute period I heard these words come from two separate mothers as their children reached for a piece of fruit or a vegetable from the organic section.
So I can't help but wonder what the fear is. I realize that organic costs more, and that in these economic times it's hard for many to see the long-term benefits past the initial increased cost, but this seemed to be about something else... almost a fear of the unknown.
Do people really think organic still means a couple of hippies somewhere out in California cultivating some vegetables on a small piece of grass next to their pot plants?
Organic these days means so much more. It means farmers caring about the food they grow and the people they sell it to. It means farmers caring about the animals they raise, and not just the quality of the meat but the quality of life those animals receive for as long as they're on this planet. It means farmers caring about the land they farm and doing their best to keep it the way nature intended, without the use of chemicals and poisons. It means both farmers and customers caring about their community and their quality of life.
Organic means many things to many people, and to me it simply means a better, healthier way of life... the only way of life!