Another post from me today, on a totally different topic than the last one.  I think it’s high time that America as a whole realized the absolute importance of nature to the health and well-being of both our country and our economy.  If this recent fiasco in the Gulf has shown us anything, it’s that we really don’t take our natural resources all that seriously.  Oh, we take them for granted, but that’s not the same as taking them seriously, by which I mean according them the respect and consideration that we should.  We have come to think of the natural world as something we can tame, rather than taking into consideration the fact that there are forces at work in Nature other than us.

I was reading an article in Harper’s the other day that discussed the fact that all too often we in the modern world think that we have complete control of Nature, that through our own ingenuity we have managed to bring the whole natural world under our power.  To me, that is merely hubris, a foolish bit of pride that ends up creating messes like that in the Gulf, or any one of the many mine disasters that have plagued our coal fields.  When will we understand that we disrespect Nature at our own peril, and that doing so is going to create messes that will be very expensive and difficult to clean up?

Now, I’m not saying that we should completely do away with our modern culture.  Even being the hopeless idealist that I am, I still realize we’ve gone too far to ever completely turn the clock back.  Still, I think that a little respect for the grand dame, Mother Nature herself, wouldn’t be impossible.  We should learn to live with Nature rather than trying to oppress it all the time.  I truly think we’d be a happier and more balanced culture, but that’s just me.  What can I say?  I’m an idealist.