John Cole has written a brilliant post about the negative influence of fundamentalism and conservative Christianity on US society (via). An excerpt (links removed):
But from where I stand these days, the only thing I see religion doing in the public sector is gay bashing and telling women, mostly poor and desperate and in deplorable financial and personal situations, what to do with their bodies. I see busybodies deciding what drugs they can dispense to which customers, or deciding that they don’t have to issue a marriage license because some petty deity that I don’t believe in told them to hate their fellow citizens and ignore the law. In a country in dire financial straits but still spending billions and billions of dollars on education, I see religious folks actively and openly working to make our schoolkids dumber. I see them shooting people who provided a medical procedure, and I see others rummaging through people’s personal lives to find out who hasn’t lived up to the word of God. I see glassy-eyed fools running for President claiming that vaccines that save lives actually cause cancer, or that if you get raped and are pregnant, you should just lie back and think of Jeebus and make the best of a bad situation. In fact, everywhere you look these days, if Christianity or religion is getting a mention, it means something ugly is happening and someone somewhere is being victimized, marginalized, or otherwise abused. Go read some of the arguments against integration and you’ll see the same bible verses used today against homosexuals. Fifty years from now, they’ll be recycling them again to trash someone else they don’t like or who isn’t good enough for them.