I have recently been watching a documentary series on America in the '70s. Overall, the '70s were a turbulent time. Change was frequent, and people called out the injustices in society. The Women's Liberation movement gained an enormous following due to the call to pass the Equal Rights Amendment. The Gay Rights movement gained traction after the Stonewall Riots and saw the election of Harvey Milk, the first openly gay man elected to public office in California, as a San Francisco city supervisor. The '70s also saw the development of the Sexual Revolution, which challenged the monogamous societal standard. However, the growing popularity of these movements also brought about strong opposition. The Republican Party began to paint a picture that these new social movements were attacking religious groups. Religion and politics became inseparable, and people like Anita Bryant understood how to work that new standard. Many of the Evangelicals I have met like to point to the '70s as the "end of times" due to the origins of Gay Rights, Women's Lib, and the Sexual Revolution. However, I wonder how the perception of these movements would have changed if religious groups had never been told to oppose them. Religiosity in America has been declining for a long while, and I wonder if that would be different if religious groups had shown love rather than hate during the '70s. How would the religious body of America change if religion had never gotten involved in politics?
I personally believe that if religion had never been involved with politics, it would not be in decline. Religion's affiliation with politics has painted a negative image of religious groups. They are viewed in society and the media as hateful and close-minded. Calling oneself "religious" is almost synonymous with saying "homophobic" or "prudish," and makes one automatically affiliated with conservatism. This is an obvious deterrent to those considering joining a religious group. I believe that religion's connection with politics is a large contributor to its decline.