Changing Role of Religion in American Public Life

Name:
Institutional affiliation:

Changing Role of Religion in American Public Life

Religion refers to the belief in or worship of a powerful supernatural being, especially a personal God or gods. Christianity has long been the major religion in the United States, with Evangelical Protestants forming the largest share, followed by Roman Catholics. Other religions include Islam, the Black Muslim movement, Black Protestant traditions, Hinduism, Judaism, and Buddhism, among many others. The main role of religion was to guide the morals of its followers in accordance with the will of the Supreme Being. However, the role of religion in American public life is changing drastically. This paper highlights some of these changes.

Between the first and the nineteenth centuries, it was commonly believed that God had delegated his authority to secular leaders. There was therefore only a thin line between religion and the state. The Roman Catholic Church enjoyed huge influence over many European governments, including those of France, Italy, and Spain. The Anglican Church, on the other hand, was the state church of the United Kingdom (Djupe & Gilbert, 2009).