A: The United States proudly began as a nation in which religion and state were held to be separate and in which a person could be respected whether or not he held any religious beliefs. Since around the time of World War I, however, the United States has become a more religious society, at least in the narrow senses of formal religious affiliation and of imposing religious standards and beliefs on society in general. A very high proportion of the population now claims to hold religious beliefs, mainly Christian or Jewish, with a growing Muslim presence. Whereas Americans had previously been relatively open to accepting Charles Darwin's theory of evolution by natural selection, a majority of Christians closed ranks behind a biblical account of creation. Those considered by many to be religious fundamentalists seek to impose their views on marriage, birth control, and abortion. Religion has thus become a pillar of American society, although this trend seems to have moderated in recent years.

In other parts of the Western world, the process has run very much in reverse. In some countries, evolution is accepted largely without comment as the explanation for how life developed. Birth control and abortion are widely considered rights that lie outside the domain of religious debate. Religious belief itself is declining, so the churches can no longer plausibly claim to represent the people as a whole; some European countries report that fewer than half the population holds religious beliefs. In many Western countries, then, religion is no longer a pillar of society.

Wiki User

12y ago