
At first it was a major part of America, as most of the Founding Fathers were deists. However, as the years passed, Christianity became more prominent.


Wiki User

14y ago
