
Christianity came to the US from?


Anonymous

15y ago
Updated: 8/18/2019

From all the European countries whose settlers colonized this land. In those times virtually everyone in Europe was Christian, so "everywhere in Europe" is a fair answer.


Wiki User

15y ago
