The U.S. Civil War effectively ended slavery in the United States and brought the nation closer together. Before the war, the average citizen would say "the United States are" rather than "the United States is," implying that the U.S. was not yet thought of as a single nation.
