
The Civil War was a turning point in the history of the United States. Before the war, the country's divided attitudes were reflected in common usage: people said "the United States are." After the war, people said "the United States is," reflecting a sense of national unity that had not previously existed.


Wiki User

12y ago
