Q: How did the Civil War impact the United States?

Best Answer

It helped during a time that cost lots of people their lives.

Wiki User ∙ 11y ago

More answers

Wiki User ∙ 9y ago

As Shelby Foote once said, it defined us as a nation, both good and bad. {paraphrased}

Related questions

What was Abraham Lincoln's impact on the world?

He unified the United States after the Civil War and abolished slavery.


When was The Civil War in the United States created?

The Civil War in the United States was created in 1937.


What did the US do in the Civil War?

The Civil War was a war within the United States. It was fought between 1861 and 1865 between the North and the South.


When was the United States Civil War?

The Civil War took place in 1861–1865.


When will there be a civil war in the US?

There is no chance that there will ever be a civil war in the United States.


Who was the US enemy in the Civil War?

The United States of America fought the Confederate States of America in the Civil War.


Was the War of 1812 part of the Civil War?

No. The War of 1812 was between the United States and Britain. The Civil War was between the North and the South of the United States. It occurred between 1861 and 1865.


What country had a civil war?

United States of America


What continents were in the Civil War?

We need to know which civil war you are talking about. The United States Civil War was fought on the continent of North America.


How did the railroad affect the post-Civil War expansion of the United States?

After the United States Civil War, railroads penetrated the vast majority of the country's regions and industries.


What impact did the Civil War have on us today?

The Civil War was a huge change in American history. If the North hadn't won the Civil War, or if the Civil War had never occurred, blacks would probably still be fighting for equal rights! The Civil War was one of the major wars that changed the United States of America...


Turning point of the Civil War?

The Civil War was a turning point in the history of the United States. Before the Civil War, the country was divided in its attitudes, exemplified by the phrase "the United States are"; after the war, people said "the United States is," showing a unity that had not previously existed.