Civil War, I think. Nope, it was the Revolutionary War: "The British are coming!"
The USA became a free country.
Well, it separated the United States from the rule of Great Britain. That's why they call it "Independence Day".
The US Civil War made the country weaker; it created racial and geographical resentments that linger to this day.
The US is a mostly free country; a few states are exceptions.
WWII was the last declared war fought by the US.
It really made us proud of our country and made us want to help the war effort.
The Civil War made us an anti-slavery country. It also preserved the Union; that is, the seceded states were restored to statehood.
There was no US declaration of war during the US Civil War (American Civil War), and there has been no US declaration of war for any war after December 1941.
They fought for our country, making it free. If you know a person serving in the military or something like that, please thank them for their service.
The Constitution established that the states were free and that people had a choice in what happens in the country.
If we are talking about World War Two and the side of the US, Japan made the first attack, launching the country into the war. It was their error, as it brought total destruction for the Japanese.
The US is NOT at war with any specific country (nation). The US is hunting down terrorists.