
Did America ever lose a war?


Top Answer
Wiki User
Answered 2011-09-12 15:48:18

Yes and no. We lost Vietnam, but it was Vietnam's war, not ours. If it had been our war, we would have been more determined, because more would have been at stake; as it was, America wasn't really affected by the loss.

Related Questions


America didn't really lose the war. It was never America's war to lose.


Yes, they did; in fact, dozens.


No one. There has never been a president of America. Nor has America ever been at war with England.



They didn't want to get involved in the war and lose their people.


How can a war lose? You can lose a war, or someone can win it, but a war itself can't lose.


Actually, we weren't really in the war; we just helped out the side fighting for freedom in Vietnam.


The Civil War, in which the Northern states fought the Southern states to end slavery. America lost more men in this war than in any of its other wars combined.


The Americans didn't lose the Civil War, because the Northern states were fighting the Southern states over whether there should be slavery in North America.


France lost her possessions in North America during the French and Indian War, which ended in 1763.


To end the war and to destroy Japan's means of ever making war again.


America is a continent. As a unit it has never been to war, though parts of it have.


The reason they lost a colony in America was the American Civil War (1861-1865); then England had a war with America, and that war ended on the 4th of July, which is where we get Independence Day from. Independence Day is also a film.



They didn't want to lose money, people, or resources.


The French obviously lost, because the only war the French have ever won is their civil war.


Yes; while the war was going on in America, France and Spain declared war against the British in England.


America didn't lose, because the war never officially started or ended. America achieved the goal of keeping South Korea free from communism, which is why the U.S. went in.


Two: the Confederate States of America lost to the United States of America, and America did not win its war against North Vietnam. American officials and some citizens prefer to say "didn't win the Vietnam War" rather than "lost the war." Technically we were not a party to the Vietnam conflict, but yes, we did lose. We also lost during the Russian Revolution.


America didn't lose it. President Carter gave it away.


Probably because the UNITED States of America was fighting with itself.


The French and Indian War ended with the peace treaty in which France lost all its land in North America.


Its scientists. During World War 2 they fled to America; otherwise Germany, not America, would be the strongest and greatest country in the world.