Did America win World War 1?

Updated: 9/18/2023

Wiki User

13y ago

Best Answer

No. The Allied Powers won the war; the USA was one member of the alliance, along with France, Russia, and Great Britain and her colonies, all of which made huge contributions in both men and supplies.

Related questions

Women won the right to vote after America's victory in?

World War 1


Is it true America lost battles but won World War 1?

Yes.


Who won World War 1, America or Germany?

The Allied Powers (U.S.) beat the Central Powers (Germany) during World War One. Hope this helps, Nix


What resulted from both World War 1 and World War 2?

The Allied Powers won both world wars. (Allied Powers: Britain, America, etc.)


What was the US doing during World War 1, which was not fought in America?

World War 1 was not fought in America; it was fought in Europe. The Americans stayed out of the fighting for most of the war, came in late on the winning side, and then claimed that they had won the war.


Whose side won in World War 1?

When America joined Britain and the other Allies, they broke through the German lines and forced Germany to sign an armistice, so the Allies won.


When did America win World War I?

America entered World War 1 in April 1917; the Allies, including America, won when the Armistice was signed on 11 November 1918.


How did the Allies win World War 1 and World War 2?

We won.


How did America win in World War 1?

America lost the fewest men, while the other countries lost a lot more.


Which nation won World War 1?

World War I was not won by a single country; rather, it was won by the Allied forces, which comprised France, the UK, Russia, the USA, and Japan.


What pushed America into World War 1?

It's "World War 1", not "war world 1".


How was America's approach to the world after World War 2 different from its approach after World War 1?

America entered the war after Pearl Harbor.