No. The alliance that included the USA won, along with France, Russia, and Great Britain and her colonies, who also made huge contributions in both men and supplies.
America did fight in World War 1; it entered the war in 1917.
America mobilized a large share of its population during World War One. After the war, the 1918 influenza pandemic infected about one fifth of the world's population.
World War 2
America contributed to World War 1 before 1917 by supplying the warring countries with military goods. This led to an economic boom that sustained America's war spending when it eventually entered the war.
World War 1
yes
The Allied Powers (which included the U.S.) beat the Central Powers (led by Germany) during World War One. Hope this helps, Nix
The Allied Powers won both world wars. (Allied Powers: Britain, America, etc.)
World War 1 was not fought in America; it was fought in Europe. The Americans came in late in the war on the winning side, and then said WE WON THE WAR.
When America joined Britain, the Allies broke through the German lines and forced Germany to sign an armistice. So therefore we won :)
We won
We lost the fewest men, while the other countries lost far more.
World War I was not won by a single country; rather, it was won by the Allied forces, which comprised France, the UK, Russia, the USA, and Japan.
It's World War 1...
America entered World War 2 after the attack on Pearl Harbor; it entered World War 1 in 1917.