Q: Did the US win or lose land after World War 2?

Best Answer

The U.S. did not gain any new territory for itself during or after World War II. After the war, the United Nations placed several Pacific island groups under U.S. administration as a trust territory rather than as outright possessions, and shortly after the war the U.S. fulfilled its promise to grant the Philippines independence.

Wiki User, 13y ago