After WW1, what did Japan gain control of?

Best Answer

Japan, which fought on the Allied side, gained control of Germany's former Pacific colonies north of the equator (the Marshall, Caroline, and Mariana Islands, administered as the South Pacific Mandate under the League of Nations) as well as the former German concessions in China's Shandong Province, including Tsingtao. Japan lost territory after WW2, not WW1.
