Japan has always been independent. Unlike many other countries in Asia, Japan was never colonized by a Western power. The one footnote is that Japan was occupied by the United States after World War II (from 1945 to 1952), but military occupation is not the same as colonization.

Wiki User

9y ago