No. The U.S. bombed Japan largely as a result of the attack on Pearl Harbor. Relations between the U.S. and Japan were already strained during the Depression years, particularly over Japanese expansion in Asia, and the Pearl Harbor attack pushed the U.S. into open war with Japan, which led to the bombing campaigns against it.


Wiki User

16y ago
