It was actually Japan that brought the U.S. into the war. After Japan bombed Pearl Harbor, the United States declared war on Japan.


Wiki User · 13y ago
