Following the attack on Pearl Harbor on December 7, 1941, the United States declared war on Japan. Because Germany and Japan were allies under the Tripartite Pact, Germany declared war on the US on December 11, 1941, and the United States, in turn, declared war on Germany the same day.

Wiki User

10y ago