What brought the US into World War II?

Anonymous

15y ago
Updated: 8/18/2019

Technically, the US was already aiding the Allied war effort in Europe, but it did not formally declare war on Japan and Germany until after the Japanese attacked Pearl Harbor, Hawaii, on December 7, 1941.

Wiki User

15y ago