
What brought the US into World War II?


Anonymous

16y ago
Updated: 8/18/2019

Technically, the US was already helping with the war in Europe, but it did not declare war on Japan and Germany until after the Japanese attacked Pearl Harbor, Hawaii, on December 7, 1941.


Wiki User

16y ago
