
Did Pearl Harbor change anything?


Anonymous

15y ago
Updated: 8/17/2019

Pearl Harbor dramatically changed the United States' stance on World War II. The United States had wanted to remain neutral and stay out of the war, but after Japan attacked Pearl Harbor on December 7, 1941, the country changed course and declared war.


Wiki User

15y ago
