The United States joined the Allies after Japan attacked Pearl Harbor, Hawaii, on December 7, 1941. Before that, the United States was neutral, trying to stay out of the war in hopes of minimizing casualties.

Wiki User

19y ago
