America did not start WW2; Germany and Japan are responsible for that. The US declared war on Japan on 8 December 1941, the day after the Japanese attack on Pearl Harbor, and on Germany on 11 December 1941, after Germany, as Japan's ally, had declared war on the US.

Wiki User · 12y ago
