
No, the United States was attacked only at Pearl Harbor, by the Japanese, which marked the beginning of U.S. involvement in World War 2. The U.S. declared war on Japan the following day and on Germany a few days later, after Germany declared war on the United States.


Wiki User

14y ago
