Because the US was following the Neutrality Acts.
The US did not declare war on Germany first. Germany declared war on the US shortly after the Japanese attack on Pearl Harbor, Hawaii.
Did the United States declare war on Germany first, yes or no?
Germany
1917
In December 1941.
Besides the obvious, because Germany declared war on the US hours before.
Germany
Hitler declared war on the USA on 11 December 1941, during a Reichstag speech in Berlin.
December 11, 1941, immediately after Germany declared war on the US.
December 11, 1941, only hours after Germany declared war on the US.
Germany declared war on the US after Japan carried out its attack on Pearl Harbor.