They didn't; we declared war on them for attacking us. They "provoked" us to declare war, though we provoked them first by cutting off their oil.
The US did not declare war on Germany first. Germany declared war on the US shortly after the Japanese attack on Hawaii.
DID THE UNITED STATES DECLARE WAR ON GERMANY FIRST, YES OR NO?
Germany
1917
in December 1941
Besides the obvious, because Germany declared war on the US hours before.
Germany
December 11, 1941, immediately after Germany declared war on the US.
December 11, 1941, only hours after Germany declared war on the US.
Germany declared war on the US after Japan attacked Pearl Harbor.
That's what I want to know!