The US did enter into an undeclared shooting war with Germany in the Atlantic in the fall of 1941, before Germany formally declared war on the US that December.
Germany fought the US in World War I.
At the start of US involvement in WWII, the US was attacked by the Japanese. When the US declared that a state of war existed between the US and Japan, Germany declared war on the US. Your question should be: was Germany justified in declaring war on the US?
The US did not declare war on Germany first in 1941; Germany declared war on the US as a measure of solidarity with Japan after Japan's attack on Pearl Harbor and the Philippines.
In WW 2, Germany declared war on the US.
The sinking of the Lusitania.
None. It was Japan's bombing of Pearl Harbor that forced the US into the war. And as far as the British were concerned, it was about time!
The American press initially referred to the war as the "Phony War" because very few military actions were being reported.
Germany declared war on the US; the US then declared war on Germany.
No, Germany declared war on the US.
Germany declared war on the US in the morning, and the US declared war on Germany in the afternoon on December 11, 1941.
Germany. Germany's actions enraged Americans, who wanted President Woodrow Wilson to take action in the war; he kept the US neutral for a while, but the US eventually entered the war.
Because Germany knew that America was going to get involved in the war at that point, and, as Germany was allied with the Japanese, it knew America wouldn't be on Germany's side.
I don't think that the US declared war on Germany first. Germany declared war on the US on December 11, 1941.
The US did not declare war on Germany. Germany declared war on the US shortly after the Japanese attack on Hawaii.
Germany's ally, Japan, attacked the US naval base at Pearl Harbor, sinking the USS Arizona.