The USA did not have to declare war on Germany first, because Adolf Hitler declared war on the USA.
The USA didn't declare war on Germany first. Germany declared war on America on December 11, 1941. But what would have happened if Germany hadn't declared war on America? Would America still have come to join the war in Europe? Interesting.
Germany broke the Munich Agreement with Britain and France by occupying the rest of Czechoslovakia, and when it then invaded Poland, both of those countries declared war. After Pearl Harbor, America declared war on Japan, which had declared war on America; that led Germany to declare war on America, and America to declare war on Germany.
Why did Germany declare war on France?
April 6, 1917 (the date the USA declared war on Germany in World War I).
Japan. Then Germany declared war on the US.
Britain warned Germany that it would declare war if Germany attacked Poland. When Germany invaded Poland, the British had no choice but to declare war.
America declared war on Japan. Germany then declared war on America.
Japan, bro. Japan declared war on America, not Germany. Canada declared war on Germany on the 10th of September 1939.
Declaring war on the USA after Japan declared war on America. Germany declared war on America hoping Japan would in turn declare war on Russia. Since Japan did not declare war on Russia (Russia would later declare war on Japan), Germany did not have to declare war on America. If Germany had not declared war on the USA, America would have had difficulty declaring war on Germany (although US and German naval vessels in the North Atlantic were already shooting at one another in 1941). If Germany had not declared war on the USA, Hitler would not have had to face American men and equipment. Although Russia fought the majority of the land battles against Germany, it was American men and equipment that helped Russia, Britain, and the other Allied nations defeat Germany.
Germany was attacking merchant ships in the Atlantic, but the USA did not declare war on Germany until it was drawn into the war by the Japanese attack on Pearl Harbor; the attacks on merchant shipping had little to do with the declaration of war.