DID THE UNITED STATES DECLARE WAR ON GERMANY FIRST YES OR NO
Germany and Italy declared war on the US on Dec. 11, 1941, after the US declared war on Japan because of the attack on Pearl Harbor on Dec. 7. The United States then declared war on Germany and Italy later that same day.
Because the United States had declared war on Japan, Germany's ally.
The US did not declare war on Germany first. Germany declared war on the US on December 11, 1941. This was four days after the Japanese attacked Pearl Harbor, and three days after the US declared war on Japan. Italy also declared war on the US that day, and the US responded with its own declarations of war against both countries. The procedure is that the president asks Congress for a declaration of war, and Congress then votes on the question.
Initially, the US intended to remain neutral in World War I, but in 1917 Germany attempted to form an alliance with Mexico (the Zimmermann Telegram) and resumed sinking US ships. These events led the US to declare war on Germany on April 6, 1917.
The US did not declare war on Germany first. Germany declared war on the US shortly after the Japanese attack on Pearl Harbor, and the US declared war in return the same day.
Unlike Vietnam and the US Civil War, which were undeclared wars, Germany and the US declared war on each other on the same date, 11 December 1941: Germany in the morning, the US in the afternoon.
If you mean in WWII, then Germany actually declared war on the US first. After the Japanese bombed Pearl Harbor in Hawaii, the US declared war on Japan, and Germany, as Japan's ally, declared war on the US on 11 December 1941. The US declared war on Germany in return later that same day. Hope that helps! =)
Germany
1917
In 1941
Besides the obvious reason: Germany had declared war on the US hours before.
Germany
December 11, 1941, immediately after Germany declared war on the US.
Germany declared war on the US after Japan carried out its attack on Pearl Harbor.
That's what I want to know!