Did the United States declare war on Germany first, yes or no?
Germany didn't declare war on anyone in 1915; it was actually Britain that declared war on Germany, in August 1914.
The US did not declare war on Germany first. Germany declared war on the US shortly after the Japanese attack on Pearl Harbor, Hawaii, and the US reciprocated with its own declaration the same day.
The USA did not have to declare war on Germany first, because Adolf Hitler declared war on the USA on 11 December 1941.
The Americans declared war on Germany in April 1917, during the First World War.
The English did not declare war on Germany; Great Britain declared war on Germany in 1914, because Germany had invaded other European countries, notably Belgium, whose neutrality Great Britain was bound by treaty to defend.
No.
Germany.
In response to Germany's invasion of Belgium.
Britain was the first to declare war on Germany, and New Zealand and the rest of the Commonwealth wanted to help Britain defeat Germany.