Yes
Yes.
Germany.
We entered the war in 1941 after the attack on Pearl Harbor. After we declared war on Japan, Germany and Italy declared war on us, officially bringing the United States into WWII.
The United States Congress officially declared war on Germany on April 6, 1917. President Woodrow Wilson asked for a declaration on April 2.
Have you ever heard of Hitler?
If you mean an apology, why would one be needed? It was war.
Germany declared war on the US; the US then declared war on Germany.
No, Germany declared war on the US.
It was when Japan attacked Pearl Harbour without warning. America declared war on Japan. A few days later, Adolf Hitler and Nazi Germany declared war on America. America officially declared war on Japan and Germany and was able to come to the aid of Britain and her Allies in our hour of need, not only in Europe but in Asia as well.
Britain and the US had been allies for quite a while by WWII. The US declared neutrality at the beginning of the war, but after the Japanese bombing of Pearl Harbor, the United States declared war. The United States officially declared war on Japan on December 8, 1941, and on Germany and Italy on December 11, 1941, thus entering the Allied Forces alongside the UK and the USSR.
Germany declared war on the US in the morning, and the US declared war on Germany in the afternoon on December 11, 1941.
yup!