No. The US entered WWI to support its European allies.
Germany
After Pearl Harbor, we declared war on Japan; in response, Germany declared war on us.
Germany fought the US in World War 1.
No, Germany declared war on the US.
In World War Two, the US was supplying the Allies in Europe. Imperial Japan attacked Pearl Harbor on December 7th, 1941, and the US declared war on Japan. Because Germany and Japan were allies, Germany declared war on the US as well.
Germany... Germany sent a telegram to Mexico (the Zimmermann Telegram) saying that if Mexico allied with Germany, Germany would help it recover the territories Mexico had lost in its war with the U.S. ... The British intercepted this message and passed it to the U.S., and it was the last event that "forced" the U.S. into the war...
The United States was bombed by the Japanese at Pearl Harbor, which caused a declaration of war on Japan. Germany had an alliance with Japan and declared war on the United States, which forced the United States to declare war on Germany too. Germany also had an alliance with Italy, which caused the United States to declare war on yet another country. Then the world was at war, and the United States had fully entered World War 2.
There were many factors which moved the US toward hostilities in World War II, but the casus belli was the Japanese attack on the US fleet at Pearl Harbor on Dec 7th, 1941. The US immediately declared war on Japan, after which Germany (rather inexplicably) declared war on the US.
The US declared war on Germany hours after Germany declared war on the US on December 11, 1941.