Yes, the U.S. did claim to be neutral. We were not involved in the war other than profiting from it: American factories made enormous amounts of money supplying weapons for the war. Shortly before Pearl Harbor, however, we stopped selling to Germany because it had been violating the Treaty of Versailles, which had been signed at the end of WWI.


Wiki User

13y ago
