Yes. The United States remained neutral at the beginning of WW2, just as it did at the beginning of WW1.
While neutral, the United States was initially barred by its Neutrality Acts from selling arms to countries at war.