World War II had many effects on the United States. Here are some:

The United States became a major world power both economically and militarily.

Demand for war production ended the economic depression of the 1930s.

The need for labor during and after the war reduced some aspects of religious, racial, and sex-based discrimination.

Wiki User · 16y ago
