The U.S. supplied Britain with weapons through the Lend-Lease program starting in early 1941, including planes, ships, and guns. After the United States naval base at Pearl Harbor was bombed, the Americans entered the war with troops, fighting in both Europe and the Pacific. American troops helped regain British colonies in North Africa, and American and British forces together invaded Normandy on D-Day. So, the U.S. was a major element in winning World War II.

Wiki User

13y ago