The colonization of the American West primarily occurred during the 19th century, especially following the Louisiana Purchase in 1803 and the westward expansion spurred by events like the California Gold Rush in 1849. The Homestead Act of 1862 further facilitated settlement by offering land to settlers. By the late 1800s, the West was largely populated and integrated into the United States, although the impact on Indigenous peoples was profound and often devastating.
