Why did Americans expand west?

Best Answer

No other direction was available. British-held Canada lay to the north and resisted American incursions during the Revolutionary War and the War of 1812. The Atlantic Ocean lay to the east and south. Travel to the moon, like building large underground cities, is not feasible even today. That left west as the only option; even the large chunks of Mexico taken in war were to the west.

The original contributor has a great sense of humor, as shown by the trips to the moon and the underground cities; the Atlantic Ocean remark was amusing as well. In truth, the answer to why America expanded west had little to do with Canada and the British.

That said, the land acquired as a result of the Mexican War, along with the Louisiana Purchase, greatly increased the size of American territory. Other reasons include the California Gold Rush and the transcontinental railroad, which linked the eastern United States to the West Coast. The federal government encouraged the railroads with generous land grants to expand the national rail system, which improved access to the western plains and the coast. Previously, wagon trains were the best, and slowest, way to reach new areas and found new communities. With help from the U.S. Army, settlers breached treaties with Native Americans, and the rich, fertile western lands were opened to settlement.

More answers

As immigration swelled America's population, jobs in the East became harder to find. Free farmland was available in the West, and American culture was built around expansion, so migrating west became common.

Related questions

What did Manifest Destiny mean?

It was the belief, held by most Americans at the time, that the country should expand from the east coast to the west coast.


Did the expeditions of Lewis and Clark and of Zebulon Pike expand Americans' knowledge of the land west of the Missouri River?

Yes :D



What was Manifest Destiny?

The sense that Americans were destined to spread west to the Pacific Ocean.


What was Manifest Destiny, and why did it make Americans move west?

People traveled west during the era of Manifest Destiny to settle areas of the country that were previously unknown to them. President Andrew Jackson's goal was to expand the freedoms known in the East Coast states throughout the rest of the nation.


Why did the people expand west?

People expanded west because they believed it was God's will for those who lived in the East to move westward, expanding to the Pacific Ocean. Many groups moved west, including the "common man," African Americans, Natives, and Mexicans (Chicanos).


What does The Missouri Compromise have to do with the west?

It helped the U.S. expand west during the era of Manifest Destiny.


How did Jefferson expand the US?

By purchasing land west of the Mississippi River (the Louisiana Purchase) and forcing Native Americans to move west.


What evidence was there that Americans might expand into British North America?

Evidence that the Americans might expand into British North America was the fact that, after the Civil War, they continued moving westward, following their belief in Manifest Destiny. The Americans had conquered lands controlled by Mexico, Spain, France, and Great Britain. The British North American colonies were afraid that the Americans would take over their western lands and invade again, as they had during the American Revolution and the War of 1812. There was also a possibility that, if Americans migrated into the North-West of British North America, they would gain a majority large enough to annex the territory.


Why did some Americans want to take control of islands in the Pacific?

To expand.


Did Americans have the right to expand across the U.S.?

Yes, it was seen as their American right.


Why did settlers move west?

Many Americans bought into the idea of "Manifest Destiny": the belief that Americans were supposed to expand the country and acquire new territory. While this sounded good in theory, it often meant taking land away from the people already living there, usually the Indians (today called Native Americans). Other Americans moved west simply because it was there; they liked the adventure and challenge of establishing a new town or city, and they were excited to be the first people in that place.