No other direction was available. British-held Canada was to the north and had resisted American incursions during the Revolutionary War and the War of 1812. The Atlantic Ocean lay to the east and south. Travel to the moon, like building large underground cities, is not feasible even today. That left west as the only option. Even the large chunks of territory taken from Mexico in a few wars were to the west.
The original contributor has a great sense of humor, as shown by the trips to the moon and the underground cities; the Atlantic Ocean line was amusing as well. The answer to why America expanded to the west, however, had little to do with Canada and the British.
But yes, the land acquired as a result of the Mexican War, along with the Louisiana Purchase, greatly increased the size of American territory. Other reasons include the California Gold Rush and the transcontinental railroad, which linked the eastern part of the USA to the West Coast. The federal government encouraged the railroads with generous land-grant deals to expand the national railway system, which improved access to the western plains and the West Coast. Previously, wagon trains had been the best, yet slowest, way to reach new areas and found new communities. With help from the US Army, settlers breached treaties with Native Americans, and the rich, fertile western lands became available.
With the increase in America's population due to immigration, jobs in the East became more difficult to find, free farmland was available in the West, and American culture was built around expansion, so it became common to migrate west.
People traveled west during the era of Manifest Destiny to settle the largely unknown areas of the country. It was the goal of presidents such as James K. Polk, under whom the phrase "Manifest Destiny" was coined, to extend the freedoms known in the East Coast states throughout the rest of the nation.
Purchasing land west of the Mississippi River and forcing Native Americans to move west.
Many Americans bought into the idea of "Manifest Destiny" -- the belief that Americans were supposed to expand the country and acquire new territory. While this sounded good in theory, it often meant taking land away from the people who already lived there -- usually the Indians (today called Native Americans). Other Americans moved west simply because it was there: they liked the adventure and challenge of establishing a new town or city, and they were excited to be the first settlers in that place.
Native Americans in the west.
Manifest Destiny
It was the belief most Americans held at the time that the country should expand from the East Coast to the West Coast.
Yes :D
The sense that Americans were destined to spread west to the Pacific Ocean
People expanded west because they believed it was God's will for those living in the East to move westward, expanding to the Pacific Ocean. Many groups moved west, including the common man, African Americans, Natives, Mexicans (Chicanos), etc.
It helped the U.S. expand west in the middle of the Manifest Destiny era.
Evidence that the Americans might expand into British North America was the fact that, after the Civil War, Americans moved westward, following their belief in Manifest Destiny. The Americans conquered lands controlled by Mexico, Spain, France, and Great Britain. The British North American colonies were afraid that the Americans would take over their western lands and invade again, as they had during the American Revolution and the War of 1812. There was also a possibility that, if enough Americans migrated into the North-West of British North America, they would form a majority able to annex the territory.
To expand.
Yes, it was considered their American right.