California and the American Southwest.
Before it became a state, California was Spanish and later Mexican territory until the US claimed it for itself, so yes.
The United States annexed Texas peacefully in 1845; however, this provoked anger from the Mexican government, which still did not recognize the independence of the Republic of Texas and claimed it as its own territory, albeit a "troublesome one." As a result of the annexation, the Mexican-American War broke out, and through that war the United States gained California and the other southwest territories.
It gained California and the American Southwest.
Spain claimed the territory of Florida.
The war with Mexico resulted in California and the southwest territory becoming part of the United States.
Mexican-American War
Santa Fe
Fighting began when the king of France tried to take territory in southern France that England also claimed.
From 1821, when Mexico won its independence, until 1848, when it lost the territory to the United States after the Mexican-American War (1846-1848).