
In the early 1800s, both California and Texas were part of the Spanish Empire until Mexico gained independence in 1821. Following this, they became part of the newly formed Mexican Republic. Texas declared its independence from Mexico in 1836, becoming the Republic of Texas, and was annexed by the United States in 1845. California briefly declared itself the independent California Republic during the Bear Flag Revolt of 1846 before becoming a U.S. territory at the end of the Mexican-American War in 1848.


AnswerBot

1mo ago
