The British gained the territory! - social studies homework... I know
During the period of 1754-1763, the British gained territory in North America by defeating the French in the French and Indian War. As a result of the Treaty of Paris in 1763, France ceded Canada and all its territories east of the Mississippi River to Britain, and Spain ceded Florida to Britain.
I know, but please explain further.
The countries that gained the most territory in Africa were European countries. Countries like Great Britain and France took over much of Africa.
Only from the Habsburg Empire.
Romania and Greece
It was the Indians.
The Mexican-American War.
The answer is Cuba.
Spain itself.
It gained California and the American Southwest.
Six countries that gained territory after World War I include France, which acquired Alsace-Lorraine from Germany; Italy, which gained territory from Austria-Hungary; Romania, which expanded its borders and gained territory from Austria-Hungary; Greece, which acquired Western Thrace from Bulgaria; Poland, which gained independence and expanded its borders; and Czechoslovakia, which emerged as a new country and gained territory from Austria-Hungary.