Well, generally speaking, the United States' territories included Puerto Rico, the Philippines, and the Hawaiian Islands. There were a few others, but I can't exactly remember where else the US had a large influence; in the Philippines and Hawaii, though, it had huge influence over the people there.
Ha!
Imperialism: He supported American control of territories
The territories that the US gained as a result of the war were Puerto Rico, Guam, and the Philippines; Cuba gained its freedom.
In the Spanish American War.
For the most part, the US bought the territories (for example, paying Spain $20 million for the Philippines). This allowed the US to expand to other parts of the world.
There is no 1998 Treaty of Paris; the treaty that ended the war was the Treaty of Paris of 1898.
Imperialism is important because it was a way in which countries gained power and improved their economic situation. Although it was important, it had bad consequences in some territories, since the people there were conquered by others. Imperialism led to many wars, and many people were killed.
They were ceded to the US by Spain after the Spanish American War.
Older forms of Imperialism were more concerned with establishing colonies in foreign territories.
The newer form, by contrast, was less concerned with conquering and governing territories.
To gain more land and power by being very strategic.
Yes, Imperialism refers to the policy in which strong nations extend their political, military, and economic control over weaker territories.
Imperialism