Q: Where did US gain territories in imperialism?

Best Answer

Generally speaking, the United States gained territories in Puerto Rico, the Philippines, and the Hawaiian Islands. There were a few others, but in the Philippines and Hawaii especially, the US had a huge influence over the people there.

Related questions

Which word best fits Beveridge's views in his speech?

Imperialism: He supported American control of territories.


What territories did the US gain as a result of the war?

The question likely refers to the Spanish-American War, after which the US gained Puerto Rico, Guam, and the Philippines from Spain.


How did the US gain land through imperialism?

Primarily through the Spanish-American War.


How did the US gain territories?

For the most part, the US bought its territories. This allowed it to expand into other parts of the world.


Imperialism: What did the US gain from the Treaty of Paris 1998?

There is no 1998 Treaty of Paris; the question presumably means the Treaty of Paris of 1898, under which Spain ceded Puerto Rico, Guam, and the Philippines to the US.


What is the historical importance of imperialism?

Imperialism is important because it was a way in which countries gained power and improved their economic situation. Although it was important, it had bad consequences in some territories, since their people were conquered by others. Imperialism led to many wars, and many people were killed.


How did the US gain the territories of the Philippines and Puerto Rico?

They were ceded to the US by Spain after the Spanish-American War.


How was 19th-century imperialism different from older forms of imperialism?

Older forms of imperialism were more concerned with establishing colonies in foreign territories.


How did European imperialism following the Industrial Revolution differ from older forms of imperialism?

It was less concerned with conquering and governing territories.


Why did the US want overseas territories?

To gain more land and power, and because overseas territories were strategically valuable.


What is the policy in which strong nations control weaker countries or territories?

Imperialism refers to the policy in which strong nations extend their political, military, and economic control over weaker territories.


What is the policy in which strong nations control weaker countries or territories?

Imperialism