
First, detractors and opponents of United States foreign policy often refer to it as "U.S. imperialism." But, sadly, such foreign policy is not unique, and the U.S. is not the only country to be accused of it. Throughout history, many countries have invaded other countries, set up colonies, or enslaved entire populations. Imperialism, the attempt to impose your country's power and influence on another country (often by force, but sometimes by trade policies), has been practiced by such countries as England, France, Germany, Belgium, Italy, Russia, and China, just to name a few.

When America has been accused of imperialism, it has usually been because a president wanted to establish a foreign policy favorable to America, whether other countries liked it or not. Sometimes this involved secretly helping to overthrow a leader seen as anti-American (as happened in Guatemala in the mid-1950s), and sometimes it involved starting a war: for example, when President Bush decided to invade Iraq, some of America's critics accused the president of imperialism, of invading another country to gain control of its natural resources (in this case, oil). But a number of U.S. presidents, both Republican and Democratic, have decided to become involved in foreign disputes, usually because the American government believed such actions would benefit the United States in some way.


Wiki User

13y ago
