The term "American century" refers to the belief that the 20th century was dominated by the United States in terms of political, economic, and cultural influence on a global scale. It signifies America's rise to superpower status, especially after World War II, and its role in shaping the world order during that era.
