Q: The belief that the US has the right to expand is called what?

Best Answer
The belief that the United States was destined, even mandated by God, to expand across the North American continent from coast to coast was known as Manifest Destiny. Although this was a popular belief among the people of the newly formed nation, it was not an actual right granted to them.
