Best Answer

America has never been known as the US.

The US (United States) is a country on the continent of North America.


Wiki User

13y ago

Q: When did America become known as The US?
Related questions

When did the US first become known as The United States of America?

The name dates to the Declaration of Independence (1776), though referring to the country as a singular noun ("the United States is") became common after the Civil War.


When did the US become known as The US of the US?

It has never been known as the US of the US. It is the USA (United States of America). The most prominent document to state the birth of the nation was the Declaration of Independence (1776), which was entitled "The unanimous Declaration of the thirteen united States of America."


When did America become known as the United States of America?

1776


When did America become the US?

1776, when the Declaration of Independence referred to the "united States of America."


What were the effects of the Spanish-American War?

Cuban independence, and America became known as a world power.


Was the US of America ever called the US of north America?

No, the United States of America was never known as the United States of North America.


When did South America become a country of the US?

It never did. South America is a continent, independent of the U.S.


Who is the president of the US?

Barack Obama was the US president at the time this question was answered.


What did the Trail of Tears become known as?

It became known as one of the worst episodes in US history: the forced removal of Native American nations from their homelands.


Why is the US known as the US?

US stands for United States, as in the United States of America.


What region in the US is known as America's Gateway?

Delaware


What sport originated in the 18th century in North America and has become known as America's Favorite Pastime?

Baseball