When did the US become a country?

Anonymous · 13y ago · Updated: 4/27/2023

The U.S. officially became a country in 1776, when it declared independence from England. The signing of the Treaty of Paris in 1783 then solidified that independence, as England formally recognized the United States as a sovereign nation.
