When did the US become a country?

The U.S. officially became a country in 1776, when it declared independence on July 4 of that year. The signing of the Treaty of Paris in 1783 then formally recognized its independence from Great Britain.