When was the US officially a country?

Pinning down when a country officially became a country matters for questions like this. The United States declared independence in 1776, but it officially became a country when it was finally recognized by foreign powers, with Great Britain formally acknowledging its sovereignty in the Treaty of Paris in 1783.