I assume by "the colonies" the question refers to the Thirteen Colonies which became the United States. The document that recognized the independence of the United States was the Treaty of Paris of 1783.
The Treaty of Paris says that England gave up her American colonies and any claims to land east of the Mississippi. The colonies were now free and independent states.
The Treaty of Paris, which was signed on September 3, 1783.
Treaty of Paris
The signing of the Treaty of Paris in 1783 solidified the United States' independence from England; the U.S. had declared itself a country in 1776.
The Treaty of Paris.
The Treaty of Paris was signed in 1783. Its purpose was to have the British recognize that the colonists were free and independent.
Because it was independent.
Treaty of Paris (1783)
Treaty of Paris
England did not acquire Louisiana; under the 1783 treaty, Britain gave up its claims to the land east of the Mississippi, while Louisiana remained under Spanish control.
True