I assume by "the colonies" the question refers to the Thirteen Colonies which became the United States. The document that recognized the independence of the United States was the Treaty of Paris of 1783.
The Treaty of Paris says that England gave up her American colonies and any claims to land east of the Mississippi. The colonies were now free and independent states.
A big one
Deutschland (Germany)
The war and the Treaty of Paris.
The Treaty of Paris, the peace treaty between the United States and Britain (preliminary articles signed in 1782, final treaty in 1783).
I assume by "the colonies" the question refers to the Thirteen Colonies which became the United States. The document that recognized the independence of the United States was the Treaty of Paris of 1783.
The Treaty of Paris says that England gave up her American colonies and any claims to land east of the Mississipi. The colonies were now free and independent states.
A big one
Treaty of Paris
It was in 1783, with the Treaty of Paris.
The Treaty of Paris
The Treaty of Paris was the treaty whereby Great Britain recognized the American colonies as an independent nation. This was well after the 1776 publication of the Declaration of Independence.
Treaty of Paris, 1783.