The Treaty of Paris was more favorable to the United States.
I assume by "the colonies" the question refers to the Thirteen Colonies which became the United States. The document that recognized the independence of the United States was the Treaty of Paris of 1783.
The Treaty of Paris was the peace treaty between the United States and Britain; preliminary articles were signed in 1782, and the final treaty in 1783.
The United States of America ...
The two main points of the Treaty of Paris were, first, to acknowledge the United States of America as its own free, self-governing country, and second, to establish clear borders and boundaries between the U.S. and Britain.
The Treaty of Paris, signed in 1783, recognized the independence of the United States and defined its boundaries. The treaty established the borders of the new nation, extending from the Atlantic Ocean to the Mississippi River and from Canada to Florida. This agreement formally ended the American Revolutionary War and acknowledged the sovereignty of the United States.
The Treaty of Paris formally ended the American War for independence. The two countries that signed the Treaty of Paris were the United States and Great Britain.
The United States paid $20 million as set out in the Treaty of Paris (that payment was under the Treaty of Paris of 1898, which ended the Spanish-American War, not the 1783 treaty).
The Treaty of Paris of 1783 said that Britain recognized the United States as an independent country and officially ended the American Revolution. (America's alliance with France came from the separate 1778 Treaty of Alliance; France signed its own peace with Britain at the same time in 1783.) The boundaries of the US were greatly expanded by the treaty: the new nation gained the lands west to the Mississippi River and began spreading beyond the original colonies. Hope this helps =)
The Treaty of Paris was a triumph for the Americans: Great Britain recognized the United States as an independent nation.