When did the US gain its rights?

Anonymous

17y ago
Updated: 8/16/2019

The first ten Amendments to the Constitution, known as the Bill of Rights, were adopted on December 15, 1791. These are rights guaranteed to the people. The United States itself "gained its rights" when the thirteen Colonies declared their independence on July 4, 1776; the Constitution was not ratified until 1788.

Wiki User

17y ago
