Yes
It did not. Check your dates. Hawai'i has been a state since 1959.
Hawaii was annexed by the United States in 1898 and became a territory; it did not become a state until 1959.
Hawaii
No, it was within Hawaii, a territory belonging to the United States.
Hawaii was annexed by the United States as a key territory in the Pacific.
Hawaii
The United States Congress
Hawaii
Pearl Harbor is in Hawaii, United States of America.
Hawaii
Hawaii
Hawaii belonged to the natives who inhabited it. The British explorer James Cook reached the islands in 1778, shortly after the Revolutionary War started, and Hawaii did not become a territory of the United States until it was annexed in 1898.