How did Hawaii become American?

Updated: 12/19/2022

Wiki User

12y ago

Best Answer

During the late 1800s the United States was expanding its territory, and Hawaii was valuable for its sugar plantations and its strategic location in the Pacific. In 1893, American businessmen and planters overthrew Queen Liliuokalani, and the United States formally annexed Hawaii in 1898. Hawaii later became the 50th and last state admitted to the Union, in 1959.
