Q: Is it true all Hawaiian natives approved of the U.S. annexation of Hawaii in 1898?

No, not all. Some Native Hawaiians did see it as beneficial to align with, and become part of, the United States. Others, however, saw it as a forcible takeover of their native lands. Today it is very unpopular for Native Hawaiians to speak in support of the annexation. Activists hold that it was wrong and are seeking reparations, with some even advocating breaking away from the United States altogether. It would seem naive, however, to believe that had the U.S. not had a presence in the Hawaiian Islands, another country would not have "annexed" them itself.

Wiki User · 7y ago

More answers

No.

Wiki User · 12y ago
