No, Hawaii is not a colony of the United States. In fact,
those of us who are from Hawaii tend to get pretty annoyed when people talk about
leaving Hawaii to go to the United States. We call the rest of the US "the
Mainland..."
Anyway, Hawaii is actually one of the states
of the United States. It became the 50th (and so far the last) state
of the Union in 1959.
Before that, Hawaii was a territory
(so I guess you could call it a colony) of the US. The US got it in the 1890s:
the last queen of Hawaii, Queen Liliuokalani, was overthrown in 1893, and the
islands were annexed in 1898.