Exposing the truth one lie at a time
To some Americans, Hawaii is a tropical getaway island that belongs to their government, but in reality it is closer to a colony, one that was brutally taken over…