
Is The Bahamas a U.S. territory, or does the U.S. own it?

Best Answers

No, The Bahamas is not a U.S. territory. It became a British Crown Colony in 1718 and gained independence in 1973. For travel there, a U.S. citizen can obtain an Enhanced Driver's License (EDL) in some states, or a U.S. Passport Card in any state.

The Commonwealth of The Bahamas is not, and has never been, a U.S. territory or owned by the U.S. We were a colony of Great Britain (the United Kingdom) from the early 18th century until July 10, 1973, when we became a sovereign nation with our own constitution and laws and the right to govern ourselves.

You may be thinking of the U.S. Virgin Islands, which Denmark sold to the U.S. under a 1916 treaty (the transfer took place in 1917). They are currently an unincorporated territory of the U.S.

Since before the U.S. existed, the Bahamas were fought over by the British, French, and Spanish. The Bahamas gained independence from the United Kingdom in 1973 but remains a member of the Commonwealth of Nations.
