Most Americans don't think of the United States as an empire. We're a "republic," the good guys; empires are bad. But is the United States an empire? Short answer: yes. In this episode of The Brion McClanahan Show, I discuss a new book on the topic and why this is true.
https://mcclanahanacademy.com
https://brionmcclanahan.com/support
http://learntruehistory.com