Is The United States An Empire?
A look at US holdings around the world
Some people call the United States an empire in a derogatory way. They rant about American imperialism and claim that the United States dominates the world in an unhealthy way. Such charges often come up in radical political rhetoric or in opinion pieces that take aim at America's place on the world stage. But are they accurate? Is the United States an empire? Let's look at the definitions, some historical context, and the extent of American power in the world, and come to a verdict.
The Definition of Empire
According to Britannica, the definition of empire is as follows:
Empire, major political unit in which the metropolis, or single sovereign authority, exercises control over territory of great extent or a number of territories or peoples through formal annexations or various forms of informal domination.
Merriam-Webster defines empire as:
a major political unit having a territory of great extent or a number of territories or peoples under a single sovereign authority
especially : one having an emperor as chief of state
From these two comprehensive definitions, we see that an empire requires two things and…