What is Americentrism?
1.
The morally and fundamentally correct elitist belief that America is by far the single greatest nation in the history of the world.
"The fact that we have saved the world on multiple occasions from certain destruction is a testament to a true belief in Americentrism."