English

Alternative forms

Etymology

West + -centrism

Noun

Westcentrism (uncountable)

  1. The practice of viewing the world from a Western perspective, with an implied belief, whether conscious or subconscious, in the preeminence of Western culture.

Meronyms

Related terms

Translations