Westcentrism
English
Alternative forms
Etymology
Noun
Westcentrism (uncountable)
- The practice of viewing the world from a Western perspective, with an implied belief, whether conscious or subconscious, in the preeminence of Western culture.
Meronyms
Related terms
Translations