Westcentrism
English
Noun
Westcentrism (uncountable)
- The practice of viewing the world from a Western perspective, with an implied belief, whether conscious or subconscious, in the preeminence of Western culture.