eli5 What does the term “western” mean? Is it North America, Europe, Australia, and New Zealand?


I have been seeing people on the internet saying Western Clothes, Western Media, Westernizing, Western Movies, Western Nations. What does the term “Western” mean? Can anyone explain it to me like I am 5 years old?


10 Answers

Anonymous

The idea of “The West” has meant different things to different people. That said, it generally refers to Western Europe and its majority-white former colonies: the USA, Canada, New Zealand, and Australia. Clothes, media, and movies that originated in these places are often referred to as Western. Western Europe generally means the following countries and everything west of them: Italy, Austria, Germany, Sweden, and Norway.

During the Cold War, “The West” and “The East” referred to the two sides of the conflict. During colonial times, the term was used by those who considered themselves ideologically descended from the ancient Greeks. While within Europe places like Poland and Hungary may consider themselves firmly “not Western,” in the rest of the world it is often used as a proxy for whiteness and/or (neo)colonialism.

These countries are considered Western by some:

Mexico, South Africa, Finland, the Czech Republic, the Baltic states, Slovenia, Greece, and Japan.
