eli5 What does the term "Western" mean? Is it North America, Europe, Australia, and New Zealand?


I have been seeing people on the internet saying Western clothes, Western media, Westernizing, Western movies, Western nations. What does the term "Western" mean? Can anyone explain it to me like I am 5 years old?


10 Answers

Anonymous

Picture a map and draw an imaginary line down the middle from top to bottom. The region to the left of that line is the West and the region to the right is the East. Roughly speaking, the West is the Americas and Europe, the East is Asia, Africa and the Middle East sit in between, and Russia straddles both Europe and Asia.

"Western" can refer to the countries on the western side of that imaginary line and the cultures tied to them, and the same goes for the East. However, geography is only one factor; culture plays a bigger part. Calling the countries of West Africa "Western" wouldn't sound right, would it?

For example, although Australia and New Zealand are much closer to Asia and could be considered Eastern countries geographically, their culture leans toward the West because of European colonization, and even after that period they have remained closer to the Western world than to the East.

Basically, "Western" can mean many things: a country itself, or its culture, language, or influence.
