ELI5: What does the term "Western" mean? Is it North America, Europe, Australia, and New Zealand?


I have been seeing people on the internet saying Western Clothes, Western Media, Westernizing, Western Movies, Western Nations. What does the term "Western" mean? Can anyone explain it to me like I am 5 years old?


10 Answers

Anonymous

It refers to civilisations seen as descendants (even if not directly) of the Greeks and Romans: cultures whose laws and traditions are rooted in those of ancient Greece and Rome.

It contrasts with the East, meaning Asia (so called because it lies east of Europe).

So in this sense, Europe, North America, Australia, and New Zealand are Western countries. Japan is also sometimes described as fairly Westernised, as it has adopted much of its modern law, institutions, and culture from the West.
