ELI5: What does the term "Western" mean? Is it North America, Europe, Australia, and New Zealand?


I have been seeing people on the internet saying Western clothes, Western media, Westernizing, Western movies, Western nations. What does the term "Western" mean? Can anyone explain it to me like I am 5 years old?



Answer:

Europe is "the West" because, before Europeans knew of the Americas, it was the westernmost part of the world they knew about (the "Old World"). North America, Australia, and New Zealand are part of "the West" because they were colonised by Europeans and therefore share a similar culture. Western countries are those whose culture and politics today are mostly or entirely derived from Europe.
