the difference between American liberalism and American leftism.

Anonymous

Picture a political spectrum with a center. To the left of that center is a range of political philosophies, from left-of-center to far left, getting more extreme the farther left you go; the same is true for the right. Liberalism is a philosophy that sits a bit left of center, and it’s the most common left-wing philosophy in the United States.

In general, when someone says “the left,” they’re referring to any of the philosophies left of center, though what they actually mean depends on context. In America, “the left” typically means either liberalism or progressivism.

When someone calls someone else a “leftist,” they usually mean a person they think is deep into the left, but it also depends on who is saying it: some use it as a self-identifier, others use it as a derogatory label. In my experience, people who call themselves leftists are typically progressives or socialists, whereas when the label comes from the opposing side (someone on the right), it usually means they’re placing that person on the far left, beyond socialism.
