How did Germany stop being Nazis after the end of WWII? Did everyone just “snap out of it” after Hitler’s death?


In: Other

25 Answers

Anonymous 0 Comments

A lot of people didn’t stop. They just stopped talking about the past. Or they talked about their views only in private.

Also, a lot of powerful people simply kept their influence, and the institutions they worked for helped them by keeping quiet, so as not to draw negative public scrutiny to the institution's past (there were a lot of outspoken Nazis in academia, for example).

At some point it was declared that the matter had been dealt with and was a thing of the past, but in reality victims often did not get justice, and their neighbors kept their racist views and did not treat them any better than before.

At least that is what happened where I am from.
