How did Germany stop being Nazis after the end of WWII? Did everyone just “snap out of it” after Hitler’s death?

25 Answers

Anonymous

In addition to what’s been mentioned in other answers, Germany teaches about the Holocaust in its schools. They don’t try to pretend it didn’t happen, or that only a few people participated. The country accepts full responsibility and teaches why it was wrong.
