How did Germany stop being Nazis after the end of WWII? Did everyone just “snap out of it” after Hitler’s death?

25 Answers

Anonymous 0 Comments

One really important factor was the Marshall Plan. After WWI, the Treaty of Versailles punished Germany with reparations and left it in an economic crisis that made Hitler's nationalism attractive. After WWII, the Americans did the opposite: they poured a lot of money into rebuilding Germany. That made it far more appealing to abandon the failed Nazi project and become part of a more cooperative European and world order.
