I’m trying to wrap my head around how the USA and Japan shifted from being fierce enemies during World War II to becoming close allies in just a few decades. It seems like a huge turnaround in international relations from an American perspective. What happened, and why the dramatic change?
Japan, like Germany, was hijacked by nationalist extremists who threw the people into a war they didn’t want. Unlike in Germany, however, the people had been disenfranchised well beforehand. Japan and the West quickly became allies because the US helped the people of Japan institute a real liberal democracy with universal suffrage, something they had been trying to achieve well before the war.
It’s also important to note that Japan had been an ally of the West BEFORE the war, during and after WWI.
It’s better to ask why Japan turned against the West between the wars (hint: white supremacy was mainstream in the West).