I’m trying to wrap my head around how the USA and Japan shifted from being fierce enemies during World War II to becoming close allies within just a few decades. From an American perspective, it seems like a huge turnaround in international relations. What happened, and why such a dramatic change?
Lots of good answers here, but here’s another piece of historical context for rebuilding Japan: policymakers already knew the consequences of punishing and then ignoring a country they had defeated and demilitarized, because that is exactly what happened to Germany after WW1, and it helped cause WW2 in the first place. There was zero desire to repeat that mistake with Japan.