1914 – The West Starts Dying

Most competent historians trace the decline of the West to World War I, which began 100 years ago. The slaughter on the fields of Flanders, the subsequent influenza pandemic, the dissolution of centuries-old European monarchies and empires, and a global depression led to a kind of cultural post-traumatic stress after the war.

All the more remarkable, then, that the West had won the war outright. All the retrograde powers had been defeated. The victors were America, France, and Britain, the very core of the Western ideal. They stood uncontested and unbeatable.

But the victory was Pyrrhic.