I know I've posted in this thread before. But seeing the thread title again prompted a few inquisitive thoughts.
So, like, what even is "The West"?
This is a genuine and sincere question. What does the phrase "The West" mean, exactly? We would need a clear definition of "The West" in order to even talk about its "fall".
Because I can think of several different things that "The West" could mean.
The most obvious is "Western Civilization", and I know there are some easy definitions of what that is. A summary might be that it is the collective whole of Greco-Roman and post-Greco-Roman civilization that has chiefly been dominant in Europe (especially, or maybe specifically, Western and Central Europe, including the British Isles). There is also the possible side-definition of "Christian Civilization", though that actually creates far more questions than it answers (given the importance of Christianity, historically, outside of Europe). So maybe we'll stick to the Europe definition?
Perhaps a summary of "Western Civilization" is the set of civilizational structures which emerged out of the fall of the Western Roman Empire in the 5th century. This would also include the colonial projects undertaken in the 15th, 16th, and 17th centuries; as such, the United States would be included, as it is a colonial nation rooted in the colonial projects of several European powers, chiefly the British. Canada, Australia, and much of Latin America would then also be included.
That raises a further question. Is Latin America included in the definition of "The West"? Again, genuinely curious. Because if the United States is part of "The West" due to being part of the larger milieu of "Western Civilization" as a product of European colonial endeavors--then the same must also be true of Mexico, Brazil, Jamaica, Honduras, etc.
Would, then, a "fall of the West" be inclusive of Latin America?
But--okay--more questions.
While colonial projects sometimes resulted in independent ex-colonies, there are also cases where the result was not so much an independent colony as independence from colonization--India, Pakistan, and Bangladesh, for example, but also most African nations, and many Asian nations in both West Asia (the Near East) and South-East Asia, such as Malaysia. But would we include Iraq, Mozambique, and Indonesia in "the West"?
I mean, to varying degrees, certain historically "Western" ideas and values have been adopted across large parts of the world. We live, now in the 21st century, in a world of nation-states, and most of them--arguably--adhere to some form of Westernized democracy. This "Westernization" is seen not only in post-colonial nations but also in those which were never formally colonized; Japan, for example, has a Parliament. China is officially a single-party Communist nation, having adopted a Soviet-influenced form of Communism through the Maoist revolution. Is Japan part of "The West" because it has adopted a democratic form of government and generally subscribes to Liberal-Secular principles?
Wait, does "The West" refer to Liberal-Secular ideas such as the nation-state, individual liberty, democracy, etc.? Is that what "The West" is? Liberal-Secularism? Both Capitalism and Communism are, after all, products of Western political philosophy.
Okay, so maybe I'm just not clear enough on what "The West" is for speaking of "the fall of the West" to be fully meaningful.
Could someone help me out here?