THEORY OF THE END - Volume I: The Future is Canceled
The first volume of "THEORY OF THE END, Version 2"
A Note
This volume compiles improved and expanded versions of essays that were previously published on this Substack blog as “parts 1 through 9” of “THEORY OF THE END.” If you have already read those entries, I’m confident that you will find these new editions to be superior and that they cohere more effectively with subsequent parts (the updated versions of which will be uploaded to this blog as Volumes II and III at a later date).
Whether you are new or returning, thank you all very much for reading.
Prelude
“I am Asura incarnate,” the poet Kenji Miyazawa once wrote. “Spitting, gnashing, pacing back and forth.”
The Asura, sometimes called the “jealous gods,” were said by the ancient Buddhists to live at the foot of the towering Mt. Sumeru. While outwardly appearing as heavenly deities, they inhabit a space closer to the realm of beasts, constantly waging war amongst themselves and with other beings and engaging in petty conflict, never knowing a moment of true peace. Their grand palaces, potentially glorious shimmering monuments, are devastated by constant fighting, and the Asura prowl through them with crazed, bloodshot eyes, unable to relax in the jeweled luxury of their surroundings knowing that they could be subjected at any time to new unspeakable agonies.
Such is the fate of the Asura: to be rent with jagged blades, flayed, stabbed, scorched, beheaded, trampled by divine steeds, locked in cages and tortured, all in the beautiful light of “heaven’s sea of splendor.” And when the rains of spring bring with them thunder and lightning, the craven Asura tremble and cry out in terror, mistakenly believing that they've heard the ferocious war drums of the heavenly devas.
In a sense, they are the picture of squandered potential. If they gave up their contentious, animalistic ways and unified, they would certainly achieve glories rivaling those of the mighty gods at the peak of Mt. Sumeru. Yet they find themselves unable to do so, instead resigning themselves to their violent ends. And so this brutality continues unabated for hundreds, thousands, even millions of years.
"And all being is flaming sorrow,” wrote Franz Marc on the back of one of his paintings. The image itself depicts a ghastly scene of animals being torn asunder in a mysterious orgy of chaos. “The trees show their rings, the animals their veins.”
Perhaps it is not only Miyazawa who was “Asura incarnate.” But if we have collectively found ourselves in the dismal realm of the Asura, should we too accept our current condition and let it carry us into a future of unending sorrow? Or should we see the crystalline light of spring and “breathe the sky anew?”
“Ah, at the bottom of the brilliant April, gnashing, burning, going back and forth,
I am Asura incarnate.”
Chapter 1: The Final War
“Pack your bags.
We're taking off.
No turning back.
We are all gone.”
-Vein.fm, from “Funeral Sound”
For as long as humanity has existed, we have made predictions of the “end,” whether of an apocalyptic nature or of a hypothetical civilizational stagnation. Some, like the Aztecs, Buddhists, and Vedic priests, saw the universe as cyclical, with every end leading to a new beginning, while more materialistically minded individuals today predict that natural disasters will worsen until the earth experiences a massive calamity, ultimately resulting in complete human extinction.
Others, like the WWII-era Japanese general Ishiwara Kanji, had ideas of a very different nature. Ishiwara had spent much time studying military strategy and surveying the technological landscape of his time, and he combined this knowledge with the Kamakura-era Buddhist priest Nichiren’s prediction that, in the age of decline, Japan would face great conflict and Lotus Sutra Buddhism would spread across the world, culminating in the “unification of all people under heaven and within the four seas under the teachings of the Lotus Sutra” (一天四海皆帰妙法, Itten Shikai Kaiki Myoho).
The final result of his studies was what he called the “final war theory” (最終戦争論, saishu senso ron). In early 1940, with the Second World War already raging in Europe, he made the following chilling prediction in his lecture on “The Final War”:
The time when a decisive battle by air forces is initiated across the Pacific Ocean will be the time of the last great decisive battle of mankind. In other words, it will be an age when airplanes can fly around the world non-stop, and weapons like those currently used in the European war will be rendered obsolete. The new weapons will be more thoroughly destructive, with a great power that we cannot even imagine. One strike will kill tens of thousands of people instantly.
However, as we speak, planes fly around the world without landing and weapons are rapidly becoming more advanced. If a worldwide war breaks out today, by the time the next morning dawns, capitals and major cities will be completely destroyed. Osaka, Tokyo, Beijing, and Shanghai will be in ruins. Everything will be blown away... I believe that's the extent of the destructive power. If that happens, the war will end quickly. The final war will not be won by deliberating over spiritual mobilization and total war. Such tepidness is only a factor in the age of wars of attrition, and will not be an issue in a decisive war. In the next decisive war, we must see that this is coming and act without hesitation. The person who can create such decisive weapons and endure the inevitable devastation will be the ultimate winner.
However, although Ishiwara emphasized the potential destruction that would be wrought from such a battle, his prediction was not necessarily an apocalyptic one. Should one side of the “final war” survive, they would gain the ability to determine the future of global human civilization itself. He continues:
War will end in this next decisive war, as it will push us to the limit of the development of war. Human fighting spirit, however, will not disappear. What does it mean that war will end without the elimination of fighting spirit? National conflict will disappear -- that is, the world will become one as a result of this next decisive war. Some may think my explanation far-fetched, but I am convinced that it is theoretically sound. Once we reach the limit to the development of war, future wars will become impossible…
The final war will not last long. It will be over in a short amount of time. I think that the most important destiny of humanity will be decided; whether the Emperor [of Japan] should be the Emperor of the world or the President of the United States should control the world. In other words, it will decide whether the Eastern Way of Kings or the Western Way of Force should be the guiding principle for world unification.
In order to clarify his above comment on humanity’s “fighting spirit,” Ishiwara later provides us with the following explanation:
The fighting spirit of humanity probably won’t disappear for the next few decades, or as long as humanity exists. On the one hand, the fighting spirit is the driving force behind civilizational development. However, after the final war, the instinctive urge to wield that fighting spirit in armed conflict between nations will naturally dissipate and will be transformed into another type of competition, namely the competition to peacefully build a more advanced civilization.
Looking at all of this through a modern lens, while Ishiwara’s theory is incorrect in predicting the end of war itself, some of its elements are prophetic. For instance, he clearly foresaw the devastation that would be brought upon his country by American firebombings and, ultimately, the atomic bombs dropped on Hiroshima and Nagasaki. Technological innovation indeed introduced new horrors into this world, horrors that made the total annihilation of a country (if not the Earth itself) into a matter of merely flipping a switch.
But Ishiwara predicted something else, a detail which may go unnoticed by the casual reader: the hegemonic domination of America and the proliferation of what one may call “Liberal Democracy” across the globe. While “war” has certainly not ended as a fixture of human civilization, we have not seen any conflict rise to the scale of the two world wars since the defeat of the Axis countries. Moreover, on a surface level, it appears that the technically oriented and consumeristic “American” way of life has swallowed up every developed and developing country across the globe, even those which outwardly proclaim opposition to it, like China.
Given his failure to unify East Asia and defeat the Western powers as he had wished, were we doomed to some unholy fate in the iron jaws of American Materialism? Ishiwara, at least, denied that this was the case, and in his final days disavowed his prior theories in favor of promoting an attitude of pacifism and peaceful propagation in Japan’s relationships with international powers going forward. Per the article “Nichirenism, Utopianism, and Modernity: Rethinking Ishiwara Kanji’s East Asia League Movement” by G. Clinton Godart:
In a certain way… the final war was to be a religious war, resulting in the inevitable victory of Nichiren Buddhism, the desirable end result. But Ishiwara and many other members were quite quick to drop the whole idea after 1945 and embrace pacifism. Ishiwara wrote that he “admits” that his prediction of a final war between Asia and the West “was a profound self-conceit and in fact a mistake” (Ishiwara Kanji Heiwa Shisō Kenkyūkai 1994, 200).
However, in the decades that followed, theories would crop up in America that came to resemble Ishiwara Kanji’s in interesting and entirely unintentional ways. The figurehead of this line of thinking is Francis Fukuyama, a graduate of Cornell University and Harvard who became a regular face in various political think tanks, most notably the RAND Corporation. In 1992, Fukuyama published his most famous work, one that would come to define him for the rest of his career: “The End of History and the Last Man.”
In this book, Fukuyama outlines a theory heavily inspired by Hegel’s ideas of the “dialectic” and “the end of history,” proposing that human civilization was heading towards a maximally efficient end state of societal configuration, one that would satisfy not only basic human material desires, but also man’s sense of “dignity” and “self-esteem.” He dubbed the latter of these factors “thymos,” borrowing the term from Plato’s Republic. “Thymos emerges in the Republic as being somehow related to the value one sets on oneself, what we today might call ‘self esteem,’” Fukuyama states in the book. He further elaborates as follows:
Thymos is something like an innate human sense of justice: people believe that they have a certain worth, and when other people act as though they are worth less — when they do not recognize their worth at its correct value — then they become angry.
We will cover this concept in more detail later down the road, but it is worth noting here, as it could be seen as almost analogous to Ishiwara’s notion of “fighting spirit,” making the crossover between the two perspectives more than just superficial. However, in stark contrast to General Ishiwara’s views, Fukuyama believed the most satisfying dialectical end-point to be the ostensibly “Liberal Democratic” model exemplified by America and later spread globally in the aftermath of the Second World War and the collapse of the Soviet Union. Fukuyama’s central thesis is, in a sense, a validation of Ishiwara Kanji’s theory of an ultimate global order, only from a perspective which views the materially focused American “Way of Force” as the far more desirable victor.
This perspective is also shared by many modern-day residents of Liberal Democracies, albeit in a far less sophisticated form. The most prevalent popular rendition has evolved into what I have termed the “march of progress” myth, which repositions the entirety of human history as a gradual advancement towards a perfected form of Liberal Democratic Progressivism (one which, we can always assume, conveniently conforms to present-day Progressive sensibilities). Some more extreme actors even cling to this narrative as a sort of pseudo-spiritual promise, wielding it in service of whatever the Progressive cause of the week happens to be.
I want to make it clear, however, that this “march of progress” myth is more often a justification for beliefs and behaviors than a significant motivating factor in the adoption of philosophical and political positions, and that it diverges from Fukuyama’s narrative in significant ways. As we will learn, most positions on such an “end of history” can be attributed not to influence from Fukuyama himself (most self-professed adherents have likely not even read his book on the matter), but to circumstances of our modern age, some of which may even, at first glance, appear to be at odds with the “end of history” narrative.
In fact, it could be said that most individuals in our technologically advanced modern society have split themselves off from the continuum of time, viewing themselves not as participants in a long civilizational timeline, but rather as atomized “individuals” who have transcended the outdated collective mindset. Yet even while the static “self” reigns supreme, it is still the “march of progress” that is wielded as a cudgel against anyone who may oppose the flow of modernity. While the future is seen as assured, it is also in practice rendered only abstract and hyperreal.
Meanwhile, both Fukuyama’s narrative and the various popular iterations of it are currently fighting against a growing pessimism in the developed world thanks to a wide variety of social, economic, and political developments, causing what can be considered a “cancelation of the future” in the eyes of the public. Our current state of affairs could therefore be conceived as a battle between the “end of history” and history’s deterioration, with adherents of the former view trying to bail out the ship while it rapidly takes on water and those of the latter sinking into an inward-facing aimlessness; an altogether different kind of war from that which men like Ishiwara Kanji studied, and one in which the outcome is seemingly predetermined. Man is reduced to a spectator in his own sham trial, resigned to his inevitable verdict. The only difference is whether this fate leads to prosperity or mournful decay.
To represent opposition to Fukuyama, I will cite the Leftist thinkers Franco “Bifo” Berardi and Mark Fisher who, in their books “After the Future” and “Capitalist Realism” respectively, cast humanity’s technological and corporate-guided trajectory in a far more pessimistic light. In their view, Capitalism and the Liberal Democracies in which it flourished have mutated into something altogether less “liberal” and more inherently inhuman: a system which they both refer to as “neoliberalism.” While in some ways more dysfunctional than its predecessors, neoliberalism is also described as being more entrenched in the public’s collective consciousness, deranged yet inescapable; less an “end of history” and more a one-way train ride into a cold, dark wasteland. While I hold an appreciation for the ideas and observations of these two writers, and my own views overlap with theirs to a certain degree, you will find by the end of this book that both my diagnosis and my prescriptions diverge from theirs in significant ways.
On all sides appears an attitude that the future is being eroded; that time itself is breaking down like an old rust-riddled jalopy. Yet in this haze another apparent contradiction emerges. All over the developed world, people rush to and fro with far more urgency than in previous eras, pressuring both themselves and others to cram as much “productivity” into every hour as possible. The streets teem with cars every day, packed like the tunnels of an ant colony, horns honking and tires squealing with dreadful panic. Technological progress has also accelerated to a blinding speed, altering the way humans operate with every new and terrifying leap, and as a result the digital economy has thoroughly commodified time; companies do whatever they can to keep eyes on screens. Never before has time held so much value, yet mattered so little.
But let’s back up and examine how we got here. In order to more fully understand our current predicament, I believe we should go back to a more optimistic time for Liberal Democracy in the early 90s and retrace our steps. Why exactly did the narrative put forward by Fukuyama fail to explain the societal developments which followed its publication? What caused such disastrous deviations from our promised “end of history?” I believe that answering this question will reveal some uncomfortable, but sorely needed truths about not only the spirit of modernity, but of man and his place in the universe. This is one reason I have landed on “The End of History and the Last Man” as the first representative of the modern Western philosophical zeitgeist in this work.
Another reason is that, unlike the related belief systems of our current age outlined earlier, Fukuyama’s book has a fair amount of scholarly and intellectual rigor put into it, or at least enough to warrant addressing it at length. To modern readers with the advantage of hindsight, it can at times come off as almost a pre-emptive victory march for proponents of American Liberal Democracy, a point I have heard from present-day critics of the book. But I will not regard it in this way, instead taking Fukuyama at his word and assessing his analytical framework as an honestly held worldview.
In return, I ask that my readers understand that this work is also not meant to be a celebratory dance over the corpse of a failed sociopolitical theory. What’s important here is not just pointing out deviations from his theory, but diagnosing the reasons for those deviations in service of truly understanding the modern condition. After all, if the “end of history” is here (or at least imminent), why exactly do so many people now feel mistrustful not only of this supposed final system of governance, but of the future as well? Why does everything seem so grim? I believe I have found some answers to these questions, but it will take the rest of this work to explain them thoroughly.
For this book is, at its core, an exploration of something much deeper than Fukuyama’s thesis. It concerns itself with what I refer to as the “subjugation of time,” of which Fukuyama’s view of history is merely one manifestation. It is one of the defining features of our age, and at the intersection of several topics, all of which are integral to formulating a proper critique of modernity that extends beyond the artificial bounds of the material. Over the course of this work, we will gradually build up to a full understanding of this concept, taking some twists and turns as we go, and hopefully by the end my purpose in writing these many pages will have become clear.
But the first thread we will pull to unravel this sweater is, as I have indicated above, Fukuyama’s “The End of History and The Last Man.” Thank you for reading up to this point. I sincerely hope you will join me on the rest of this journey.
Chapter 2: The End of History?
“Now it’s 1999. Stuck until the end. Suffer till the end. Masochistic trend. Carson MTV. Bizkit NYE. Repeat and then repeat. You can never really leave. Your friends aren’t real. They’re the company you keep. Talking shit in a room long after you leave.
Is this all there is? Is this all there is? Fuck fuck fuck…”
-Foxing, from “Hell 99”
As stated earlier, in order to more fully understand the narrative presented in the book “The End of History and The Last Man,” we must first examine the global circumstances under which it was written.
To Fukuyama’s credit, the idea that the system of governance he calls “Liberal Democracy” would spread throughout the globe was probably a far more obvious conclusion at the tail end of the 80s than it might seem today. Fascism as a formal doctrine had effectively been annihilated in the developed world after the defeat of the Axis powers in World War II, and in 1987 Mikhail Gorbachev proclaimed his initiative to democratize the ailing Soviet Union under the slogan of “Demokratizatsiya” while its constituent republics stood on the verge of independence. I doubt it’s an exaggeration to say that “The End of History and The Last Man” was written in the midst of the Soviet Union’s collapse.
Thus, with both of Liberal Democracy’s biggest geopolitical rivals, Communism and Fascism, out of the picture, the Liberal Democratization of the globe may have seemed all but certain. However, one must keep in mind that Fascism and Communism both fell due to their inability to prove their legitimacy in light of their espoused beliefs and values, a point Fukuyama argues with much vigor in his writing: “All regimes capable of effective action must be based on some principle of legitimacy.” Of Fascism, Fukuyama explains:
Fascism was not around long enough to suffer an internal crisis of legitimacy, but was defeated by force of arms. Hitler and his remaining followers went to their deaths in their Berlin bunker believing to the last in the rightness of the Nazi cause and in Hitler's legitimate authority. The appeal of fascism was undermined in most people's eyes retrospectively, as a consequence of that defeat… Fascism suffered, one might say, from an internal contradiction: its very emphasis on militarism and war led it inevitably into a self-destructive conflict with the international system.
In other words, when Fascism promised military might and the supremacy of the peoples it championed, only to suffer devastating military defeat at the hands of supposedly inferior populations, its own raison d’être effectively became the cause of its ideological unraveling and future irrelevance.
Communism, on the other hand, promised to create a new man through the machinations of a revolution and subsequent Communist state in accordance with Marx’s thought that man’s very consciousness “changes with every change in the conditions of his material existence, in his social relations and in his social life.” To accomplish this, the USSR implemented a new governmental philosophy “backed by efficient police power, mass political parties, and radical ideologies that sought to control all aspects of human life.” This philosophy is now known as “Totalitarianism.” Fukuyama describes it thus:
The totalitarian state hoped to remake Soviet man himself by changing the very structure of his beliefs and values through control of the press, education, and propaganda. This extended down to a human being's most personal and intimate relations, those of the family. The young Pavel Morozov, who denounced his parents to Stalin's police, was for many years held up by the regime as a model Soviet child. In Mikhail Heller's words, “The human relations that make up the society's fabric — the family, religion, historical memory, language — become targets, as society is systematically and methodically atomized, and the individual's close relationships are supplanted by others chosen for him, and approved by the state.”
This, however, led directly to the most obvious failing of the USSR and really all Communist regimes: the overestimation of man’s malleability and the resulting inability to fit the round peg of man into the square hole of Marxist thought. “Soviet citizens, as it turned out, had all along retained an ability to think for themselves,” Fukuyama says. “Many understood, despite years of government propaganda, that their government was lying to them. People remained enormously angry at the personal sufferings they had endured under Stalinism.”
The other major failing, however, was economic in nature. “It was much more difficult to tolerate economic failure in the Soviet system because the regime itself had explicitly based its claims to legitimacy on its ability to deliver its people a high material standard of living,” explains Fukuyama.
Indeed, this shortcoming came into stark focus during Boris Yeltsin’s 1989 trip to a grocery store in Clear Lake, Texas, during which he reportedly shook his head in amazement as he wandered through the densely food-packed aisles. “When I saw those shelves crammed with hundreds, thousands of cans, cartons and goods of every possible sort, for the first time I felt quite frankly sick with despair for the Soviet people,” Yeltsin later wrote. “That such a potentially super-rich country as ours has been brought to a state of such poverty! It is terrible to think of it.”
It’s difficult to argue against the fact that the Liberal environment of America gave it an edge in terms of economic growth. Much of this can be attributed to the decentralized nature of a Liberal economy, which can respond to the increasing complexity of the economic and technological environment far more efficiently than a centralized bureaucracy can. As Fukuyama states:
Bureaucrats sitting in Moscow or Beijing might have had a chance of setting a semblance of efficient prices when they had to supervise economies producing commodities numbering in the hundreds or low thousands; the task becomes impossible in an age when a single airplane can consist of hundreds of thousands of separate parts.
The importance of the technological aspect of this cannot be overstated, but we will dwell more on the topic later.
It should also be noted, however, that America’s abundance of food was not a simple matter of economic liberalization. Contrary to this narrative, the government played a large role in America’s victory in the Cold War-era “farms race.” “If U.S. agriculture policy was aggressive in earlier decades, then in the Cold War era, it was pretty much on steroids,” claimed Freakonomics Radio in a piece entitled “How the Supermarket Helped America Win the Cold War.” It continues: “And this wasn’t just about feeding a growing U.S. population. The policy had a political thrust, meant to show the Soviet Union — and the rest of the world — just how mighty the United States was.”
Early government policies ranged anywhere from price supports to insurance for farmers, and when demand dropped after the first world war, much of the unsold grain was used in the development of what we now call “factory farming,” or it was purchased by the federal government itself. Going forward, the United States put a great deal of effort into technologically enhancing farms and food distribution. The military was one of the channels through which this happened, as it justified and facilitated experimentation in service of preserving food for as long as possible. Many of these experiments have fundamentally altered the diets of US citizens. A 2015 NPR article entitled “Cheetos, Canned Foods, Deli Meat: How The U.S. Army Shapes Our Diet” states:
Many of the foods that we chow down on every day were invented not for us, but for soldiers.
Energy bars, canned goods, deli meats — all have military origins. Same goes for ready-to-eat guacamole and goldfish crackers… Many of the packaged, processed foods we find in today's supermarkets started out as science experiments in an Army laboratory. The foodstuffs themselves, or the processes that went into making them, were originally intended to serve as combat rations for soldiers out in the battlefield.
Indeed, military needs have driven food-preservation experiments for centuries.
It can be said, and has been said by the Greek economist and writer Yanis Varoufakis, that the American government and the corporation-led American economy have been inseparably joined at the hip since at least the end of the Second World War. Varoufakis writes in his book “Technofeudalism: What Killed Capitalism”:
By the war’s end, American capitalism was unrecognisable. Business and government had become profoundly entwined. Indeed, the revolving doors between government departments and corporations saw to it that the same crowd of mathematicians, scientists, analysts and professional managers populated them both. The heroic entrepreneur at the helm of the corporation and the democratically elected politician at the head of the government had both been usurped by this new private-public decision-making network, whose values and priorities – indeed its survival – boiled down to one thing: the survival and growth of the conglomerates now that the war, with its infinite demand for stuff and technologies, was over. Galbraith called this nexus the technostructure.
Just as the surplus of food after World War I impacted the market for generations, other facets of the economy likewise found themselves with a profoundly expanded production capacity, one which during the world wars had produced “bullets, machine guns and flame-throwers,” and afterwards “chocolate bars, cars and washing machines.” The reasons for this lie in the necessity of maximized efficiency in competition between technological societies, and thus of more thorough top-down planning to optimize technological implementation.
It’s worth noting that, while this means Capitalist economies like the US have grown to more closely resemble planned economies like the USSR’s, it does not mean that they arranged this state of affairs out of a deeply held love for Communist theory. But we will save this topic for another chapter…
Regardless of how large or small a part economic liberalization and/or Democracy played in America’s victory in the realm of the material, it’s undeniable that such a victory did indeed happen, and Liberal Democracy in the style of the United States was given the credit. Thus many other countries in places like Southern and Eastern Europe, East Asia, and Latin America, sensing a shift in the winds of the global zeitgeist, followed America’s example in the pursuit of Liberal Democracy.
This could be partially attributed to the application of game theory: a country whose material development has fallen behind inevitably opens itself up to conquest by more modernized powers, thus becoming modernized by force instead of voluntarily and losing its agency in the process (with “modernization,” in this case, entailing the adoption of a Liberal Democratic system of governance). On this point, Fukuyama invokes the image of Commodore Perry’s comparatively advanced naval guns “persuading the daimyos in Japan that they had no choice but to open their country up and accept the challenge of foreign competition…” He continues:
The persistence of war and military competition among nations is thus, paradoxically, a great unifier of nations. Even as war leads to their destruction, it forces states to accept modern technological civilization and the social structures that support it. Modern natural science forces itself on man, whether he cares for it or not: most nations do not have the option of rejecting the technological rationalism of modernity if they want to preserve their national autonomy.
General Ishiwara Kanji would have agreed. In his view, the technological progression of his age was leading to an equalization which would render physical confrontation unfruitful and thus undesirable. He states in his lecture on the “Final War”:
The arrival of guns in Tanegashima was the reason why unification of Japan was possible. No matter how great Nobunaga and Hideyoshi were, it would not have happened if they had only spears and bows. Nobunaga understood the times clearly, preached the cause of respecting the Emperor, and made clear the central point of unifying Japan, but he also purchased a large number of guns, effectively laying the foundation for unification.
None of this is bemoaned by Fukuyama as an unfortunate set of circumstances, of course. Rather, in his view, it was all leading humanity to center on a system of societal organization that was better than everything which came before:
… Despite the powerful reasons for pessimism given us by our experience in the first half of this century, events in its second half have been pointing in a very different and unexpected direction. As we reach the 1990s, the world as a whole has not revealed new evils, but has gotten better in certain distinct ways.
But, as he also states, “there is no such thing as a dictator who rules purely ‘by force,’” thus the Liberal Democratic system must supply its constituents with a coherent guiding philosophy as well as tangible results from that philosophy’s implementation in order to underpin its perceived legitimacy. In other words, it must justify its own existence and deliver on its promises, something that its early 20th century rivals, Fascism and Communism, utterly failed to achieve. But in that case, what exactly are the guiding principles through which Liberal Democracy can be defined? Francis Fukuyama supplies us with a few parameters…
Liberalism is defined by Fukuyama as a system of governance that recognizes certain individual freedoms from government control. To aid in this, he sources exposition from James Bryce:
While there can be a wide variety of definitions of fundamental rights, we will use the one contained in Lord Bryce's classic work on democracy, which limits them to three: civil rights, “the exemption from control of the citizen in respect of his person and property”; religious rights, “exemption from control in the expression of religious opinions and the practice of worship”; and what he calls political rights, “exemption from control in matters which do not so plainly affect the welfare of the whole community as to render control necessary,” including the fundamental right of press freedom.
Some astute right-wing readers may already notice some potential practical conundrums rearing their heads in this definition (for instance, who exactly determines what affects the welfare of the community to a degree which justifies government intervention?), but we shall set those aside until the next chapter. As for the “Democratic” aspect of Liberal Democracy, Fukuyama defines that simply as “the right held universally by all citizens to have a share of political power, that is, the right of all citizens to vote and participate in politics.”
Using these definitions, I will lay out a series of “problems” with Fukuyama’s narrative by examining the principles of Liberal Democracy itself, as well as events that have transpired since the publication of “The End of History” which I believe lay bare both the contradictions and inevitable degenerations of this supposed “old age of humanity.” Through this analysis, we can attempt to determine whether Liberal Democracy is truly an end point in civilizational development, or whether it is instead a mere transitional phase.
According to Fukuyama, Americans by the end of the 80s had trouble “imagining a world that is radically better than [their] own, or a future that is not essentially democratic and capitalist.” Let us survey the modern global environment to see if this proclamation still holds true today. We will start with what I consider to be the most major turning point for Liberal Democracy in recent history…
Chapter 3: That Thing No One Wants to Talk About
“Do they even cure you?
Or is it just to humor us before we die?”
-Alexisonfire, from “Accidents”
Although many will deny it, the Coronavirus pandemic of 2020 was a breaking point for the legitimacy of Liberal Democracy in the minds of the public. As much as people try to leave the events of that accursed year in the past, it’s difficult to deny that something had shifted in the way Americans and residents of other Liberal Democracies viewed their governments.
But let’s start at the beginning. While the virus that would eventually become known as “COVID-19” made its way across the globe from its point of origin in Wuhan, China, several United States lawmakers attempted to dissuade the people from panicking, dismissing concerns about potentially infected Chinese visitors as mere bigotry. “... We want people to come to Chinatown,” said California representative Nancy Pelosi in February of that year. “Don't be afraid. Enjoy it all. It's beautiful and there are some good bargains here now, so it's a good time to come.”
The tone, of course, would change when the government declared a national state of emergency the following month. The public was thrown into a total panic, with many questioning what kind of precautions to take: should we wear facemasks? If so, what kind of mask? Should we go outside at all? If we do go outside, should we take off all of our clothes afterwards and set them on fire? What should we do if we get infected?
Despite the clear need for answers, the government found itself at a loss, going back and forth on the issue of facemasks (and it’s still in question exactly how effective the government’s eventual facemask policies were in preventing viral transmission) and imposing seemingly arbitrary “social distancing” guidelines which instructed members of society to maintain a distance of at least six feet between each other. Most devastating of all were the state-wide shutdowns, which forced businesses to close and workers to stay home… Unless, however, you were deemed an “essential worker.” I was one of these individuals and was thus obligated to work my way through the pandemic as if nothing whatsoever had changed.
All of this did irreparable damage to public trust in the so-called “experts” who staffed our government bureaucracies. It seemed that no consensus could be achieved regarding the rules, and the rules which did make their way into public policy were often nonsensical. If the virus really was as terrifyingly deadly as was originally claimed (and it seems to be a nearly unanimous conclusion at this point that it was not), did this mean the lives of the “essential workers” mattered less? Why did they have to sacrifice their bodies while everyone else was able to hide away at home? Moreover, did it even make sense to force people to stay home while they were still leaving for trips to the grocery store? Surely there would be people who were unknowingly infected wandering the breakfast cereal aisles.
But beyond the nonsensical nature of these measures, the government’s sudden turn toward authoritarianism registered with many as a disturbing development. Why, in the so-called “land of the free,” had the government suddenly become powerful enough to force people indoors? Skateparks were infamously filled with sand, and business owners who refused to shut down their operations were arrested. Out of other ostensibly Liberal Democratic countries came videos of people being apprehended by police simply for walking around alone outside, something previously thought possible only under the most totalitarian regimes.
On top of all of this came government surveillance and censorship: police and drones were used to monitor everyone’s movements, and pressure was applied to social media companies to squash discussion that might be disadvantageous to the government’s project. Speculation on the origin of the virus was highly discouraged. Even though there was a viral research lab (which notably had been receiving grants from the United States federal government) very close to the first recorded area of infection in Wuhan, the “experts” all seemed unanimous in decrying any accusation of foul play or mismanagement on the part of the scientists as “harmful conspiracy theories.”
The dogpile on these theories began almost instantly, with The Lancet publishing a letter signed by 27 different scientists denouncing them in no uncertain terms. “The rapid, open, and transparent sharing of data on this outbreak is now being threatened by rumours and misinformation around its origins,” the letter stated. “We stand together to strongly condemn conspiracy theories suggesting that COVID-19 does not have a natural origin.”
The Biden White House, which took over the next year, was notoriously keen on maintaining social media censorship around the Coronavirus, including of what eventually became known as the “lab leak theory,” with a May 2024 House of Representatives report recounting the efforts as follows:
In Facebook’s February 8, 2021, public statement announcing a change to its content moderation policies, the company noted that it would “remove” several new claims on its platforms, including claims that “COVID-19 is man-made.” That same day, Facebook emailed the Biden White House to alert it that Facebook would be “expanding [its] efforts to remove false claims on Facebook and Instagram about COVID-19…”
In July 2021, when Facebook executive Nick Clegg asked a Facebook employee why the company censored the man-made theory of the SARS-CoV-2 virus, the employee responded: “Because we were under pressure from the [Biden] administration and others to do more… We shouldn’t have done it.”
The US federal government, by the end of 2024, was finally ready to admit that “a lab-related incident involving gain-of-function research is the most likely origin of COVID-19,” and that “current government mechanisms for overseeing this dangerous gain-of-function research are incomplete, severely convoluted, and lack global applicability.” However, neither of these statements came as a surprise to anyone who had been paying attention, and as a result the announcement was greeted with little fanfare.
Keep in mind that no one voted for any of the authoritarian measures outlined above, meaning they were neither Liberal nor Democratic in any sense of those words. Considering the facts at hand, how do these events square with Francis Fukuyama’s assessment of Liberal Democracy? Is this something he accounted for? Indeed, Fukuyama notes the concern for health over other moral obligations as a distinctive character of the Liberal Democratic man, stating:
It becomes particularly difficult for people in democratic societies to take questions with real moral content seriously in public life. Morality involves a distinction between better and worse, good and bad, which seems to violate the democratic principle of tolerance. It is for this reason that the last man becomes concerned above all for his own personal health and safety, because it is uncontroversial. In America today, we feel entitled to criticize another person's smoking habits, but not his or her religious beliefs or moral behavior.
If the story of 2020 ended here, then we could potentially chalk everything up to a temporary lapse in sanity and liberty due to modern man’s over-preoccupation with his own health and safety. I personally would not come to such a conclusion, but some might. However, the story becomes far more complicated from here on out.
As with the financial crisis of 2008, one would have expected the central governments to dole out funds in order to keep their ailing economies afloat. And this is exactly what the governments did, mercifully keeping at least a significant portion of small businesses alive (though far from all of them, of course). Yet when it came time for investors to put their money somewhere, it became clear who the true winners of the pandemic were: the tech companies. In his book “Technofeudalism: What Killed Capitalism,” Yanis Varoufakis dates the beginning of the true reign of what he calls “cloud capital,” i.e. the domination of the marketplace by tech firms through the use of monopolistic digital platforms, to the 2020 COVID-19 pandemic. He writes:
While the US economy shed 30 million jobs in a single month, Amazon bucked the trend, appearing to a swathe of Americans as a hybrid of the Red Cross, delivering essential parcels to confined citizens, and Roosevelt’s New Deal, hiring 100,000 extra staff and paying them a couple of extra dollars an hour to boot. True, Big Tech did invest the central bank cash, and it did create new jobs – but the jobs it created were those of cloud proles and the investment was in building up its cloud capital.
Even cloudalist companies that had a bad pandemic, like Uber and Airbnb whose customers were unable to use their services, took the central bank money and invested in more cloud capital as if there were no pandemic. It was the pandemic, with the flood of state money it unleashed, that ushered in the Age of Cloud Capital.
This makes sense, as tech companies, more so than any other industry, were able to weather the pandemic restrictions due to their more flexible nature. Under such restrictions, it was difficult to do business when you had to interact with people face-to-face, but this hindrance didn’t exist when your primary mode of interface was a computer screen. Employees of these firms, too, found it easier to work from home, while those with more conventional jobs at the time (like yours truly) were stuck trudging to the workplace, masks in hand.
This disparity could not have led to anything but a shift in the market, although I must stress that whether or not that shift was inevitable and simply ushered in at an early date by the pandemic is still up for debate. I am of the opinion that this is indeed the case, but more on this topic later…
As if to follow up the proverbial wrench in the gears with a grenade, the summer of 2020 saw a racial fervor over the alleged police killing of a black man named George Floyd in Minneapolis, which in turn sparked both marches and riots in the streets of all major American cities. All at once, the narrow consensus that the government’s supposed “experts” had reached was turned on its head, and lawmakers and unelected bureaucrats alike kneeled in submission to this new righteous cause. But what of the Coronavirus restrictions? Surely hundreds, or occasionally even thousands, of people marching closely in solidarity through the streets had to violate at least a few pandemic precautions.
Keep in mind that protests were not a new phenomenon that year. There was quite a lot of public resistance to the lockdowns, leading to numerous protests across the country (although, in stark contrast to the George Floyd protests, these resulted in no riots that I’m aware of). However, the same government actors and mainstream journalists who would go on to show fervent support for the racial protests (and even the related riots in some cases) either disavowed the comparatively peaceful lockdown protestors or had only begrudgingly permitted them to go about their business.
Needless to say, this was no longer about “health and safety.” In fact, it’s been reported that well over a dozen people perished in the chaos of the George Floyd riots, and countless more were injured. Several miles of American cityscape were also destroyed by widespread arson and vandalism. Instead, media outlets described the protests in a much more “moral” register, promoting an ideology known as “Antiracism” and decrying a force of civilizational evil called “Whiteness.”
It is necessary here to note that “Antiracism” is not the “colorblind” absence of racism conventionally ascribed to Liberal Democracies; rather, it is essentially the opposite: a philosophy of supposedly benevolent discrimination in favor of those who are labeled members of “marginalized” demographics. Ibram X. Kendi’s book “How to Be an Antiracist,” which saw a massive surge in popularity at the time, describes it like this:
The only remedy to negative racist discrimination that produces inequity is positive antiracist discrimination that produces equity. The only remedy to past negative racist discrimination that has produced inequity is present positive antiracist discrimination that produces equity. The only remedy to present negative racist discrimination toward inequity is future positive antiracist discrimination toward equity.
This became a dominant moral lens not only in school and business, but in government as well, doing away with old notions of “equality of opportunity” and replacing them with a more explicitly Leftist understanding of “equality.” Under the previous mindset, one may look at the riots as a hideous orgy of violence that must be stopped at all costs. Under Antiracism, however, it was an expression of righteous fury against an inherently “racist” and “oppressive” system which must be razed to the ground.
But, in that case, what of the pandemic restrictions? Would they be counted under the “oppressive” aspects of the system and accordingly done away with? Well, no. The restrictions stayed, at least on paper. In practice, on the other hand, it was clear that there were unspoken carve-outs in favor of the Leftist protests and riots which were not extended to other situations. The establishment had no logical justification for this, and they did not exert themselves to provide one. “I certainly condemned the anti-lockdown protests at the time, and I’m not condemning the protests now, and I struggle with that,” Catherine Troisi, an infectious-disease epidemiologist at the University of Texas Health Science Center said in a July 2020 New York Times article. “I have a hard time articulating why that is OK.”
Mark Lurie, a professor of epidemiology at Brown University, also weighed in. “Instinctively, many of us in public health feel a strong desire to act against accumulated generations of racial injustice,” he said. “But we have to be honest: A few weeks before, we were criticizing protesters for arguing to open up the economy and saying that was dangerous behavior. I am still grappling with that.”
Even more disturbing was the apparent support that massive corporations showed for the protests and riots. News organizations brought on Marxists to condemn the system in the most fiery of terms and even advocate for theft and vandalism. In the midst of the chaos, National Public Radio infamously gave a little-known Leftist author a puff-piece interview regarding her book defending the practice of looting. “Now, as protests and riots continue to grip cities,” the article reads, “she stakes out a provocative position: that looting is a powerful tool to bring about real, lasting change in society.”
Even fast food restaurants joined in on the frenzy, with a notable example being a McDonald’s interview with a black trans activist, posted to (what was then called) Twitter in June of 2020 with the caption “Black trans women have a very simple message: stop killing us.” When combined with the approval of countless politicians, all of this seemed to indicate that the public-private partnership of the American technostructure was sponsoring a coup against itself: leading a faux rebellion to blow off steam and redirect any anger over the events of 2020 into a spectacular hamster wheel of consumeristic pseudo-activism. And no one benefited more than the tech giants through which all of this was facilitated.
However, the narrative shifted dramatically once again by the end of the year as the COVID vaccines became available. These vaccines, despite having been tested with relatively little rigor, were heavily pushed by the press, government, and corporations. People were threatened with job loss or abandonment by their families for rejecting the shots, international travel was restricted for the unvaccinated, and the government once again applied pressure to social media companies to silence anyone who criticized the new medical treatment or questioned its effectiveness. Gradually a new wave of sadism against those who resisted spread through much of mainstream culture, with an infamous front-page article from Canada’s Toronto Star reading:
If an unvaccinated person catches it from someone who is vaccinated, boohoo, too bad. I have no empathy left for the wilfully unvaccinated. Let them die. I honestly don’t care if they die from COVID. Not even a little bit. Unvaccinated patients do not deserve ICU beds.
It’s not an exaggeration to say that explaining all of the reasons for this confounding series of events would take, at the very least, several volumes dedicated strictly to the task, thus it is outside the scope of this series to do so. Like most supposed grand conspiracies, the culprit is a rat king with many heads, all with different motivations. To point the finger at any singular philosophy as the sole driving force would be unproductive at best and foolish at worst.
Rather, I wanted to provide at least a brief outline in order to introduce my first “problem” with Francis Fukuyama’s theory of Liberal Democracy as the “end of history,” namely the contradiction between Democracy and the need for expertise, and to illustrate the tension this contradiction can cause in a supposedly advanced society like America’s. I believe this contradiction to be the very root of our heightened post-2020 political turbulence, and the primary driver behind the recent rise in skepticism toward Liberal Democracy, even if most people simply do not want to talk about it (in the same way they would rather leave 2020 in the past).
Chapter 4: The Problems
“No herbs, no golden rule, no muscle control, no sticking our noses in other people's troubles to forget our own; no hobbies, Taoism, push-ups or contemplation of a lotus. The gadget is, I think, what a lot of people vaguely foresaw as the crowning achievement of civilization: an electronic something-or-other, cheap, easily mass-produced, that can, at the flick of a switch, provide tranquility.”
-From “The Euphio Question” by Kurt Vonnegut
In this chapter, I want to enumerate the various “problems” I have with Francis Fukuyama’s theory of Liberal Democracy as the “end of history.” Some of these issues have been mentioned by Fukuyama himself in his book on the topic, while others have not.
I. The “Expert” Problem
According to the US Bureau of Labor Statistics, the United States federal government employs nearly 3 million individuals in total. A staggering figure, to be sure. For reference, that's over twice the population of the European nation of Estonia, making the federal government’s payroll essentially a nation in itself. Also note that this figure does not include the employees of the countless state and city government offices.
This population is spread out over 15 departments: the Department of Agriculture, Department of Commerce, Department of Defense, Department of Education, Department of Energy, Department of Health and Human Services, Department of Homeland Security, Department of Housing and Urban Development, Department of the Interior, Department of Justice, Department of Labor, Department of State, Department of Transportation, Department of the Treasury, and Department of Veterans Affairs. Each of these, in turn, has several sub-divisions, including the branches of the US military in the case of the Department of Defense.
No one person could possibly comprehend the full scope of this massive organism we call the “federal government,” let alone every single field it has pried its far-reaching tentacles into. How, then, can we expect members of the general public to reliably make decisions regarding its operations? Moreover, fields like economics, statistics, accounting, technology, war, and health have all become so bewilderingly complex, and are changing so rapidly, that it would be unreasonable to expect a man to know everything required to govern even one of them. Expecting him to know all the fields at once is the height of absurdity.
To state my point in plain terms: efficient governance by the masses is impossible. However, considering that the state does indeed govern despite all of this, it stands to reason that it does so through non-democratic means. Enter: “the expert.” While elected officials do, in some cases, appoint civil servants to manage the leviathan, labyrinthine web of bureaucracy we are currently subjected to, their choice is still limited to a relatively small group of people with sufficient (though certainly not “complete”) knowledge of their respective fields.
I am far from the first to notice this seeming contradiction in the democratic system. Professor Kishio Satomi, in his book “Discovery of Japanese Idealism,” wrote about the disparity between the election of lawmakers and appointment of bureaucratic administrators back in 1924:
As a matter of course, democratic political thought expects special knowledge of politics in the people who are in charge of administration, and this thought gives rise to an apology that bureaucratism does not always indicate that the State is not democratic.
But democracy limits the competency of the House of Peers as far as possible. This gives more power to the representatives of the nation and to the nation in general as far as legislation is concerned. So it is obvious that it recognizes the differences of administration, and ignores these differences in regard to legislation.
Of course, this still assumes that legislation is indeed of a democratic character, which is very much in question. Like all of the other aspects of modern governance, legislation has become a bloated and tangled affair, with the federal government now passing, with alarming regularity, pork-filled bills with pages numbering in the thousands. Are our lawmakers actually sifting through these gargantuan tomes on a regular basis, scrutinizing them with the discerning eye of a wizened sage? I would expect not.
Rather, legislation has become a kind of mystical black box, impenetrable to the vast majority of the population and confounding even to those familiar with its inner workings. To map out all of its mechanics, to find out where every taxpayer dollar has gone, would likely take years of intense study. And once this theoretical map was finished, the mapmaker would survey the field and find it entirely different from when he started.
To define this system as a “Democracy” would be like defining a car as “a device with windshield wipers.” Yes, the wipers are there, and may play an important part in the vehicle’s operation, but they do not characterize the operations of the machine as a whole. Another imperfect example: it would be strange to call a hamburger a “salad” simply because it happens to include lettuce as a topping (as the popular joke says: “if you want to teach something to an American, you should begin with ‘imagine a burger…’”).
While I do not think that Francis Fukuyama even came close to sufficiently exploring this contradiction inherent to our system, it would be uncharitable of me to claim that he did not imply it at least in passing. Take, for example, these two passages wherein he outlines potential counterarguments to his theory:
In large, modern nation-states, citizenship for the great mass of people is limited to voting for representatives every few years. Government is distant and impersonal in a system where direct participants in the political process are limited to candidates running for office, and perhaps their campaign staffs and those columnists and editorial writers who make politics their profession. This stands in sharp contrast to the small republics of antiquity which demanded the active participation of virtually all citizens in the life of the community, from political decision making to military service…
In a large country like America, the duties of citizenship are minimal, and the smallness of the individual when compared to the largeness of the country made the former feel not like his own master at all, but weak and impotent in the face of events he cannot control. Except on the most abstract and theoretical level, then, what sense does it make to say that the people have become their own masters?
This brings us to the next aspect of the issue: if this system is not overtly Democratic, then is it at least “Liberal”? Sure, one could say that it’s automatically illiberal for the common man to have no say in his government, but if that government keeps itself away from the personal affairs of its citizens, wouldn’t that be enough to warrant the descriptor?
This is an interesting line of inquiry but, as evidenced by the previous chapter’s recounting of the events of 2020, it’s not one that we must concern ourselves with, as it should be sufficiently obvious that the government does not keep itself out of personal affairs. While it may be difficult to label the American government a truly Totalitarian regime, the federal government has encroached more and more on the activities of its citizens, to the point where much of this encroachment has been effectively normalized.
For instance, the existence of PRISM, a massive surveillance operation that partnered the government with many corporations in order to monitor online activity, may have been shocking when it was leaked to the public in 2013, but nowadays such surveillance is viewed as mundane. As much as it pains me to admit, it seems that the whistleblower, Edward Snowden, had turned himself into a fugitive and fled to Russia all for naught. In the end, the federal government got its way regardless.
Countries of a European persuasion, like Germany and the UK, have fared even worse in terms of personal liberty, with police officers visiting the houses of people accused of posting “offensive” content online to either give them a stern “talking to” or arrest them. The insistence that their “Liberal Democracies” must be defended with draconian limitations on public speech is mind-numbingly absurd, to say the least, but let us not entertain the delusion that the people ever needed to be convinced of these policies’ effectiveness in the first place.
Indeed, it seems many of the most important decisions regarding our societies, for instance war, immigration, and the aforementioned surveillance state, are decided for us rather than by us, making a mockery of the countries’ designation as “liberal democracies” and thus jeopardizing their claims to legitimacy. Any attempt to point out these issues or complain about them is also met with hostility, if not from the government itself (as is the case with the European examples I gave above), then from the government’s propaganda apparatuses.
In his 2009 work “After the Future,” Franco Berardi describes the modern “Liberal” state of affairs as follows:
It’s a strange word – “liberalism” – with which we identify the ideology prevalent in the posthuman transition to digital slavery. Liberty is its foundational myth, but the liberty of whom?... In neoliberal rhetoric, the juridical person is free to express oneself, to choose representatives, and be entrepreneurial at the level of politics and the economy. All this is very interesting, only that the person has disappeared, what is left is like an inert object, irrelevant and useless. The person is free, sure. But his time is enslaved. His liberty is a juridical fiction to which nothing in concrete daily life corresponds.
Yet while the power wielded by the typical citizen of the modern Liberal Democracy is slim, it is undeniably there. We can still go out and cast our ballot for a limited selection of candidates and thus shift the overall tone of our country’s legislative efforts. And, if everything eventually goes to shit, we have the option of rebelling and, in a worst-case scenario, committing violence (this is not something which most will come right out and say in polite company, but politics is first and foremost a matter of who wields violence over whom). This poses a problem to the state which, as a whole, would prefer to remain permanent and enact its plans into the foreseeable future.
In order to rectify this, the state creates consensus through various means. It can use direct messaging (propaganda from the departments themselves), but this is typically ineffective, as most people by now distrust government messaging. Instead, consensus is more often generated via a wide, loosely-connected set of institutions that are technically non-governmental but are usually still connected to the state in some way, e.g. universities, non-profits, think tanks, and news media organizations. Collectively, these have been dubbed “The Cathedral” by blogger Curtis Yarvin, aka “Mencius Moldbug.” This term, in its most basic sense, means “academia and journalism,” but there’s more nuance to it than what appears on the surface. Yarvin states on his Substack blog Gray Mirror:
The mystery of the cathedral is that all the modern world’s legitimate and prestigious intellectual institutions, even though they have no central organizational connection, behave in many ways as if they were a single organizational structure.
Most notably, this pseudo-structure is synoptic: it has one clear doctrine or perspective. It always agrees with itself. Still more puzzlingly, its doctrine is not static; it evolves; this doctrine has a predictable direction of evolution, and the whole structure moves together.
He goes on to point out that this means the institutions are selecting for some common denominator. But what exactly is it? “... there is a market for dominant ideas,” Yarvin says.
A dominant idea is an idea that validates the use of power… A dominant idea is an idea that tends to benefit you and your friends. A dominant idea will be especially popular with your friends and former students in the civil service, because it gives them more work and more power.
As far as the cathedral is concerned, this spells not only dominance for the government and the many “experts” who fill its ranks, but also for the market, from which it derives much of its funding. It’s not controversial at this point to say that large corporations have far more say in how the government works than the average civilian. In fact, this may be one of the few truisms that even the politically unengaged would repeat if asked for an opinion on their governmental entities.
Also of note is the manner in which consensus is created, with a preference for more feminine or emotional forms of persuasion rather than authoritarian force. Punishment is still meted out, but through indirect means framed as the logical social consequence of a decision. We saw this phenomenon play out in the pandemic era, with people being shamed for transgressing against new social norms introduced only days earlier, and again with the so-called “racial reckoning” that lasted well after 2020.
If you were on record saying anything against the new orthodoxy, you would inevitably be subjected to a myriad of smears and accusations, and were liable to be dragged through the digital panopticon, subsequently disowned from social circles, and blacklisted from industries. This puts into perspective a particularly insidious aspect of this “Liberal” form of manipulation: the idea that it is not enough to simply obey the laws; they must also beat you down mentally until you agree with them. The masters of our society, whether they be of a governmental or corporate nature (and there is, as we will learn, increasingly less difference between the two), want not only your body, but your mind.
The joint effort between the government and corporate America to maximize economic growth, and by extension to mold the public via the above-described consensus-building apparatuses, has been named “Neoliberalism” by many on both the political left and right. In “After the Future,” Berardi explains Neoliberalism like this:
What neoliberalism supported in the long run was not the free market, but monopoly. While the market was idealized as a free space where knowledge, expertise and creativity meet, reality showed that the big groups of command operate in a way that far from being libertarian, introduce technological automatisms, imposing themselves with the power of the media or money, and finally shamelessly robbing the mass of shareholders and cognitive laborers.
This “monopoly” does not merely extend to a particular industry, but rather to the whole of existence, as corporations gain ever more sway over the economy, the government, and the minds of the citizens. In this sense, it is not only government overreach that impinges on the liberty of modern man, but also our economic institutions, which demand more and more compliance and physical/mental labor from the average worker/consumer.
That is not to say that the citizen is a passive actor in all of this. I believe the powerlessness of the average man has gradually taken hold over the public consciousness and has led to some strange events and behaviors over the past decade. For instance, the January 6th, 2021 riot at the United States Capitol Building after the supposed triumph of Joseph Biden in the previous year’s presidential election reflected the public’s growing distrust of America’s electoral system and “experts.” Even the narrow choice given to us by America’s so-called Democracy was increasingly seen as a sham, and the riot was an expression of this.
And let us not assume that experts haven’t suffered some losses. Donald Trump’s subsequent re-election in 2024 can be viewed as a rebellion of the electorate against the tyranny of the unelected American bureaucracy outlined in this chapter, as Trump campaigned specifically on the prospect of firing many of the unelected bureaucrats using the slogan “drain the swamp.” How much Mr. Trump will be able to change during his second term in office, or even how much he will be willing to change, still remains to be seen. There are good reasons to believe that any gains Americans do happen to make against the machine will only be fleeting, reasons we will cover in due time.
Another effect that this disenfranchisement has had on the psyche of the public has been the intensification of political discussion, as well as the politicization of everything in one’s personal life. Many have become deranged in their political preoccupation, to a nearly mystical degree, decorating their houses and cars with stickers listing off political causes as if they were magical totems and amulets. They abandon and disown family members over differences of opinion on said causes while filling their friend groups with only those who are willing to march in lockstep.
Ironically, many of the aforementioned causes are not existential ones; rather, they increasingly involve (at least from the view of the past) petty identities carved out in the niches allowed by our societal order, whether they be “LGBTQIA+,” black ethno-narcissists, or pornography enthusiasts. We have even seen a not-insignificant resurgence of the (as Francis Fukuyama has pointed out) thoroughly disgraced “theory” of Communism, although not in any form which can be said to resemble Marxism or Stalinism in any meaningful way, but more on that subject later…
Given all we have discussed so far, it’s clear that the label of “Liberal Democracy” no longer quite fits. Like a child growing out of his previous wardrobe, that particular shirt is now far too uncomfortable to wear. Instead, what we’re witnessing is the evolution of something different, and perhaps altogether stranger.
But before we go into detail there, I want to outline a few more issues with Fukuyama’s theory, starting with the problem of “the environment.”
II. The “Environmental” Problem
The problem of the environment will likely go down as one of the most criminally understated issues of our time. Take, for instance, the subject of single-use plastics: plastics that are made to serve one purpose alone before being thrown out. These are usually non-recyclable and non-biodegradable, meaning they stick around for a very, very, very long time. “Every year, the world produces nearly 400 million tons of plastic, a 19,000% increase from 1950,” claims a 2024 article by Yvaine Ye for the University of Colorado Boulder’s news outlet Boulder Today. She continues:
The amount is forecast to double by 2050 and 90% is never recycled. Over half of the plastics produced are used only once, for things like packaging, utensils and straws.
“A lot of people have a hard time imagining that,” said Phaedra Pezzullo, associate professor in the Department of Communication at CU Boulder. “But we produce an astronomical amount of plastics every day. Most plastic bags are used for less than 12 minutes, but they last on the planet for hundreds of years.”
According to the Natural Resources Defense Council, more than nine billion tons of plastic has been brought into existence since the 1950s, with over half of that having been produced since 2000. “Certain uses for plastic are not only reasonable but important, such as surgical gloves,” writes Courtney Lindwall on the organization’s website. “But these cases make up a small fraction of single-use plastic. More than half of non-fiber plastic, which excludes synthetic fabrics like polyester and nylon, comes from plastic packaging alone, much of which is for single-use items.”
The high prevalence of plastics in our environment has led to the emergence of other problems, like that of “microplastics”: small bits of plastic measuring under 5 millimeters in length. These have been found in many places throughout the human body, including (as was recently revealed to the shock of internet denizens) human semen and follicular fluid. CNN reports in a July 2025 article:
A small group of 25 women and 18 men participated in the research, published Tuesday in the journal Human Reproduction. Microplastics were detected in 69% of the follicular fluid samples and 55% of the seminal fluid samples. Follicular fluid is the liquid that surrounds an egg in an ovarian follicle…
“Previous studies had already suggested this possibility, so the presence of microplastics in the human reproductive system is not entirely unexpected,” said lead research author Dr. Emilio Gómez-Sánchez, director of the assisted reproduction laboratory at Next Fertility Murcia in Spain, in a statement provided to the press. “What did surprise us, however, is how widespread it is. This is not an isolated finding — it appears to be quite common.”
Keep in mind that we do not currently know the ramifications of this, but it is nonetheless alarming, especially considering the decline in human fertility that we’ve witnessed over the past few generations. Even more concerning, however, is another 2025 study which confirmed “the presence of MNPs [microplastics and nanoplastics] in human kidney, liver and brain.” It elaborates:
MNPs in these organs primarily consist of polyethylene, with lesser but significant concentrations of other polymers. Brain tissues harbor higher proportions of polyethylene compared to the composition of the plastics in liver or kidney, and electron microscopy verified the nature of the isolated brain MNPs, which present largely as nanoscale shard-like fragments.
The effects of this too are not known, but the study does mention that “even greater accumulation of MNPs was observed in a cohort of decedent brains with documented dementia diagnosis,” the implication of which is staggering, to say the least.
Regardless, the fact is that we are altering the environment of our planet for the foreseeable future through the use of plastics alone. This does not even take into account other massive issues like deforestation, overfishing, and the hideousness of factory farming, all of which are well known to the general public but ultimately ignored. It’s no exaggeration to say that an in-depth exposition of all of these topics would produce multiple volumes in itself; such an endeavor is thus well outside the scope of this series.
The lack of action we currently see is not due to our institutions being unaware of the environmental harm inflicted by our industrial and postindustrial societies (we actually know quite a lot about how we’ve been changing the planet); rather, there is a general lack of initiative, primarily for reasons of efficiency.
A corporate entity’s duty is, first and foremost, that of production. It manufactures goods or provides services and makes sure these are accessible to the end-user. It does not, however, need to consider what happens afterwards, as this would be a burden on the corporation and would thus interfere with optimal production capacities. For example: the “end of life” of a product is very rarely taken into account and, when it is, we are provided with only minimal measures (like recycling symbols on plastic products).
It is often said that corporate environmental negligence can be chalked up to mere profit motive, and while there is indeed some truth to this, the problems we are currently seeing are well beyond the scope of what corporations are capable of solving. These are entities created for the purpose of ever-increasing production. To task them with fixing the environmental problems that they have contributed to would require a complete restructuring of what we know as the corporate animal, and thus the dissolution of our entire hyper-consumerist society. It is, to put the matter simply, a fundamental problem, one that cuts to the very foundation of technological society.
We could, of course, appeal to the government to solve these issues, and many activists do, but what can it really accomplish? The government is, in essence, built on the same fundamental beliefs and values which have led to modern Corporatism. The assumptions under which corporations operate are the same ones the government operates under, i.e. it is always good to optimize the processes of production and implement technologies as soon as they are made available. It was, after all, this ability that led to the triumph of our supposed Liberal Democracy over regimes like the USSR, so the loss of it would inevitably result in a crisis of legitimacy. Is the government really doing its job if it is not giving us a global competitive advantage?
One could argue that the government represents the collective “will of the people,” and thus has a moral obligation to act, but who decides what constitutes this supposed “will”? This issue was certainly never put to a vote, and even if it were, how much would the government actually be willing to do? Most likely the bare minimum to avoid having to shoulder any responsibility, which would be very little. “Responsibility” in matters like these is very diffuse in our neoliberal system, after all; it is not located in any central place that can be reliably pointed to. The buck can always be passed off to the next chump.
Besides, it could be said that “doing nothing” actually is the “will of the people.” Consider: would the public even want the measures required to combat the harm we’ve inflicted on our environment to be undertaken? I implore you to ponder this question deeply. It seems highly probable that any severe contraction of the economy, even in the pursuit of objectively noble goals like “saving the planet,” would be deeply unpopular with the nation’s constituents, and would thus result in a devastating democratic defeat. Even more unpopular would be the rollback of any technological and/or societal progress made in reaching our current state of affairs. It may be easier for people to deal with the loss of lives than with the loss of perceived “progress.”
“But what of the experts?” one may ask. “If they have so much sway over the government and public opinion, then surely they could help solve these problems.” This line of reasoning has been adopted by many in left-leaning political spheres, and it indeed makes sense. After all, it is the experts who oversee technological implementation, and it is also primarily the “expert class” who champions the NGOs pushing awareness of these issues and presses for more corporate initiatives around “sustainability” and the like. Surely if anyone was positioned to make an impact in this field, it is them.
There is, however, a catch. The experts are only considered “experts” within the pre-existing system. To fully convey the magnitude of our environmental impact and the social reorganization which would need to be enacted in order to rectify it would be to undermine their own credentials, nay their very raison d’être. Therefore the most sensible path for the expert to take is that of half-measures. We’ve seen this with the World Economic Forum’s efforts to engineer solutions through corporate apparatuses, efforts that are now forever associated with the line “you will own nothing and be happy.”
This line is derived from a 2016 essay by Danish parliamentarian Ida Auken entitled “Welcome to 2030. I own nothing, have no privacy, and life has never been better.” In the text, she outlines her vision of a future where property has been abolished and all resources are free, with most work being done by robots and artificial intelligence. It’s worth noting that the essay does not entirely match up with the WEF’s actual initiatives, but the recontextualization of the phrase “you will own nothing and be happy” to fit the context of technological megacorp dictatorship (a far more likely future than Auken’s utopian meanderings, if our current trajectory is to be considered) was inevitable.
All in all, it may still very well be the case that the “experts” are the best positioned to ameliorate environmental catastrophe. After all, they have the ear of both government officials and corporate executives and have thus been able to enact moderate changes to the behaviors of their respective institutions. Yet their success has been quite small. That doesn’t answer the question of whether or not we would even want them to reshape our society, of course. As G. K. Chesterton warned in his book “Orthodoxy”:
The virtues have gone mad because they have been isolated from each other and are wandering alone. Thus some scientists care for truth; and their truth is pitiless. Thus some humanitarians only care for pity; and their pity (I am sorry to say) is often untruthful.
This statement has even more terrifying implications when we take into account the topics discussed later in this book, but let us put them aside for now.
Lastly, we arrive at the subject of the activists, who were characterized by Francis Fukuyama in his book “The End of History and the Last Man” as the saving grace of the Liberal Democratic system regarding problems of the environment. He writes:
As a whole, democratic political systems reacted much more quickly to the growth of ecological consciousness in the 1960s and 70s than did the world's dictatorships. For without a political system that permits local communities to protest the siting of a highly toxic chemical plant in the middle of their communities, without freedom for watchdog organizations to monitor the behavior of companies and enterprises, without a national political leadership sufficiently sensitized that it is willing to devote substantial resources to protect the environment, a nation ends up with disasters like Chernobyl, or the desiccation of the Aral Sea, or an infant mortality rate in Krakow that is four times the already high Polish national average, or a 70 percent rate of miscarriages in Western Bohemia.
Ironically, it is the activist class which has been the most thoroughly delegitimized in the eyes of the public since the publication of Fukuyama’s book, having proven not only impotent but also having seen its image come to be defined by ridiculous stunts and public nuisance. As an example, we can take a look at the activist group Just Stop Oil’s campaign to raise awareness by throwing soup onto famous paintings in museums. In a video taken by Just Stop Oil, their activists said of those arrested for their participation in the stunt: “Future generations will regard these prisoners of conscience to be on the right side of history” (yep. There it is).
The Unabomber, Ted Kaczynski, pointed out in his manifesto that Leftist activists tend to have a certain “masochistic” bent, writing:
Leftists protest by lying down in front of vehicles, they intentionally provoke police or racists to abuse them, etc. These tactics may often be effective, but many leftists use them not as a means to an end but because they PREFER masochistic tactics. Self-hatred is a leftist trait.
This seems to ring true, and has been to their detriment in these particular cases. More baffling to me, however, is their fixation on causes that, under the current paradigm, are doomed to fail, like the idea of ceasing all uses of oil in order to prevent further alterations to the Earth’s atmosphere. Without the implementation of a proper replacement for said oil, there is no world in which developed countries will willingly surrender the use of fossil fuels, not only because it would be massively unpopular with the populace, but also because it would put them at a severe competitive disadvantage on the global stage. But maybe the impossibility is the point. Perhaps it’s merely the martyrdom that they seek instead of true societal change.
This is not to say such dramatic change is entirely impossible, but it is not as simple as making demands of the government and corporations; rather, the very foundations of modernity would need to be reassessed and new civilizational engines forged from entirely different outlooks. Needless to say, the current crop of dissidents, particularly those on the left, are not able to make this happen, and have even, to a large degree, adopted the foundational aspects of modernity as permanent fixtures, whether knowingly or unknowingly.
This phenomenon has not gone unnoticed by Leftist intellectuals. For instance Mark Fisher, in his 2009 book “Capitalist Realism,” bemoans the fact that “protests have formed a kind of carnivalesque background noise to capitalist realism, and the anti-capitalist protests share rather too much with hyper-corporate events like 2005's Live 8, with their exorbitant demands that politicians legislate away poverty.” This gives activist causes a sort of unreal, or “hyperreal,” quality, using phrases and symbols associated with the causes to funnel any will to change our circumstances back towards consumeristic impulses, thereby further exacerbating the original problems. He later connects this with environmentalist, or “green,” activism, writing:
At one level, to be sure, it might look as if Green issues are very far from being ‘unrepresentable voids’ for capitalist culture. Climate change and the threat of resource-depletion are not being repressed so much as incorporated into advertising and marketing… Yet environmental catastrophe features in late capitalist culture only as a kind of simulacra, its real implications for capitalism too traumatic to be assimilated into the system. The significance of Green critiques is that they suggest that, far from being the only viable political-economic system, capitalism is in fact primed to destroy the entire human environment.
This is yet another phenomenon that doesn’t appear to make sense through the lens of pure logic, yet it nonetheless really happens. This will be a common theme in this work, as dealing with the complexities of modern society inevitably entails the scrutiny of outwardly nonsensical beliefs and behaviors which work to hide certain uncomfortable truths just below the surface.
In any case, the issue of environmentalism is one that the Liberal Democracy championed by Francis Fukuyama has utterly failed to address, and this will only become more inescapable in time.
III. The “Equality” Problem
Marx once wrote in “Capital” that “the realm of freedom actually begins only where labour which is determined by necessity and mundane considerations ceases.” In “The End of History and The Last Man,” Francis Fukuyama interpreted this as follows:
The Marxist realm of freedom is, in effect, the four-hour working day: that is, a society so productive that man's labor in the morning can satisfy all of his natural needs and those of his family and fellows, leaving him the afternoon and evening to be a hunter, or a poet, or a critic.
Of course, as we covered back in Part 2, the attempted application of Marxist theory in the structuring of human society has never led to this promised freedom. Instead, we saw a rise in the standard of living for those in capitalist countries, while those living in ostensibly Communist ones floundered. Fukuyama continues:
… if the "necessary labor time" required to satisfy basic physical needs was four hours on average for workers in socialist societies, it was on the order of an hour or two for corresponding capitalist societies, and the six or seven hours of "surplus labor" time that rounded out the working day did not go only into the pockets of capitalists, but allowed workers to buy cars and washing machines, barbecues and campers. Whether this constituted a "realm of freedom" in any meaningful sense was another matter, but an American worker was far more fully liberated from the "realm of necessity" than his Soviet counterpart.
On this point, it is difficult to argue that Fukuyama is incorrect. An often-stated irony of American life is that our poor can be quite fat. It’s evident that, for the majority of Americans, our most basic needs are, at the very least, accounted for. But are people happy with this state of affairs? The fact that Gallup reported “new heights” for rates of depression in 2023 suggests that this is not the case. They write in their article on the topic:
The percentage of U.S. adults who report having been diagnosed with depression at some point in their lifetime has reached 29.0%, nearly 10 percentage points higher than in 2015. The percentage of Americans who currently have or are being treated for depression has also increased, to 17.8%, up about seven points over the same period. Both rates are the highest recorded by Gallup since it began measuring depression using the current form of data collection in 2015.
In our age, there is a tendency to individualize mental illness as much as possible. When it is not attributed to some kind of “chemical imbalance” of the brain, then it is always blamed on the circumstances of one's personal life. While I cannot deny that either of these are common causes of afflictions like depression, the dramatic increase over the years implies an obvious society-wide environmental factor that is hardly ever properly accounted for. Not that it is never noticed, mind you (many people seem at least vaguely aware of the phenomenon), but there can never be a solution for such a thing within the confines of psychology, so it goes unaddressed.
Granted, we can’t assume that this era of mass depression is entirely due to the conditions of neoliberal modernity, especially considering that 2023 was only a few years after the Coronavirus pandemic (which was discussed at length back in Part 3). However, the Gallup article admits that the depression rate had already been “slowly rising in the U.S. prior to the COVID-19 pandemic,” meaning the pandemic was far from the sole cause.
I would argue that the quantitatively-oriented nature of our technological modernity has been one of the prime driving forces in all of this. Even though, as Fukuyama noted, our basic needs are more easily met these days, the length of the work day does not seem to be diminishing, and expectations around work are still quite taxing, both mentally and physically. Our social and private lives are too often consumed by economic matters, to the point where even the idea of formulating a healthy “work-life balance” becomes a farce.
In fact, “work” has dominated our lives to such a degree that a new term was coined to help properly illustrate it: 4HL, the “4-hour Life.” The man who popularized this term, Paul Skallas, laid out the details of the “4HL” on his blog “The Lindy Newsletter”:
A modern employee, let’s use a white collar office worker as an example, will usually get around 8 hours of sleep, will work around 8 hours a day, spend 4 hours commuting going to the gym, having meals, and then end up with 4 hours to him or herself during the day. So they own 4 hours. We can call this class of people 4HLers. It may seem vulgar to reduce a person’s life to the amount of hours he has free in a day, but then again, how many hours you have to yourself is kind of a big deal. Time is an important concept to a 4HLer. He is obsessed with time. He has to be. His livelyhood [sic] depends on it. So he has an alarm clock, because he has to wake up at the same time everyday, he has to take lunch around the same time everyday and get off of work around the same time everyday. He has deadlines that are due and he has to be reliable. It’s his job to be reliable.
Life in the 4HL is contrasted with life in the “12HL,” which consists of individuals who are self-employed and can decide for themselves what they do with their time. Thus inequality is not necessarily restricted to the realm of material goods and the satisfaction of bodily needs; it also extends to freedom of time: a temporal inequality, if you will.
This is not a new observation, however. In “After the Future,” written back in 2009, Franco Berardi attributes the lengthening of the workday to the switch from physical labor to “cognitive labor” as the economy became more informationally oriented in order to perpetually increase levels of production:
Cognitive labor became the leading sector of global production, taking the shape of the economy-driven ideology and identifying itself with the entrepreneurial function, participating at the forefront of mass financial capitalism (dotcom-mania). In the meantime the length of the working day, which had been in decline up until the end of the 1970s, began to increase again after the world victory of liberalism. The free time gained through a century of workers’ struggle was progressively subsumed to the rule of profit and transformed into fragmented and diffused labor. Social energies were progressively subsumed to economic competition. Those who didn’t run got run over. Society started running frantically and many broke down.
Berardi explains that “cyberspace,” which is the perpetually-expanding technological infosphere that the modern economy has made its domain, is theoretically unlimited. On the other hand, what he calls “cybertime,” or the social attention of the human brain (which perceives reality chronologically, i.e. in units of time), is quite finite. Due to this limitation, it was necessary to make “time” the primary commodity of the information age. Thus “capital no longer recruits people, but buys packets of time, separated from their interchangeable and occasional bearers,” Berardi states.
[Note: I also wrote about this topic in Chapter 3 of my book “The Mad Laughing God,” which you can read here.]
Francis Fukuyama also foreshadowed these developments (two decades before Berardi outlined their effects) with a passage in “The End of History” which reads as follows:
Technological innovation and the highly complex division of labor has created a tremendous increase in the demand for technical knowledge at all levels in the economy, and consequently for people who — to put it crudely — think rather than do. This includes not only scientists and engineers, but all of the structures that support them, like public schools, universities, and the communications industry. The higher “information” content of modern economic production is reflected in the rise of the service sector— professionals, managers, office workers, people involved in trade, marketing, and finance, as well as government workers and health care providers — at the expense of “traditional” manufacturing occupations.
But there are potentially dire consequences to this economic structure: the demand for cybertime outweighs the supply, meaning cyberspace is forced onto cybertime, which is essentially colonized by the realm of economic production and wears down under the strain. Berardi continues:
But cybertime (the time of attention, memory, and imagination) cannot be speeded up beyond a limit. Otherwise it cracks. And it is actually cracking, collapsing under the stress of hyperproductivity. An epidemic of panic is spreading throughout the circuits of the social brain. An epidemic of depression is following the outbreak of panic. The current crisis of the new economy has to be seen as a consequence of this nervous breakdown.
The encroachment of cyberspace manifests in many ways. Advertising is the most apparent of these, plastering its bright colors and hollow smiling faces across miles and miles of cityscape and interrupting all forms of entertainment with blasts of saccharine ukulele music. As we speak, corporations are trying to figure out how to use drones to project advertisements onto the sky. If a method is invented to reliably transmit advertisements into the dreams of individuals, expect it to be utilized to the fullest extent. We can’t let those 8 hours stay unproductive, can we?
The smartphone has also essentially brought the office to everyone’s pocket; now no one has an excuse for not monitoring their email and being on call 24 hours a day. You are always at work, even when you’re asleep in your bedroom, a perpetual metaphysical “wagie cage.” This is in part due to the integral role of “communication” in the modern informationally-dense “post-Fordist” work paradigm, as Mark Fisher writes in his book “Capitalist Realism”:
The Fordist factory was crudely divided into blue and white collar work, with the different types of labor physically delimited by the structure of the building itself. Laboring in noisy environments, watched over by managers and supervisors, workers had access to language only in their breaks, in the toilet, at the end of the working day, or when they were engaged in sabotage, because communication interrupted production. But in post-Fordism, when the assembly line becomes a 'flux of information', people work by communicating. As Norbert Wiener taught, communication and control entail one another.
Work and life become inseparable.
Social media, too, is a product meant to generate profit for its company, both through the sale of advertising space and time and through the harvesting of user data, and many users also utilize it to gain traction for their own products; social media usage is thus a kind of commodity. This is not a blanket condemnation of all of the individuals involved in these practices (if I were to issue one, I would have to include myself in it, as no one would read my writing without my promotional efforts on social media), but merely a statement of how things work.
With all of this essentially unpaid cognitive labor taken into consideration, the 4-hour Life begins to look more like a 1-hour Life, if that. There is, however, another component to inequality in our current age: the seeming abandonment of big goals like raising a family or owning a house in favor of bursts of hyper-consumerism via the purchase of collectibles or designer goods.
Due to the steep increase in home prices across many developed countries, young adults are more discouraged than ever about the prospect of home ownership. Moreover, romantic relationships have become more complicated and difficult thanks to the new anxieties and expectations injected into the social milieu by the digital realm, along with various new economic woes.
The method that Millennials and Generation Z seem to have developed in order to cope with these circumstances is to buy small luxuries in accordance with their consumption-based identities rather than save up for things that feel unattainable from their vantage point (we will explore “consumption-based identities” in a later chapter). This practice is sometimes known as “doom-spending,” which an article from the site Robb Report called “Young People Are ‘Doom Spending’ on Luxury Goods. Here’s Why” explained like this:
While historically, trends show that when economic times get tough, people save more, younger generations are flipping that on its head. With a high cost of living, hefty student-loan debt, and a tough labor market, many people don’t think they’ll ever be able to buy property, have kids, or retire with a loaded bank account. Since that all feels out of reach, the money that may have been put toward those traditional markers of adulthood is simply being spent now.
As the above passage suggests, this situation has functionally locked many young adults out of what has been, for several generations, conventionally viewed as an “adult life.” This is paired with and further exacerbated by what Mark Fisher refers to as “depressive hedonia,” which has spread through the young adults of the developed world. He explains:
Depression is usually characterized as a state of anhedonia, but the condition I'm referring to is constituted not by an inability to get pleasure so much as it by an inability to do anything else except pursue pleasure. There is a sense that ‘something is missing’ — but no appreciation that this mysterious, missing enjoyment can only be accessed beyond the pleasure principle.
But ironically enough, it is not this state of affairs that most political protests rail against. As referenced previously, much political activism is now focused on hyperreal issues of primarily symbolic importance, particularly matters of “identity.” The most visible and well-monied of the factions involved in this kind of activism is the “Pride” movement, recognizable via its frequent use of the rainbow “Pride flag” and its seemingly unlimited variations.
Up until a decade ago, the primary aim of the Pride movement was the achievement of “marriage equality,” i.e. the right of homosexuals to marry each other. Yet once this matter was essentially settled through the Supreme Court’s Obergefell v. Hodges decision in 2015, the “Pride” movement did not disperse. If anything, it only became more active in subsequent years, raking in money from countless organizations for the purpose of holding “pride parades” every June (which, for some baffling reason, is now designated “Pride month” by governmental and corporate institutions across the country).
The primary driving force behind Pride’s intensification has been the adoption of “transgenderism” as a political cause. The formalization of “transgenderism” can be traced back to Germany’s pre-war Weimar Republic, in particular “The Institute for Sexual Research” (Institut für Sexualwissenschaft) in Berlin, headed by Magnus Hirschfeld. A 2021 article for Scientific American entitled “The Forgotten History of the World's First Trans Clinic” elaborates on the various experiments performed by the institute:
The institute would ultimately house an immense library on sexuality, gathered over many years and including rare books and diagrams and protocols for male-to-female (MTF) surgical transition. In addition to psychiatrists for therapy, he had hired Ludwig Levy-Lenz, a gynecologist. Together, with surgeon Erwin Gohrbandt, they performed male-to-female surgery called Genitalumwandlung — literally, “transformation of genitals.” This occurred in stages: castration, penectomy and vaginoplasty. (The institute treated only trans women at this time; female-to-male phalloplasty would not be practiced until the late 1940s.) Patients would also be prescribed hormone therapy, allowing them to grow natural breasts and softer features.
What the article fails to mention (likely on purpose) is that one of the recipients of this “transformation of genitals” surgery was a man named Einar Wegener, who also went by the alias Lili Elbe. Einar received not only a primitive vaginoplasty through the institute’s efforts, but also an experimental uterine transplant which ultimately led to illness and a prolonged, painful death. In spite of these grisly details, the story of Lili Elbe is held up as a triumph for “trans rights” by the Pride movement to this day.
The modern conception of “transgenderism” started as something of an oddity that was closely tied to the realm of mental illness. Under the name “transsexualism,” it was categorized in the 1987 revised third edition of the “Diagnostic and Statistical Manual of Mental Disorders” (DSM-III-R) under “disorders usually first evident in infancy, childhood, and adolescence” and described as follows:
The essential features of this disorder are a persistent discomfort and sense of inappropriateness about one's assigned sex in a person who has reached puberty. In addition, there is persistent preoccupation, for at least two years, with getting rid of one's primary and secondary sex characteristics and acquiring the sex characteristics of the other sex. Therefore, the diagnosis is not made if the disturbance is limited to brief periods of stress. Invariably there is the wish to live as a member of the other sex.
The latest revision of the DSM has removed “transsexualism” entirely and replaced it with “gender dysphoria.” The purpose of this was to normalize those who attempt to live as the other gender by classifying their feelings of discomfort as the issue rather than the behavior which results from it. At this point, the implied proposal to the general public was that they should treat “transgender” individuals as members of the other sex as a way to ameliorate a form of intense mental agony, something that much of the population seemed agreeable to. Playing pretend at such a level was a small concession to make in order to ease someone’s suffering.
However, the creep of “trans rights” expanded onwards, and eventually this agreement with the public was no longer sufficient. People could no longer pretend that “trans women” were women, they had to actually believe they were women. “Trans women are women” became the new catchphrase parroted throughout social media and countless op-eds, and the idea that gender was “assigned at birth” rather than “observed” was suddenly a hot talking point. “When we say women, that word always includes trans women,” writes the Human Rights Campaign on their website. “There’s no ifs, ands or buts about it. A woman’s gender identity is her innermost concept of being female. A trans woman’s gender identity doesn’t define or caveat her womanhood, it simply describes her journey to womanhood.”
This is further complicated by the simultaneously espoused idea that “trans” individuals do not necessarily have to conform with the gender identity they claim to hold. From the website for the organization “Advocates for Trans Equality”:
Being gender non-conforming means not conforming to gender stereotypes. For example, someone’s clothes, hairstyle, speech patterns, or hobbies might be considered more "feminine" or "masculine" than what's stereotypically associated with their gender… transgender people may be gender non-conforming, or they might conform to gender stereotypes for the gender they live and identify as.
As the earlier quoted passage claims, one’s gender identity is an “innermost concept,” meaning it is a quality that is only to be felt by the “trans” individual in question and expressed in the manner in which they personally feel is appropriate. Thus “gender identity” is reduced to a vague psycho-spiritual state, consisting of “vibes” and impulses thought to be derived from an ill-defined “self.”
This is all turned on its head, however, when the Pride movement discusses its current pet project, that being the validation of “trans” identity and the provision of “gender affirming” medical interventions for children, something that the greater public would have instantly rejected as abominable just a couple decades ago (and very many still do). Whenever this initiative is opposed, the conversation around “transgenderism” instantly reverts back to the subject of mental illness. Rather than assisting the children in the pursuit of a nebulous pseudo-spiritual gender enlightenment, it becomes a matter of supplying them with “life-saving medical care.”
This all points to the seldom stated fact that there was, is, and will be no consensus on what “transgenderism” actually is because the fight for “equality” has become the purpose in and of itself and all rhetoric is inevitably bent in service of it. Francis Fukuyama indeed predicted that people would eventually add new “rights” to their conception of Liberal Democracy, writing in “The End of History”:
… almost all liberal democracies have seen a massive proliferation of new “rights” over the past generation. Not content merely to protect life, liberty, and property, many democracies have also defined rights to privacy, travel, employment, recreation, sexual preference, abortion, childhood, and so on. Needless to say, many of these rights are ambiguous in their social content and mutually contradictory. It is easy to foresee situations in which the basic rights defined by, say, the Declaration of Independence and the Constitution, were seriously abridged by newly minted rights whose aim was a more thoroughgoing equalization of society.
However, what he (understandably) did not know at the time was that the object of these new rights could also be entirely fabricated should individuals feel the need to make it happen. Due to the narrow confines of modernity, with freedom being largely defined through either entrepreneurship or mere choice in consumption, the fight for “transgenderism” has been one of the few socially acceptable alternative means for foisting one’s will onto others. It is not only the sexual and aesthetic aspects of transgenderism which appeal to many of its adherents (although those certainly play a role), but also the idea that they can bend those around them to their whims.
The ability to force others to accept the view of the world that you have created is, in essence, a powerful domination of those individuals, resulting in a kind of gratification that extends well beyond what can be derived from getting the state to provide you with material goods. It is, of course, a very “hyperreal” domination, mostly reliant on compelling participation in symbolic gestures like the usage of alternative “pronouns,” yet this is quite fitting for such a hyperreal society.
In a sense, the “gender ideology” that emerged over the past couple of decades is a method of influencing others and finding solidarity crafted to fit the internet age, where technological priority is accepted as an unchangeable fact of reality and the power of spectacle reigns supreme. Put more simply: It is a product of our time; a proprietary derangement of the “end of history’s” slow unraveling.
This is not to deny that there are mentally ill people who honestly struggle with what has been labeled “gender dysphoria,” and I truly sympathize with such individuals. But we must also acknowledge that they have long been left in the dust as the raison d’être of the rainbow egregore, and the idea that this is solely about the public’s happiness and mental well-being does not hold up to even a modicum of scrutiny.
One effect of the proliferation of the “trans” wing of the Pride movement has been the emergence of several similar groups. For instance, the “non-binary,” who claim to have no gender. What exactly this is supposed to look like in practice has never been properly established, but that’s not the point. It is, in fact, another power game played via a psycho-spiritual concept for the ultimate purpose of self gratification.
Another group is the “Therians”; those who say they have the souls of animals or mythological creatures. We do not have to concern ourselves with arguments for or against this faction, as it is not taken very seriously by the vast majority of the population, but I wanted to acknowledge its existence for the purpose of illustration. It’s also worth noting that the reasons for the proliferation of these groups are not limited to those explained here, and we will explore more of them later on in relation to the problems of “desacralization” and “atomization.”
It may be baffling to some why so many Leftists in modern America focus on the strange psycho-spiritual matters found in the latter half of this section to the detriment of the more material problems outlined closer to the beginning. This confusion is understandable, as Leftism is ostensibly Materialist in outlook, and what is currently labeled “leftism” often does not at all resemble the thought of more economically-minded Leftist (i.e. Marx-influenced) thinkers like the ones I have quoted in this work. But this can be reconciled if we view the prevailing modern incarnation of Leftism through Ted Kaczynski’s understanding of it as a psychological phenomenon. He elaborates in his manifesto “Industrial Society and Its Future” as follows:
When we speak of leftists in this article we have in mind mainly socialists, collectivists, “politically correct” types, feminists, gay and disability activists, animal rights activists and the like. But not everyone who is associated with one of these movements is a leftist. What we are trying to get at in discussing leftism is not so much a movement or an ideology as a psychological type, or rather a collection of related types.
The psychological profile of the “Leftist,” according to Kaczynski, is characterized by two prominent features: first is a sense of inferiority that is “so ingrained that he cannot conceive of himself as individually strong and valuable.” Second is oversocialization, which is more prominent among highly educated Leftists like those in academia. Oversocialization can be viewed as the product of a societal system that is so restrictive it mentally squeezes individuals to the point of breaking. “For example,” Kaczynski says, “we are not supposed to hate anyone, yet almost everyone hates somebody at some time or other, whether he admits it to himself or not.” Thus the oversocialized person is burdened with a morality that cannot be fully adhered to, exacerbating their sense of inferiority and powerlessness.
The combination of these two components leads to Leftists who attempt to gain power over others (usually those they view as strong or superior in some way) by “rebelling” in ways that don’t actually encroach upon the already established values of our society, but instead operate in tandem with them. Kaczynski continues:
The leftist of the oversocialized type tries to get off his psychological leash and assert his autonomy by rebelling. But usually he is not strong enough to rebel against the most basic values of society. Generally speaking, the goals of today’s leftists are NOT in conflict with the accepted morality. On the contrary, the left takes an accepted moral principle, adopts it as its own, and then accuses mainstream society of violating that principle. Examples: racial equality, equality of the sexes, helping poor people, peace as opposed to war, nonviolence generally, freedom of expression, kindness to animals.
More fundamentally, the duty of the individual to serve society and the duty of society to take care of the individual. All these have been deeply rooted values of our society (or at least of its middle and upper classes) for a long time. These values are explicitly or implicitly expressed or presupposed in most of the material presented to us by the mainstream communications media and the educational system. Leftists, especially those of the oversocialized type, usually do not rebel against these principles but justify their hostility to society by claiming (with some degree of truth) that society is not living up to these principles.
When we observe modern Leftism through this lens, along with the fixation on hyperreality outlined earlier in this book, it becomes clear why Pride has taken such a prominent position among its ranks, as well as why Leftists were so ready to allow the Marx-influenced framework that we now know as “Wokeness” to develop into what is essentially a form of consumerism (once again, see “The Mad Laughing God,” Part III for more on this). Their complaint is not that the fundamental beliefs and values of our secular hyper-consumeristic civilization are invalid, but rather that society has not done its duty in delivering the promised fruits of these beliefs and values, thus the system’s legitimacy is only partially put into question.
However, lest Francis Fukuyama potentially feel a sense of relief at this revelation, let’s not forget this particular passage from “The End of History”:
While voters in democratic countries may affirm free-market principles in the abstract, they are all too ready to abandon them when their own short-term, economic self-interest is at stake. There is no presumption, in other words, that democratic publics will make economically rational choices, or that economic losers will not use their political power to protect their positions.
If the Leftists described above do end up inflicting damage on the ideological makeup of our nation, it will almost certainly be the Liberal and Democratic aspects that end up on the receiving end of the hammer first. In fact, one manner in which this is already being actualized is through the leftist propensity towards “multi-culturalism,” which in itself is unsustainable, as it introduces demographics who do not share the self-destructive characteristics and outgroup preference outlined earlier.
Such a process inevitably results in the gradual intensification of ethnic competition and enclavism, which in turn erodes the nation’s remaining “Liberal Democratic” values, fostering instead a political environment of asymmetrical competition wherein the liberally socialized legacy demographics are forced to compete with tribes of new arrivals without being able to rely on the camaraderie of their peers. Under such strain, “Liberal Democracy” can only work towards abolishing itself.
So while Fukuyama saw inequality as a “less fundamental” potential contradiction than others, it’s now clear that such a dismissal will only become less tenable as we move forward into the future.
IV. The “Desacralization” problem
In early 2025, the Pew Research Center published the results of its latest “Religious Landscape Study” (RLS), a survey of 36,908 American adults. In a February article, they highlighted the results as follows:
The first RLS, fielded in 2007, found that 78% of U.S. adults identified as Christians of one sort or another. That number ticked steadily downward in our smaller surveys each year and was pegged at 71% in the second RLS, conducted in 2014.
The latest RLS, fielded over seven months in 2023-24, finds that 62% of U.S. adults identify as Christians. That is a decline of 9 percentage points since 2014 and a 16-point drop since 2007.
While they noted that the decline in religiosity among US adults has either slowed or leveled off, the levels seen in recent years are the lowest in recorded history. This should not be a surprise to anyone who has been paying attention to such matters, and it was a phenomenon that Francis Fukuyama noted several decades ago, at the end of the 80s. One of the major driving forces behind it, in his view, was freedom of belief and the sheer number of options available to the average person:
[People] are faced with an almost insuperable problem. They have more freedom to choose their beliefs than in perhaps any other society in history: they can become Muslims, Buddhists, theosophists, Hare Krishnas, or followers of Lyndon LaRouche, not to speak of more traditional choices like becoming Catholics or Baptists. But the very variety of choice is bewildering, and those who decide on one path or another do so with an awareness of the myriad other paths not taken.
More than just fancy metaphysical theory, however, religion seeks the pearl of supreme truth regarding the greater universe and our place within it, and it seeks to apply this truth to the way we live our lives. “... We must rid ourselves of any conception which looks upon religion as a function for the fulfilment of men's arbitrary will,” Kishio Satomi writes in his book “Japanese Civilization: Its Significance and Realization.” “Religious faith which is not supported by truth… always results in failure.”
But if this is the case, which religion actually holds the highest truth? Is it possible to explore each and every religion in enough depth to assess such a thing? This is but one factor that has contributed to the large-scale “desacralization” of modern society. One may ask why I use that term instead of “secularization.” The reason is that they are two separate concepts. “Secularization” implies the separation of social norms and governing institutions from the domain of religion, which has already been accomplished for some time.
The “secularized” view of religion as a mere personal endeavor meant to pacify the soul is now the default view. This is not a new or controversial observation. In fact, Professor Satomi heavily criticized this faulty conception of religion in his book “Discovery of Japanese Idealism,” which was published all the way back in 1924:
Most people, first of all, satisfy their appetite, then their carnal desire, then material desire, and, having satisfied these yearnings some of them who belong to a rather superior class listen to the preachings of the way of God to stimulate the desire of sleep. They substitute preachings and religious music for lullabies.
He writes again in “Japanese Civilization”:
Thus, actual life is religion and religion is actual. The depravity of all religions from olden times to the present day has its root in the fallacy of a vague dualism of actual life and religion. Therefore religion is justified in leading and criticizing life in all its aspects. Religion must be woven into actual life, otherwise it would appear to be of no avail.
But think about it: would such a view of religion even be possible in today’s environment? Francis Fukuyama apparently does not believe so, as it would violate the foundational principles of Democratic Liberalism, which he believes to be a sort of terminus in the evolution of civilizational organization. He writes in “The End of History and The Last Man”:
Like nationalism, there is no inherent conflict between religion and liberal democracy, except at the point where religion ceases to be tolerant or egalitarian… Christianity in a certain sense had to abolish itself through a secularization of its goals before liberalism could emerge. The generally accepted agent for this secularization in the West was Protestantism. By making religion a private matter between the Christian and his God, Protestantism eliminated the need for a separate class of priests, and religious intervention into politics more generally. Other religions around the world have lent themselves to a similar process of secularization: Buddhism and Shinto, for example, have confined themselves to a domain of private worship centering around the family.
What Fukuyama fails to note here is that this amounts not only to secularization, but also “desacralization.” For the purposes of this essay series, my use of the latter term indicates not just irreligiosity, but the total inability of the citizen to be fully religious. When public life is removed entirely from the realm of the sacred and religion becomes a mere matter of inner thought rather than a conception of supreme truth through which one governs both oneself and others, religion (all religion) is castrated and rendered harmless, and the public turns instead toward something else, abandoning conventional gods for strange new ones.
One can be as dedicated to their religion as possible in private, but this faith will inevitably be overruled by a secular and morally bankrupt state, which is (supposedly) the representation of the collective will of a people. Kishio Satomi knew this well, writing in the third chapter of “Japanese Civilization”:
The country or the state, of course, is the secondary production of human life because of the order of its origination, but as a matter of fact, with regard to our present civilized world, individual beings are preceded by the country. With regard to the method of salvation, the country must be classed as the unit…
… the country that is moral must take up as her mission the task of the guardianship and espousal of truth, morality and righteousness with all her accumulated power. However religionized a man may be, if the country is not made just, then even the man of righteousness is liable to be obliged to commit a crime in an emergency for the sake of a nation's covetous disposition.
This is the reason why the Edo-era Zen Master Suzuki Shosan was so intent on petitioning the Shogunate to recognize Buddhism as “the truth,” as at the time it was more preoccupied with the far more secular philosophy of Japanese Neo-Confucianism. If the country could recognize his faith as true, then it follows that the people eventually would as well. He is recorded in his book of sayings (“Roankyo”) as stating the following:
I want to propose my way of government, the upholding of Buddhism, formally to the authorities. Heaven hasn't let me do it yet, though. For one thing, the Buddhism which the patriarchs and predecessors have left us at the cost of bloody tears and relentless practice, has fallen to ruin because there's no official decree to support it. Our greatest problem is the way Buddhism's been dropped and gets no outside protection. Because I'm sure Buddhism will never be recognized as the truth unless the government so ordains. My deep desire is to present this proposal as boldly as I can. I'll say, 'I await most steadfastly and humbly your edict, that Buddhism shall be recognized as the truth.’
However, this kind of declaration would be impossible in our technically-inclined and supposedly Liberal Democratic system because it would run counter to its foundational values. In effect, what we are now witnessing is the death of the sacred, caused not only by the removal of the sacred from the public’s collective consciousness and the aforementioned plurality of religious frameworks available to us, but also by the total demystification wrought by the material preoccupations of modernity; a topic we will explore in more depth in later chapters (this is just a preliminary exposition of this particular set of circumstances).
Fukuyama, adopting Hegel’s understanding of religion, does not see this as a detriment. Instead it is merely a necessary casualty of the unfolding weltgeist. Religion, in his view, was important for its social utility, but does not represent a true expression of the ultimate. It is a tool to be discarded once its usefulness has played itself out. As he writes in “The End of History”:
The world's great religions, according to Hegel, were not true in themselves, but were ideologies which arose out of the particular historical needs of the people who believed in them. Christianity, in particular, was an ideology that grew out of slavery, and whose proclamation of universal equality served the interests of slaves in their own liberation…
According to Hegel, the Christian did not realize that God did not create man, but rather that man had created God. He created God as a kind of projection of the idea of freedom, for in the Christian God we see a being who is the perfect master of himself and of nature. But the Christian then proceeded to enslave himself to this God that he himself created. He reconciled himself to a life of slavery on earth in the belief that he would be redeemed later by God, when in fact he could be his own redeemer.
Yet this presents a problem which Fukuyama finds himself unable to ignore: the issue of motivation. It’s no secret that man is willing to build breathtaking monuments and cathedrals in service of a higher power, but it does not appear that he is willing to do the same for the sake of Liberal Democracy.
If Auguste Comte, often considered the father of Sociology, could not rouse the spiritual vigor of man through his non-theistic “Religion of Humanity,” then it should come as no surprise that a system composed primarily of bureaucratic drudgery would fail even more spectacularly. There is no man who would willingly sacrifice his life for the Bureau of Motor Vehicles. G. K. Chesterton writes on this matter in his book “Heretics”:
In an age of dusty modernity, when beauty was thought of as something barbaric and ugliness as something sensible, [Comte] alone saw that men must always have the sacredness of mummery. He saw that while the brutes have all the useful things, the things that are truly human are the useless ones. He saw the falsehood of that almost universal notion of to-day, the notion that rites and forms are something artificial, additional, and corrupt. Ritual is really much older than thought; it is much simpler and much wilder than thought.
A feeling touching the nature of things does not only make men feel that there are certain proper things to say; it makes them feel that there are certain proper things to do. The more agreeable of these consist of dancing, building temples, and shouting very loud; the less agreeable, of wearing green carnations and burning other philosophers alive. But everywhere the religious dance came before the religious hymn, and man was a ritualist before he could speak. If Comtism had spread the world would have been converted, not by the Comtist philosophy, but by the Comtist calendar.
By discouraging what they conceive to be the weakness of their master, the English Positivists have broken the strength of their religion. A man who has faith must be prepared not only to be a martyr, but to be a fool. It is absurd to say that a man is ready to toil and die for his convictions when he is not even ready to wear a wreath round his head for them. I myself, to take a corpus vile, am very certain that I would not read the works of Comte through for any consideration whatever. But I can easily imagine myself with the greatest enthusiasm lighting a bonfire on Darwin Day.
Regardless, modern civilization’s pervasive desacralization means that Liberal Democracy has been operating on what one could label “legacy social infrastructure” inherited from the religions of prior eras, infrastructure that is now wearing out and breaking down. This potential pitfall did not escape Fukuyama’s notice. He bemoans:
Capitalist prosperity is best promoted by a strong work ethic, which in turn depends on the ghosts of dead religious beliefs, if not those beliefs themselves, or else an irrational commitment to nation or race. Group rather than universal recognition can be a better support for both economic activity and community life, and even if it is ultimately irrational, that irrationality can take a very long time before it undermines the societies that practice it.
I would append to this the issue of societal order, which also deteriorates as religious influence is increasingly dissolved. When religious ideals are instilled in the greater public, there exists a propensity for self-governance that does not exist in a materialistic and atheistic culture. Even if this propensity is only marginal, it can make a significant difference in the aggregate, and without it order can only be established through more heavy-handed means.
More importantly, with the death of the sacred in modern civilization, what is to maintain the sanctity of Liberal and Democratic principles; to stop them from being discarded by those who wield the levers of power once it becomes more efficient or personally advantageous for them to do so? As an illustration, one can see the photos posted to Twitter (now “X”) by Belgium's Minister of Defense, Theo Francken, in 2021 (recently going viral again in September of 2025) depicting a homeless man using the eternal flame of the unknown soldier in Brussels as a stove to cook his food; a monument to valor and sacrifice reduced to mere physical utility. Practical in the strictly material sense, but ultimately robbing the flame of its transcendent importance.
With this transpiring on a global level, should we be surprised when our own societies prefer material expedience to sanctity? Should we feel shocked to find that virtue and righteousness have lost their meaning under such a cold, calculating gaze? Even if Liberal Democracy were actually as uniquely satisfying for the human spirit as Fukuyama insists, its principles would only need to fail a measure of efficiency once before being reassessed as ineffectual and antiquated.
But there is yet another facet of Liberal Democracy’s unraveling that has been dramatically exacerbated by the secularization and desacralization outlined above, that being the “atomization” problem, which we will cover next…
V. The “Atomization” Problem
I recently stumbled across a news article about a 77-year-old woman who moved onto a cruise ship, where she planned to spend the remaining years of her life. The article, from the New York Post, recounts the tale as follows:
Cruising into her golden years.
A California retiree sailed into the next stage of her life when she traded in her home in a retirement community for the open seas, where she’ll reside for the next decade.
… a former high school foreign language teacher, purchased an interior villa aboard the Villa Vie Odyssey, the world’s first perpetual cruise, since she claimed it was cheaper to live at sea than remain in the Golden State.
This story left many social media users aghast. To them, it was a perfect encapsulation of the self-centered “Boomer” view of life as a quest to accrue petty luxuries while leaving nothing for the benefit of subsequent generations. “They destroyed the world and left us nothing,” read one reply with over 14,000 likes. “The Boomer generation and its consequences have been a disaster for the human race,” said another.
But the reactions weren’t all negative. “If she’s happy, why not? Seems too [sic] me that she’s living her best life,” commented a user named Rivkahle. “Nor is it anyone’s place to comment on how she chooses to spend her money.” Another reply was more forceful, stating: “She is spending HER money, it's HER damn right.”
I was among the crowd who felt a sense of disgust upon reading the headline, but I had to ponder why that was the case. In truth, there was nothing about this woman’s plan that went against the sensibilities of modern America, where advertisements parade a phantasmagoric alternate world of luxury, carnal pleasure, and theoretical supreme satisfaction in front of your face 24 hours a day, 7 days a week. In a sense, life on a cruise ship is the ultimate manifestation of this consumeristic American dream; a resort outing full of indulgent consumption, all encapsulated in an entirely man-made environment, lasting forever.
But if we were to turn away from this dream, to spit out the saccharine artificial flavoring in want of something more natural, what does that say about the values of the society we currently inhabit? If life is not the pursuit of decadence or material gain, what exactly are we meant to do with ourselves? There is, after all, no proper “moral judgement” in the realm of Liberal Democracy outside of Liberal Democracy’s own supposed fundamental values, i.e. that of the technical. Older, more traditional moral axes needed to be effectively dissolved to make way for those more amenable to a liberal and capitalistic modernity. As Francis Fukuyama notes:
In place of an organic moral community with its own language of “good and evil,” one had to learn a new set of democratic values: to be “participant,” “rational,” “secular,” “mobile,” “empathetic,” and “tolerant.” These new democratic values were initially not values at all in the sense of defining the final human virtue or good. They were conceived as having a purely instrumental function, habits that one had to acquire if one was to live successfully in a peaceful and prosperous liberal society. It was for this reason that Nietzsche called the state the “coldest of all cold monsters” that destroyed peoples and their cultures by hanging “a thousand appetites” in front of them.
To scoff at values like “tolerance” and “diversity” is to undermine the very foundation of modern American society, and thus ally oneself with “intolerance” and the backwards mindset of ages past. In short, it’s a sin against the modern secular “religion.” And yet, despite being robbed of true belief in the sacred, in a higher morality outside of human laws and social norms, we cannot help but feel disturbed by stories like the one outlined above. Is this a holdover from previous eras, mere sentimentalism that has yet to be crushed out of us, or perhaps man grasping for something he has lost on the way?
It’s no secret that man has become more atomized than at any time in recorded history. This is not solely due to the mass secularization of life (although that has played a very significant role, if not the most significant). It is also the logical outcome of viewing social interaction through the lens of economically-minded liberalization, as Fukuyama describes in “The End of History”:
It is clear that communities held together only by enlightened self-interest have certain weaknesses with respect to those bound by absolute obligations. The family constitutes the most basic level of associational life, but in many ways the most important... But for many Americans, the family, now no longer extended but nuclear, is virtually the only form of associational life or community they know.
Yet even this limited rendition of the “family” is subject to a kind of creeping contractualism, viewing intimate personal relations on the same level as business partnerships.
Many of the problems of the contemporary American family — the high divorce rate, the lack of parental authority, alienation of children, and so on — arise precisely from the fact that it is approached by its members on strictly liberal grounds. That is, when the obligations of family become more than what the contractor bargained for, he or she seeks to abrogate the terms of the contract.
We can see this in the reduction of romantic relationships and marriage essentially to contracts. Marriage is no longer a sacred union of man and woman for the purpose of binding together a family, but rather a matter of mundane government bureaucracy.
Likewise, parenthood is a legal obligation which can be contracted off to professional childcare providers in order to maintain the two incomes per household now required for raising children. In recent years, we have even seen the proliferation of “non-monogamous” or “polyamorous” relationships which, in the absence of a profound bond between individuals, are entirely made up of contract-style agreements between parties. Christopher Lasch writes in his 1979 book “The Culture of Narcissism”:
The marriage contract having lost its binding character, couples now find it possible, according to many observers, to ground sexual relations in something more solid than legal compulsion. In short, the growing determination to live for the moment, whatever it may have done to the relations between parents and children, appears to have established the preconditions of a new intimacy between men and women.
This appearance is an illusion. The cult of intimacy conceals a growing despair of finding it. Personal relations crumble under the emotional weight with which they are burdened… The same developments that have weakened the tie between parents and children have also undermined relations between men and women.
Through our “practical” and desacralized lens, a dreadful conclusion inevitably emerges: the idea that family, especially children, and all other forms of rich social connection are not assets but hindrances. They cost time and money that could be spent on more economically productive endeavors, and thus always carry an implied opportunity cost. Moreover, in order to remain competitive, one must be flexible and mobile, aspects which are sabotaged by the baggage of community and family obligation. Francis Fukuyama continues:
Their lives and social connections are more unstable, because the dynamism of capitalist economies means constant shifts in the location and nature of production and therefore work. Under these conditions, it becomes harder for people to put down roots in communities or to establish permanent and lasting ties to fellow workers or neighbors. Individuals must constantly retool for new careers in new cities. The sense of identity provided by regionalism and localism diminishes, and people find themselves retreating into the microscopic world of their families which they carry around with them from place to place like lawn furniture.
Make no mistake, if this mode of thought has saturated society down to its most basic units, then there is no aspect of life that has remained unmolested. Indeed, the “resume” or “profile” has become the most basic measurement of a man’s worth; how valuable one can appear in terms of potential economic benefit.
This is placed into our heads at a very early age; the idea that in order to “succeed” in modern life, one has to follow all of the rules and check all the right boxes. Kids are pressured to get the best grades in their class, participate in after-school activities, go to college, work a corporate internship, slave away in their twenties, and keep their personal image as squeaky clean and politically correct as possible, and then maybe, just maybe, they will acquire the pleasurable and prosperous lifestyle that has been dangled in front of their face by the spectacle of mass media since the moment they were born.
Under our current Materialist paradigm, there is no greater pursuit than this. It is, after all, your only life; your one chance at material prosperity. To throw this away and doom yourself to poverty is essentially a form of eternal perdition. Meanwhile, men like Bill Gates and Steve Jobs are valorized like the great heroes of old; held up as exemplars of the “entrepreneurial success story,” that great heroic myth arc that neoliberal modernity has thoroughly integrated into its foundational mythology. For those who transgress against this order, the path to redemption lies in emulating this entrepreneurial hero's journey by working towards such a success.
The end result of all of this is the “entrepreneurial self”; a view of the “self” that Franco Berardi believes to have gained social prominence in the 1980s. He writes in “After the Future”:
Since the beginning of the 1980s, after the defeat of the working class movements and the affirmation of neoliberal ideology, the idea that we should all be entrepreneurs has gained social recognition. Nobody can conceive his or her own life in a more relaxed and egalitarian manner. S/he who relaxes may very well end up in the streets, in the poorhouse or in jail.
In this sense, man is placed not as one individual among several in a community or family, or even among many in a creed or nation; rather, he must learn to view himself as a sole proprietorship which must market itself to the greater world. Everything one does carries weight with regard to his potential productivity and marketability. This phenomenon did not escape the notice of South Korean philosopher Byung-Chul Han, who outlined its devastating atomizing effects in his book “Psychopolitics: Neoliberalism and New Technologies of Power”:
As the entrepreneur of its own self, the neoliberal subject has no capacity for relationships with others that might be free of purpose. Nor do entrepreneurs know what purpose-free friendship would even look like. Originally, being free meant being among friends. 'Freedom' and 'friendship' have the same root in Indo-European languages. Fundamentally, freedom signifies a relationship. A real feeling of freedom occurs only in a fruitful relationship - when being with others brings happiness. But today's neoliberal regime leads to utter isolation; as such, it does not really free us at all.
This view is not synonymous with Consumerism, which is sometimes considered to have originally emerged in the “consumer revolution” and print advertising of the 18th century (see “Consumerism in World History” by Peter N. Stearns) before coming to full fruition in the 20th with the full mobilization of mass media advertising techniques. However, the two work in tandem with one another, as the “entrepreneurial self” is the apparatus through which man is expected to ultimately achieve the “consumerist” lifestyle, and with it the promised supreme material satisfaction of modernity.
But this mindset has, as one would expect, also been thoroughly ingrained in modern expectations around work, an arrangement that ironically makes more traditional paths to meaning, like the family, more difficult to attain, thus perpetuating the cycle of desire while pushing individuals towards more consumeristic ends instead. Mark Fisher makes note of this arrangement in “Capitalist Realism,” writing:
The slogan which sums up the new conditions is ‘no long term’. Where formerly workers could acquire a single set of skills and expect to progress upwards through a rigid organizational hierarchy, now they are required to periodically re-skill as they move from institution to institution, from role to role. As the organization of work is decentralized, with lateral networks replacing pyramidal hierarchies, a premium is put on ‘flexibility’. Echoing McCauley's mockery of Hanna in [the movie] Heat (‘How do you expect to keep a marriage?’), [Richard] Sennett emphasizes the intolerable stresses that these conditions of permanent instability put on family life. The values that family life depends upon — obligation, trustworthiness, commitment — are precisely those which are held to be obsolete in the new capitalism.
And so men slave away night and day in the name of countless promised glories, the carrot on the stick driving man, the bio-machinic animal, forward until he collapses from exhaustion or nervous overload. Whenever one gets it into his head that this state of affairs constitutes a grave injustice; perhaps even an albatross weighing down the neck of humanity as retribution for his short-sighted material preoccupation, he is chastised as “selfish,” “lazy,” or “simply not working hard enough,” or otherwise funneled into one of the designated safe ideological channels for dissent (for instance, Critical Theory and its many malformed offspring). Left with no real alternative, he finds himself utterly alone.
Of course, most do not arrive at this point. In fact, they may not even question the basis of modern life in any significant way. Most, you may find, are satisfied enough that they will preserve this system if push comes to shove. We may see cracks emerge in the facade every now and then (Berardi rightfully points out the spikes in suicide and increases in depression as byproducts of this system), but as of the writing of this piece, there is no alternative that will be viewed by the public as legitimate.
This seeming contentment can at least partially be attributed to the fascinating and pernicious ability of modern capitalism to address the vacuums left by the identities, spiritualities, and communities lost to the cold inhuman winds of deterritorialization with marketable plastic replacements… for a cost, of course. This has been accomplished not in a disruptive fashion, but rather gradually and without much notice, with the all-consuming blob of the market absorbing and repurposing anything that may otherwise lie outside its grasp (as I have outlined in my work “The Mad Laughing God”). “This makes capitalism very much like the Thing in John Carpenter's film of the same name,” notes Mark Fisher, “a monstrous, infinitely plastic entity, capable of metabolizing and absorbing anything with which it comes into contact.”
In Sam Binkley’s “Getting Loose: Lifestyle Consumption in the 1970s,” the author recounts how concerns over “self-identity” used to be resolved via “fixed cultural and institutional authorities,” but after these were effectively abolished in the “shaky and changing” world of Liberal modernity, men were forced to seek solutions through “daily improvisations and choices in style of life” instead. Binkley writes:
Where once the moral guidance embodied by religion and the state, or the sense of affiliation derived from class and community membership, was enough to tell each of us who we were, what we should do, and why our lives were meaningful, today such answers are more often gleaned from individual accomplishments in our careers, our relationships, and in the way we choose to live.
Indeed, it is the dissemination of choice in everyday practices (an accomplishment often attributed to our developed culture of consumption) that has so profoundly undermined the stability and permanence proffered by a traditional worldview…
Perhaps the most pure form this ultimately takes is that of the “consumer lifestyle,” which is built on a consumption-based identity. This is described by Binkley as follows:
Consumption and lifestyle have emerged in recent decades as central to the way people imagine themselves to be agents of their own lives and authors of their own reflexive identities — a turn that has insinuated new forms of flexibility, fragmentation, and fluidity into the very fibers of self-identity. Such lives are no longer tests of character or expressions of devotion to long-term goals requiring the control of impulse and postponement of gratification: they are ongoing projects of the person's own doing expressed in myriad tastes, preferences, and consumer choices, mediated by the phantasmagoria of lifestyle imagery.
At first, this can be interpreted as an expression of individuality, a prime value of Liberal Democracy. This is true to a certain extent, as the individual is indeed given the choice of what to purchase between a plethora of goods. However, this is still a narrow view of individualism that is mediated by the market and the technicians who work therein.
It is common to view a consumption-based identity like “punk” or “hipster” (at least implicitly) as an expression of an unchanging “self,” but this is mere delusion, a fact that becomes abundantly clear whenever the winds of the zeitgeist shift and people are led to greener pastures of consumption. Likewise, “fandoms,” or communities built on the shared consumption of products and entertainment media, are contingent on ephemeral product and advertising cycles which will inevitably be traded for the “new thing” once they have run their course. The end result is a world “in flux,” a house built on sand that cannot help but collapse in on itself over and over again.
Perhaps more damaging than the above described examples are the pseudo-spiritualities which have taken the younger generations into their odious grasp; things like the gender spiritualists, “therians,” “twin flames” or “soulmate” cults, quasi-wiccan pagans, groups who have taken to worshiping fictional characters, etc. All of these work to effectively place the spiritual firmly in the realm of fashion and consumerism, tailored to work hand in hand with materialist society while falsely claiming to raise their adherents above it. Rather than directly addressing the material and unifying with it, these new spiritual belief systems place the spiritual on an entirely separate imagined plane that operates simultaneously with the mundane, but never truly interfaces with it in any meaningful way, the end result being that these frameworks serve only vanity and self-glorification.
Too often overlooked is the fact that the antiquity they seek to emulate was a very different world from our increasingly mechanized modernity, and this lack of reflection leads to modern values and assumptions being adopted unquestioned. As a result, they are integrated into the “spiritualities” and unconsciously reinforced, leading to an exacerbation of the problems which they sought to flee from in the first place; a reinforcement of atomizing consumeristic individualism and obscuration of the past. It is all too common for those who attempt to defy the system to ironically become its stalwart defenders, and it can be said that parody of the true and righteous is more dangerous than outright opposition, as will be explored in later chapters.
Through the atomization wrought by liberal economic forces and other modern conditions, modern man finds himself a creature ripped from the continuum of time; not a being from a long line of tradition and cultural evolution (and even if he is, it is only with begrudging acknowledgement due to the problematic nature of the past) nor one who will forge a world to pass on to future humans (as the continuation of life after death is not fully conceivable for the thoroughly desacralized).
Rather man must live “for himself” in the moment, his existence a fruitless quest for the discovery of an unspecified “self,” the expression of which will supposedly solve all of his existential conundrums. Atomized and alone, yet supposedly exuberant in the plastic playpen he so confidently calls “freedom,” and lulled into a sense of complacency by the knowledge that the future has been well and truly canceled. According to Lasch, this amounts to a profound “narcissism,” an importance placed on self-fulfillment and self-gratification that separates the modern man from the flow of history, placing him outside of other generations and viewing the past and future as nothing but seas of ashes. He writes:
To live for the moment is the prevailing passion - to live for yourself, not for your predecessors or posterity. We are fast losing the sense of historical continuity, the sense of belonging to a succession of generations originating in the past and stretching into the future. It is the waning of the sense of historical time - in particular, the erosion of any strong concern for posterity - that distinguishes the spiritual crisis of the seventies from earlier outbreaks of millenarian religion, to which it bears a superficial resemblance.
Through this “narcissistic” materialist lens that values consumption and the maximization of pleasure, the “Boomers” like the woman who moved onto the cruise ship have truly won. They get to enjoy their final days indulging in a kind of excess and luxury that most of us will likely never see. Lives lived outside of time and "to the fullest," one could say, decking themselves out with expensive leather jackets and riding off into the sunset on their shiny motorcycles like in their favorite Hollywood productions.
Should this system of generating satisfaction continue to function indefinitely, perhaps Francis Fukuyama will be vindicated in his assessment that American Liberal Democracy is a model for the societal organization that will take hold of the entire world at the “end of history,” but there is no promise that this will be the case. In fact, the seeming inability for the system to continually deliver on its promises may be one of the contradictions which ultimately undermines the entire thing.
Advertising, a powerful tool of human manipulation for sure, works by generating the alternate ideal reality which human desire makes into its object. However, this phantom ideal is never entirely deliverable, and the products themselves are rapidly becoming less affordable for the average citizen. Thus the core pillars of Consumerism appear more weathered and unstable with each passing year, and the promised fruits less desirable. Just how long can humans run on this hamster wheel before they become tired or the wheel itself comes apart? I highly doubt the answer is “forever.”
VI. The “War” Problem
Fukuyama characterizes Liberal Democracies as “fundamentally un-warlike” in “The End of History,” and claims that this feature “is evident in the extraordinarily peaceful relations they maintain among one another.” As such, the worldwide adoption of his supposed ultimate societal configuration could eventually end the practice of war altogether. General Ishiwara Kanji made a somewhat similar prediction, although in his case it was destructive power that would lock the globe into a stalemate.
In truth, both of these men are partially correct. Since the end of the Second World War, there have been no wars which have achieved the same scale and level of devastation, an outcome that can be largely attributed to the rise in stakes associated with the invention of the atomic bomb. However, it appears that America’s appetite for war has not been entirely satiated.
Subsequent to the publication of “The End of History and The Last Man” at the dawn of the 90s, America launched invasions of two Middle Eastern countries. Afghanistan was invaded in the wake of the 9/11 terrorist attacks and occupied by US forces until their disastrous withdrawal in 2021 (a fiasco that I believe will go down in history as one of America’s most embarrassing moments). US forces invaded Iraq over a year later under what appear to have been fraudulent claims that the Iraqi government had been stockpiling “weapons of mass destruction.” The actual reasons for this military intervention have never been sufficiently clarified.
However, since the end of WW2, America has far preferred covert methods of geopolitical manipulation, perfecting techniques for fomenting unrest among the populations of other countries and arming their rebels in order to enact regime change. The alphabet agencies have often been accused of spurring on the “color revolutions” which overtook Eastern European post-soviet countries after the turn of the millennium, but the extent of their involvement is disputed. Regardless, there are instances of meddling which have been generally accepted by historians. For example, the Ronald Reagan administration had been funding Nicaraguan anti-Communist rebels known as the “Contras” while Fukuyama was in the midst of writing his book.
This last detail should come as no surprise to American Leftists who see the intelligence agencies as “reactionary” entities looking to snuff out all Leftist efforts across the globe. Unfortunately for them, it appears that the United States is not too picky about whom it supports financially, so long as it suspects doing so may serve its own long-term interests, with its list of beneficiaries allegedly including Cuban dictator Fidel Castro (via the CIA) and, in just the past decade, self-proclaimed Communist revolutionaries in a Kurdish militia in Northeastern Syria. Rolling Stone writes in an article entitled “The Untold Story Of Syria’s Antifa Platoon”:
Though mostly ignored by the mainstream media, Rojava became a celebrated cause to the millennial hard left, the sort of black-clad protesters you might have found at Occupy Wall Street or setting limousines on fire at Trump’s inauguration. At the same time, the YPG became the U.S. military’s closest ally on the ground in Syria; no other faction showed as much willingness and ability to take on the Islamic State and win.
The reason I believe all of this to be a weak argument against Fukuyama’s thesis is that he claims Liberal Democracies engage peacefully with other Liberal Democracies. For Islamic regimes or other flavors of Authoritarianism, he states that war is still very much on the table, writing:
There is by now a substantial body of literature noting the fact that there have been few, if any, instances of one liberal democracy going to war with another. The political scientist Michael Doyle, for example, maintains that in the two hundred or so years that modern liberal democracies have existed, not one single such instance has occurred. Liberal democracies can, of course, fight states that are not liberal democracies, just as the United States fought in the two world wars, Korea, Vietnam, and most recently the Persian Gulf. The gusto with which they fight such wars may even exceed that of traditional monarchies or despotisms. But among each other, liberal democracies manifest little distrust or interest in mutual domination.
The crack in this narrative, however, comes into focus when we look at the predicaments involving Ukraine and Israel. Ukraine, ostensibly a Liberal Democratic country, experienced a revolution in 2014 which resulted in the ousting of the president at the time, Viktor Yanukovych. The fact that the US seems to have been involved in this matter should not be a shock at this point. In fact, in their assessment of these events, the Cato Institute wrote that “the extent of the Obama administration’s meddling in Ukraine’s politics was breathtaking.” They continue:
Russian intelligence intercepted and leaked to the international media a Nuland telephone call in which she and U.S. ambassador to Ukraine Geoffrey Pyatt discussed in detail their preferences for specific personnel in a post-Yanukovych government. The U.S‑favored candidates included Arseniy Yatsenyuk, the man who became prime minister once Yanukovych was ousted from power. During the telephone call, Nuland stated enthusiastically that “Yats is the guy” who would do the best job.
Nuland and Pyatt were engaged in such planning at a time when Yanukovych was still Ukraine’s lawful president. It was startling to have diplomatic representatives of a foreign country — and a country that routinely touts the need to respect democratic processes and the sovereignty of other nations — to be scheming about removing an elected government and replacing it with officials meriting U.S. approval.
This shakeup was one of the catalysts leading to the current war between Russia and Ukraine, with Vladimir Putin expressing concern in interviews regarding the involvement of the United States with Russia’s neighbors. The American government, no fan of the Russian state, has been all too happy to furnish Ukraine with a seemingly endless supply of weapons and other forms of aid, turning the situation into an apparent proxy war. While it is difficult to label Russia a true “Liberal Democratic state,” this deviation from war in Middle Eastern Islamic countries and encroachment into Europe has made the possibility of war among the “Liberal Democracies” look a little more “real,” so to speak.
Moreover, the constant stream of United States aid to Israel in its war with the militant Palestinian organization Hamas, a war initiated following attacks launched by Hamas from the Gaza Strip on October 7th, 2023, has deeply divided Americans. This has hit the American left wing particularly hard, splitting it for the foreseeable future into factions concerned either with anti-Semitism or with the plight of the Palestinians. More importantly, Israel has been characterized by countless American political figures as a Liberal Democratic state; thus its brutal actions in what is essentially an ethnically charged conflict have shaken American trust in the labels of “Liberal” and “Democratic.”
What happens next in the sphere of war is not entirely clear, but as the wheel of time continues turning onward, the ground underneath the “end of history” only appears less stable, crumbling apart bit by bit. Let us hope, in this case, that it does not collapse.
VII. The “China” Problem
China underwent a Communist revolution under Mao Zedong’s People's Liberation Army, culminating in the founding of the People’s Republic in 1949, but by 1990 had become “just another Asian authoritarian state” according to Francis Fukuyama, who writes in “The End of History and the Last Man”:
It lacks internal legitimacy for a broad sector of its own elite, particularly among the young who will someday inherit the country, and is not guided by a coherent ideology.
Yet it appears that an ideology does not have to be particularly “coherent” for an authoritarian state to operate under it, as China has neither liberalized nor democratized since the publication of Fukuyama’s book. If anything, the opposite has occurred, with the Chinese Communist Party increasing its surveillance efforts, doubling down on censorship (the walling-off of the Chinese internet and the relegation of Chinese citizens to their country’s proprietary social media sites, collectively termed “The Great Firewall of China” in the West, being a well-known example), and strictly controlling travel into and out of the country.
In fact, the primary area where China seems to be “liberal” is in its manufacturing, which is infamous for lax quality control (often prioritizing quantity over quality) and hideous working conditions. This liberalization, however, has led to the country accounting for an estimated 29% of all global manufacturing as of 2023, dwarfing the output of the United States. This has bestowed upon the (ironically) Communist nation a great deal of leverage to wield in the international political arena. Global culture, too, has not remained untouched by Chinese influence, as many of our consumer trends these days center around goods produced in Chinese factories.
Indeed, China’s rise to superpower status has shown us that a country which rejects many of the fundamental values of Liberal Democracy can still succeed at this stage of history, throwing a wrench into Fukuyama’s theory. But alas, I cannot claim total credit for this revelation, as it was a possibility predicted by Fukuyama in his own work:
For if a country's goal is economic growth above all other considerations, the truly winning combination would appear to be neither liberal democracy nor socialism of either a Leninist or democratic variety, but the combination of liberal economics and authoritarian politics that some observers have labeled the “bureaucratic authoritarian state,” or what we might term a “market-oriented authoritarianism.”
Not only does it seem to be the case that this “bureaucratic authoritarian” style of government can excel in our economically competitive age, it also appears that countries like America and its European brothers are converging on this configuration, as evidenced by the events of 2020 and the ever-increasing number of public surveillance initiatives and restrictions imposed on speech. It brings me no pleasure to point out that the techniques honed through decades of foreign manipulation are being deployed on the citizens of Liberal Democracies at this very moment.
We will also see more evidence of this convergence as the dominance of “Big Tech” continues to expand. These companies already have strangleholds over many industries via the power of “cloud capital”: control of massive online marketplaces, services like search engines, communication platforms, and the implementation of algorithms anywhere and everywhere. They are even monopolistic in nature, a fact recognized by the courts when the tech giant Google was found to hold an illegal search monopoly in 2024 and an advertising monopoly less than a year later in 2025.
The “Big Tech” mega-corps are currently a fixture in the lives of many, influencing what they buy, what they eat, what they watch or listen to, and how they work, but as their power balloons to new heights, their quasi-governmental qualities will only become more pronounced, and their integration into the government proper more complete.
Yet the question still remains: If the situation is as I have described and we are indeed en route to an even more authoritarian market-oriented form of governance, what exactly are the reasons for this collective divergence from the values of Liberal Democracy? Is it pure greed? Communist subterfuge? A devious plot on the part of a Globalist Illuminati? Have we all simply gone mad?
To be continued…


