2024 Was the Tipping Point When We Surrendered Our Culture to Technology
Will 2025 Be the Beginning of AI Cancel Culture?
What was the most significant thing about 2024? It’s obvious and quite reasonable to point to the election of Trump, but what stands out most is less that particular political development than a deeper cultural trend that has been brewing for a long time and has now finally become perhaps the most decisive factor shaping our lives.
In my final column before the election, I pondered ten questions that the election would answer and tossed in a final “bonus question” because the possibility of it having become a hegemonic reality haunted me as I watched the first scenes of the horror movie that is Trump II.
The question was this:
Will we find that we have so thoroughly leapt into the post-factual universe that none of the conventional realities and political norms that informed the questions above even matter anymore? A brave new world awaits us whatever the results. Do we even have a clue as to how to navigate it? At present, there is very little evidence that we do.
Amid the flurry of post-election analysis (mine included), many on the Democratic side of the aisle are considering what tactical or ideological mistakes the Democrats may have made and how progressives might reinvent themselves to blunt the momentum of the MAGA right. Of course, all this chatter rests on the assumption that there is another presentation of facts or values that might be more persuasive to a rational reality-based American electorate. Perhaps, though, the problem is not that people don’t have the right information, but that they have too much of it—so much so that there is no longer a single shared reality to which they can collectively refer.
In the information age, it is sacrilege to suggest that we need less information; anyone who does so risks being deemed not merely a Luddite but, even more damningly, completely irrelevant. As Neil Postman noted in Technopoly: The Surrender of Culture to Technology, his classic interrogation of our relationship to technology:
This is the elevation of information to a metaphysical status: information as both the means and end of human creativity. In Technopoly, we are driven to fill our lives with the quest to “access” information. For what purpose or with what limitations, it is not for us to ask; and we are not accustomed to asking, since the problem is unprecedented. The world has never before been confronted with information glut and has hardly had time to reflect on its consequences.
Writing in 1992, Postman clearly saw the trajectory of the ongoing devolution of our culture as we rushed headlong into the future seeking to address the great problems of our age with the very thing that was fueling them. Our dilemma was and is not that we do not have access to the proper information, but that we have lost any sense of how to filter the onslaught of it in order to provide ourselves with a comprehensible cognitive map of the world.
The result of this phenomenon is a growing collective bewilderment in the face of the loss of context, meaning, and common purpose combined with a growing addiction to every new technological device the evangelists have to offer us.
Postman observes that as the world became more and more incomprehensible, we simply sought more ways to enhance our delirium: “The fact is, there are very few political, social, and especially personal problems that arise because of insufficient information. Nonetheless, as incomprehensible problems mount, as the concept of progress fades, as meaning itself becomes suspect, the Technopolist stands firm in believing that what the world needs is yet more information.”
This left us in a world that Postman labeled “Technopoly,” where we have lost the ability to make our way through the chaotic landscape we occupy because we lack any defense against useless or false information:
Indeed, one way of defining a Technopoly is to say that its information immune system is inoperable. Technopoly is a form of cultural AIDS, which I here use as an acronym for Anti-Information Deficiency Syndrome . . . More important, it is why in a Technopoly there can be no transcendent sense of purpose or meaning, no cultural coherence. Information is dangerous when it has no place to go, when there is no theory to which it applies, no pattern in which it fits, when there is no higher purpose that it serves.
Three decades later, after the saturation of every inch of our social environment with some form of information technology, to the point where it has come to define not just our social, cultural, economic, and political spaces but our very identities, we have become utterly defenseless against the firehose of information. Hence, it was only a matter of time until our sense of reality itself became so manipulable that we were easy prey for a strongman whose disdain for inconvenient facts now defines our era.
It doesn’t matter if anything Trump says is true or if it runs against the grand, antique master narratives of history, science, law, or any other appeal to tradition or morality. In the age of the totalitarian tech bro, the truth is what they say it is, and if enough of us believe it, it becomes a self-fulfilling prophecy. We have no precedent to guide us through this, and no one knows whether it will end as tragedy or farce.
What we do know, however, is that the old tools no longer work.
AI Cancel Culture?
If MAGA world was sick and tired of all those snotty professors and other coastal elites scolding them and acting like they were better than the regular folk, how will they respond when AI is the smartest kid on the block and eventually disrupts, deskills, or eliminates their jobs?
Recently, the “Godfather of AI” has been warning that it may not be too long until our technology is more intelligent than us and that, to put it mildly, might not be such a good thing.
As the Guardian reports:
The British-Canadian computer scientist often touted as a “godfather” of artificial intelligence has shortened the odds of AI wiping out humanity over the next three decades, warning the pace of change in the technology is “much faster” than expected.
Prof Geoffrey Hinton, who this year was awarded the Nobel prize in physics for his work in AI, said there was a “10% to 20%” chance that AI would lead to human extinction within the next three decades.
Previously Hinton had said there was a 10% chance of the technology triggering a catastrophic outcome for humanity.
Asked on BBC Radio 4’s Today programme if he had changed his analysis of a potential AI apocalypse and the one in 10 chance of it happening, he said: “Not really, 10% to 20%.”
Looking on the bright side, there is an 80% to 90% chance of this not happening. But perhaps, even if we persevere, it’s high time that we start paying attention and stop passively accepting the argument that new technologies are good just because they are new. It might also be useful to ask not merely HOW to let our tools transform our lives but IF we should let them do so, and how we should respond to unwanted, destructive changes in our lives and the culture. That, of course, assumes that we still have a choice left.
Perhaps we should insist that we do.