Why And How Technocracy Will Fail
The tech billionaires want to replace democracy with algorithms. They’re about to discover why that’s impossible.

Our world, say the tech billionaires, will be a much better place when we recognise that algorithms know better than voters, that data beats debate, and that their engineering minds will run society far better. No more gridlock, no more uninformed opinions: just clean technocratic rule by the engineers.
We’ve been here before, nearly a century ago in the 1930s. That technocracy movement collapsed spectacularly within a year. Today’s tech billionaires hold the same ideas sacred and claim the technology is now ready, but they will still fail. In fact, they already are.
It may seem that their ideals and visions are inevitable in these changing times, for we are entering a period of massive societal change. A period of revolution that will be as profound as the 1960s in the United States. And with some tech billionaires funding populist movements, it might seem they have the upper hand in gaining control. Driving us towards a society ruled by algorithms. Except that’s not happening.
Just as in the 1930s, culture’s immune system is kicking into high gear. We are already starting to see systemic institutional resistance against the tech giants and the technocratic movement. And while they may be very wealthy, they can’t win against the invisible hand of culture.
People, organising through civil action groups and through the legal system, are taking action in response to what many see as technological overreach. In part, I think we have reached what systems thinkers would call “carrying capacity”: the maximum sustainable influence of a technology that a population can support.
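In systems thinking, carrying capacity is usually described with the logistic model, where growth slows as it approaches a sustainable ceiling K. A minimal sketch follows; the function name and all parameter values are illustrative assumptions, not measurements of real adoption:

```python
# Logistic growth: dN/dt = r * N * (1 - N / K)
# N = level of technology adoption, r = growth rate,
# K = carrying capacity (maximum sustainable adoption).
# All values here are illustrative assumptions.

def simulate_adoption(n0=1.0, r=0.5, K=100.0, steps=40, dt=1.0):
    """Euler-integrate the logistic equation; return the adoption trajectory."""
    trajectory = [n0]
    n = n0
    for _ in range(steps):
        n += r * n * (1 - n / K) * dt
        trajectory.append(n)
    return trajectory

levels = simulate_adoption()
# Early growth is near-exponential; it flattens as N nears K.
```

The point of the model is the plateau: past a certain level, every extra push of a technology yields less uptake and more friction, which is one way to read the backlash described above.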
The hype and aggressiveness with which the tech giants have pushed Artificial Intelligence tools like Generative AI can be seen as an example of this technological overreach. All of them have pushed AI tools into all their platforms, wanted or not. Culture does not react well when a technology is forced upon it.
When foreign cultural elements (in this case AI, algorithmic manipulation and technocratic governance) are seen by a culture as threatening to its core cultural values (democracy, human agency and welfare), societies have, throughout history, mobilised multiple defense mechanisms simultaneously.
We can see this in the legal assault on social media platforms. As of the end of 2024, 815 lawsuits had been filed in the USA, with high numbers in Canada, the UK and the EU. Judges have ruled that Meta, TikTok and others must face lawsuits over harms ranging from teen mental health to suicides.
The U.S. is moving to break up Google. Governments are hauling in CEOs to explain themselves, their perceived monopolies and their disregard for civil society as a whole. In anthropological terms, we could call this a form of counter-hegemonic resistance.
The technocrats are also experiencing multiple points of failure in the systems they’ve tried to create. Individuals are using their devices less, through things like digital detoxes. Schools are banning smartphones in the classroom, litigation is ramping up, and society is having more discussions about where and how it sees technology’s role. And in economic terms, we’ve reached market saturation.
History shows us that when complex systems face multiple pressures across different layers of society at the same time, they either collapse or go through rapid, fundamental changes.
The noise from the current White House administration may make it seem the tech oligarchs are winning, but that’s far from reality. It’s white noise. 42 state attorneys general signed a letter asking the surgeon general to require warnings on all algorithm-driven social media.
The EU and Canada are developing stricter rules and guidelines around privacy and data protection, covering not just AI but citizens’ data more broadly. The tech giants cry “government overreach” and warn that innovation will be stifled. Neither is true; good regulation has been shown to drive better innovation. The art is in finding that balance.
I think we have hit an inflection point. The systems the technocrats want to control are pushing back; declining social media use and the unfolding litigation further support this. The technocrats fundamentally misunderstood that their power was permission-based, and they are losing that sociocultural permission.
So what might happen? How might society and culture react?
We’re already seeing some shifts: “phone-free” restaurants, the rise of analog games and analog gaming cafes, more face-to-face activities, and people heading out into nature — without their phones.
As our digital interactions become more obviously manipulated by algorithms and we see more AI slop and struggle with what is and isn’t real, authentic human presence and experiences become more valuable.
We are seeing teens use social media less. We may even see a kind of “digital sectarianism” emerge: communities that collectively reject certain technologies. There will be greater discernment about which technologies serve being human and which don’t.
The growth of decentralised tools like Mastodon, Discord, BlueSky, Signal and Telegram shows that people want more control. The major social media platforms have abused people’s trust and become what Cory Doctorow calls “enshittified”. Once the alternatives’ network effects reach critical mass, the major platforms will be in financial trouble.
The business model of harvesting human attention will also become unsustainable. Being conspicuously offline will become a new status symbol.
Libraries, community centres, and civic organisations will gain new relevance as places of unmediated human connection. As sociologist Durkheim might say, we will create some new form of “mechanical solidarity”, with social bonds created based on shared physical realities.
We won’t be dropping these technologies, and we may welcome new ones. But we will favour the ones that amplify human capability without controlling it. Let’s call them Convivial Tools: ones that enhance human agency rather than replace it.
Just as “organic” became a valuable label in the food industry, “human-made” and “algorithm-free” will become premium categories in education, healthcare, creative work and customer service.
Culture is always the ultimate arbiter of technologies. Part of the reason for the pushback is that society feels it has been imposed upon too much and needs time to figure out what to do with so much technology everywhere all at once. The technocrats’ fundamental error was believing optimisation is the highest human value. The cultural correction will restore balance, agency, and the irreducible complexity of human flourishing.