The Coders Bias and Sociocultural Impacts
Why believing “everything is code” blinds us to what makes us human, and limits the future of AI and technology.

Franco sits in the dim light of the evening in his condo, three screens glowing. To him, the world seems like a series of puzzles just waiting to be solved. Love is an algorithm, politics is data; even grief could be mapped if the right code were written. It’s not arrogance or ego. It is the result of years of writing code, of bending and twisting complex systems into logic, of having seen the wonders of what can be done with code.
Today, the coders tell us a world of abundance awaits: that we will have plenty, that a techno-utopia is coming, that the algorithms will solve for all. I have come to call this Coders Bias, a cousin of confirmation bias. We all have biases. They are lenses that shape how we see the world and interpret reality, created through myth-making and storytelling.
Coders Bias is like wearing glasses that only show you numbers. One can see the numbers, the patterns, the structures, but there is no colour, emotion or story. The deeper one becomes entrenched in this bias, the more one sees only a flat outline of the world, of life, mistaking the outline for the whole picture of being human.
We can even see some evidence of this in the world of physics and astrophysics. There’s the famous clip of Dr. James Gates being interviewed by Dr. Neil deGrasse Tyson, in which Gates reveals that there is code in the substrate of the universe. Since he’s much brighter than I am, perhaps there is. But that doesn’t mean it’s a code we can truly understand, and it doesn’t mean we can reduce all life to the function of code, or to the Great Coder in the Sky.
We humans, we’re a noisy, messy, rowdy, passionate and curious lot. We are each our own minds, shaped by the magical tapestry of cultures that makes us so fascinating and interesting. We like to think we can impose order and it can be comforting to think all can be reduced to algorithms. But that is culture on a diet. Human culture is never on a diet. It is hungry with curiosity and passion.
The word culture is itself one of the most complex words across many languages. Some consider it to be just music, art, literature, fashion and architecture, but those are only the aesthetic elements of culture. Culture also encompasses how we govern our societies: politics, militarism, economics, religion.
Anthropologist Clifford Geertz described culture as “webs of significance” that we ourselves have spun. Coders have come to see these webs as literal code. Matrix, anyone? In their daily work, everything can be turned into code. And it’s pretty heady stuff when you create code that turns into something that saves lives, that improves the world. Coders have worked some pretty impressive magic. They are smart people. Over time, coding becomes a cultural frame of reference.
Our world in many ways has become one of rationalisation. Yet in others, it is as messy as ever, if not a little messier right now. We see it in business and in many governments: this tendency to systematise and quantify everything.
Coders carry this tendency to rationalise forward. And with GenAI like ChatGPT or Claude, this becomes even more seductive for them. Not all of them, of course, but enough. In anthropology, we see myth, ritual, emotion, the messiness of being human. Coders see data structures, neural nets and pattern recognition. This isn’t bad or wrong. But it becomes dangerous when the bias is the only lens through which one sees the world. And we all have our biases. They are difficult to set aside, especially when we are confronted with something that threatens our reality, our world view.
It becomes especially dangerous when we choose to ignore our biases and believe we can bend the world to our view, and when we engage in cognitive dissonance. Many coders live in a world of hackathons, GitHub repositories as their library of reference for the world, the hallowed halls of technology conferences, Discords and subreddits.
This is why we see the hype energy of Silicon Valley, the rise of effective accelerationism (e/acc) and transhumanism, and the ideal of techno-utopianism. They believe that every human problem is solvable. AI must therefore be capable of becoming superintelligence, or Artificial General Intelligence (AGI), where machines will surpass humans. But we have no real idea of what intelligence is, and far less of what consciousness is.
This mindset and bias make it incredibly hard for technologists to see where AI fails: context, nuance, irony, cultural practices. They miss the human layer. Culture and meaning can’t always be expressed in logic trees. Sociologist Sherry Turkle has warned of this: of our reliance on machines to simulate conversation and companionship without the depth of actual human interaction.
Coders Bias is exemplified in Mark Zuckerberg and his belief that humans would naturally want an AI companion for everything. He is not alone. There’s a reason Microsoft has labelled its GenAI tools (built on OpenAI’s models) Copilot.
What I find most unfortunate about Coders Bias, and about how strongly it is believed and promoted, is that we lose the opportunity to imagine and create better technologies, especially with AI. In the field of AI (LLMs and machine learning specifically), Coders Bias shuts down the ability to create interdisciplinary collaborations. The social sciences are seen as a hindrance, the humanities as the bane of illogical and boring thought. But there are also many in the humanities who are locked in their own biases, seeing AI as a threat and unwilling, or too frightened, to try to be a part of evolving AI tools. This too is unfortunate.
The thing is, as I’ve written before, you can’t reduce culture to algorithms. It is ephemeral, messy, always evolving in unpredictable ways (much to the chagrin of those in the field of predictive analytics). Culture is the reason we build systems at all.
There are times I’ve worked closely with coders, and I do enjoy it because we always both learn something. When we all drop our biases, when we see them for what they are, we become more open to realising opportunities.
The irony is that, according to information theory, most poetry has higher information content than text from almost any other source, because information is a measure of uncertainty. This is lost on most people, perhaps math types especially. On the other hand, keep in mind that modern algorithms are designed to capture and keep your attention. In The Psychopath Test, Jon Ronson argues that most purveyors of modern psychology, psychiatry and media (traditional media, social media, advertising, books, film and so on) know that to get humans’ attention you need just the right kind and level of what he calls “madness.” The fundamental goal of most LLMs is to get your attention: essentially marketing for the parent company. They do a pretty good job of that, not that other benefits don’t accrue.
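The claim that information is a measure of uncertainty comes from Shannon’s information theory: the less predictable each symbol in a message is, the more bits of information it carries. A repetitive string is fully predictable and carries almost nothing; varied, surprising text (like poetry) carries more. A minimal sketch in Python, using per-character Shannon entropy (the sample strings are purely illustrative):

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Per-character Shannon entropy in bits: H = sum(p * log2(1/p))."""
    counts = Counter(text)
    total = len(text)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

# A fully repetitive string is perfectly predictable: zero bits per character.
print(shannon_entropy("aaaaaaaa"))
# Varied text is less predictable, so each character carries more information.
print(shannon_entropy("the rain in spain"))
```

This only measures character-level surprise, not meaning, which is rather the point: the qualities that make a poem matter to a human are exactly what the arithmetic leaves out.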