Are Algorithms Shaping Culture?
Think algorithms are changing and shaping us? It’s probably the opposite, and that’s exciting.

Everywhere in our digital world, from workplace apps to streaming services and social media platforms, algorithms dash hither and thither through the aether. Many a pundit suggests that the algorithms are changing us, that our agency and freedoms are being decided by algorithms. Yet, perhaps, they aren’t. Quite the opposite.
Maybe we’re the ones reshaping and changing the algorithms. At least in Western, more democratic countries, and perhaps even in more autocratic nations, where governments may be operating under the illusion of algorithmic control of their societies.
I argue here, in condensed form, that in the face of human culture, algorithms aren’t changing us as much as we might think, and that through culture itself, the algorithms will likely serve us more than they serve machines. Which is actually quite wonderful.
A good framework for this is the lens of German sociologist Hartmut Rosa’s theory of social acceleration and resonance, with a sprinkle of ideas from anthropologist David Graeber and from Clotaire Rapaille’s cultural archetypes and code-breaking.
For many, algorithms and their role in society today are viewed through the lens of the culture in which one grew up. We all have our cultural biases.
Rosa argues that modern society, in this digital age, experiences three forms of acceleration: technological acceleration, the acceleration of social change and the acceleration of the pace of life. Algorithms play a role in all three at once. This creates a sort of frenetic standstill: constant change without meaningful progress.
Paradoxically, recommendation algorithms across platforms like TikTok and Instagram create endless content consumption for the dopamine receptors in our brains. Yet as humans, we feel we’ve gained nothing from them and often feel we’ve lost time.
Japan represents the extreme of this acceleration through the concept of “karoshi”, or death by overwork, amplified by workers being always connected through their devices.
Yet in the Nordic countries we see practices such as lagom or hygge, which serve as a deliberate counterbalance to algorithmic acceleration. Bhutan measures success through Gross National Happiness rather than GDP, as Western societies do.
All of these mindsets influence algorithms in various ways and end up forcing them to adapt to cultural behaviours, customs, norms and practices. A sort of cultural feedback loop emerges as we and the algorithms train one another, but it’s important to note that algorithms don’t understand context; they are good only in very narrow ways.
It’s also a co-evolutionary process in some ways. The algorithms of social media platforms reward certain content creation styles, which can shift some forms of cultural production. But humans evolve and change, and so can and do push back, often unconsciously, against the prevailing algorithms.
Humans have also evolved the practice of “algospeak”, modifying language to avoid algorithmic detection. This represents a sort of power dynamic, and so far the humans are winning: reconfiguring the algorithms to keep up is an expensive game of whack-a-mole for computer scientists.
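To make that whack-a-mole concrete, here is a minimal sketch in Python. The flagged terms and the “algospeak” substitution are invented for illustration, not drawn from any real platform’s moderation list; the point is simply that a naive keyword filter catches the original phrasing, misses the coined variant, and only catches it again after someone patches the list by hand.

```python
# A toy keyword filter of the kind a platform might use to flag content.
# The terms and the algospeak substitution below are hypothetical examples.

FLAGGED_TERMS = {"dead", "kill"}  # invented moderation list

def is_flagged(post: str) -> bool:
    """Return True if any flagged term appears as a whole word in the post."""
    words = post.lower().split()
    return any(term in words for term in FLAGGED_TERMS)

print(is_flagged("she is dead"))      # True: exact match caught
print(is_flagged("she is unalive"))   # False: the algospeak variant slips past

# The whack-a-mole step: engineers patch the list after the fact...
FLAGGED_TERMS.add("unalive")
print(is_flagged("she is unalive"))   # True, until users coin the next variant
```

The same dynamic plays out at far greater scale with machine-learned classifiers, but the pattern holds: the filter is always reacting to language that culture has already moved past.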
Recommendation algorithms function differently across cultures as well. They must work differently in individualistic cultures like the USA or Canada than in collectivist cultures like Japan and China. There simply cannot be one universal algorithm that serves the entirety of humanity; to think there can be is the folly of technological determinists, who see the world very narrowly.
There are temporal factors as well. Western societies are very much “now” societies, whereas many Asian cultures take a longer-term view. These are deeply embedded cultural values, fairly immune to algorithms that don’t accommodate them.
Algorithms have a more indirect influence on societies and cultures than we might think. What unfolds instead is a complex dance in which cultural forces push back, adapt and ultimately transform the algorithms themselves, through what could be called cultural imprinting on our digital world and lives.
We speak of technologies taking over societies, but they don’t. While algorithms today may seem to have immense influence over humanity, we’ve yet to give them that agency. Ultimately, algorithms, like any technology, must pass through the filter of human cultural practices, which adapt, reject or transform them based on cultural values, norms and customs.
Like the “invisible hand” of economics, there is an invisible hand of culture. As cultures around the world become more digitally literate, we are getting better at shaping the algorithms rather than just passively consuming them. This will create a richer environment of possibilities.
None of this guarantees any sort of utopia; nothing can, and utopia is a silly concept anyway. But it does show that humans, through culture, are still the ultimate arbiters of any technology. Our future is much brighter, much more vibrant, than the techno-determinists might think, try as they might to argue otherwise.
It is much more likely we will evolve, as we have with other technologies, novel approaches that neither humans nor algorithms could produce alone. This offers interesting new possibilities for innovation as well.