Will Machines Reshape Human Cultures?
We are already co-creating culture with machines. But perhaps less so than we might realise. Will this change? Will we allow it?
Since the dawn of modern humans, perhaps even with some of our relatives like Neanderthals and Denisovans, we have used culture to survive as a species. All of it generated in those dizzying, ceaseless neurons flitting about inside our noggins. Culture was exclusively our domain. Only we created it. Until now.
The very word culture is complex, perhaps one of the most complicated and strange words in any human language. It has many meanings. Generally it includes our norms, behaviours, customs, rituals and traditions. It also includes political and economic systems, militaries and aesthetics (art, literature, music and so on). But only we could create culture.
With the arrival of Generative AI (GAI), think tools like ChatGPT, Claude, Gemini, Midjourney, Sora and more, machines have become co-creators of culture. Mostly in the area of aesthetics like literature, videos, music and images. And they have already influenced culture.
To what degree, then, are the machines co-creating culture, and will we have to share this co-creation with them? What does this mean for human culture in the future and the shaping of our societies? AI tools like GAI are now active participants in human societies. They are creating cultural artefacts.
Are Algorithms Creating Culture?
Perhaps the first question we must ask is the one of the creation of culture. Almost everything created through GAI tools is based on a human prompt, or set of instructions, whether we type them or speak them. GAI is based on prediction: which word should come next. And often enough, the machines hallucinate and make things up.
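The "which word should come next" idea can be illustrated with a deliberately tiny sketch. This is a toy word-count model, not how any production LLM actually works (real models use neural networks over tokens), but the core task, predicting the next item from what came before, is the same:

```python
from collections import Counter, defaultdict

# Toy next-word prediction: count which word follows which in a tiny
# corpus, then predict the most frequent successor. Real LLMs learn
# these patterns with neural networks, but the task is the same.

corpus = "we create culture and we create stories and we share stories".split()

successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None."""
    counts = successors.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("we"))     # "create" follows "we" most often here
print(predict_next("share"))  # "stories"
```

Notice the model has no idea what "culture" means; it only knows which words tend to follow it, which is the essay's point in miniature.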
But humans, too, have always made things up. Intentionally, such as lies, poetry, stories or misinformation, or accidentally. Culture, then, is all about information and transmitting that information to others. From this information we create stories, which become knowledge, which turns into actions, such as cultural norms and customs, and how we share a reality. Very fundamental stuff.
None of these GAI tools can do anything without our input at the start. They’re not all huddled in the cool darkness of a data centre brainstorming ideas in a design thinking session.
What we humans create with GAI tools like Claude or Midjourney becomes artefacts through the actions of the algorithms. When people create images, videos and memes, then transmit them into society via digital channels, they become cultural artefacts, legitimate in that they have been shared and accepted, either happily or angrily. But they are there.
But are the machines actually co-creating culture? Perhaps in a way. But only through our actions, which means the original idea or desire was human.
We’ve Always Created Culture Through Technology
When Grook smashed her first stone on another and created a sharp edge to cut mastodon meat, that technology, a crude axe or knife, changed human culture. She became the first chef and she shared her idea with the clan. One of that clan then shaped a better stone tool and now they could cut lovely fine cuts of a sirloin tip steak. Well, sort of.
We invented tools to create cave art and even used tools and fire to create moving images on cave walls. Many thousands of years later, we created the printing press, and the Catholic church decreed it should only be used to create religious texts. That worked out well.
Technology is a part of what it means to be human. We could not be where we are and who we are today without technology. We have always, through technologies, sought to connect our cultures and societies, sociocultural systems if you like.
So this raises the question: just how much do the algorithms really influence and participate in cultural creation? Maybe GAI will turn out to be like a printing press, paintbrush, or pen and paper.
Machines Lack Cultural Context
Some rather cool videos for adverts and short films have been created with GAI tools. Some scary ones too. But there have always been nefarious types creating nasty things. Whether good or bad, the outputs of GAI tools are based on human cultural context. The machines, the algorithms, do not know what they are creating. They don't understand the context. Machines do not understand culture.
Algorithms are always learning. But they aren't learning culture. And the majority of today's algorithms are designed through the lens of capitalism: to extract monetary value from consumers. This is a shortcoming because it limits GAI tools to that singular outcome.
This is not the nature of culture. Monetary value is one lens only. Much of art, architecture, fashion, music and literature is cultural commentary and critique. A machine cannot critique and comment if it does not have sociocultural context.
Through para-social relationships with algorithms and machines, through participation with these machines, we are training the algorithms. Then they evolve and the cycle continues in a feedback loop. But are the machines actually gaining cultural context?
I would argue no, they are not. To gain cultural context, to understand culture (again, remember this word is very nuanced and complex), one must also have consciousness. Intelligence is fairly common across the animal kingdom, from humans to crows. Cultural awareness and the ability to imagine, then create and communicate consciously, belongs to just one animal: Homo sapiens.
Co-Creating With Machines
Culture may well be the most powerful technology humans have ever developed. It is what has kept us alive and thriving as a species for many thousands of years. It is also mutable, ever changing. Could machines possibly understand the vast nuances, complexities and shifting nature of culture?
If we look at the energy required just to run GAI and other AI tools today, I'd argue this might become possible, but not with today's energy availability. Nor do any AI tools have consciousness, something we humans don't even fully understand ourselves.
And if machines did become conscious, did invent culture, it would have borrowed some elements of being human, but by its very nature, it would be a machine-generated culture. Machines cannot think like humans. That, too, could be dangerous to us, the existential threat many see it to be.
We may, someday, create such machines. But GAI tools, such as Large Language Models (LLMs), are simply ignorant participants in culture today. Machines do not understand the why of what they do. There is no awareness.
These GAI tools have been developed by a portion of society that holds a belief that everything humans do can be turned into algorithms. As I've long said, that is culture on a diet.
Yet we are co-creating with machines. We did that with the printing press. Algorithms, too, have been present in our societies for centuries; the word itself derives from the name of the 9th-century Persian polymath Muhammad ibn Musa al-Khwarizmi, known as the father of algebra, who worked in what is today Iraq.
To varying degrees, Artificial Intelligence, through tools like GAI, is already participating in our cultures. But we are still guiding it. Feedback loops through their learning are evolving how, when and where these tools participate in culture. But machines are participants.
So the question remains: what kind of participant are machines? Since they cannot think or reason like us, and have no consciousness, we have to ask different questions about how we want them to be co-creators of human culture. That is much harder to answer and will take time.