Technology Doesn’t Solve Problems
We tend to think technologies solve problems. They don’t. Humans do. A change of perspective means more opportunities.

When IBM launched its Selectric typewriter in 1961, it saw massive hype in the news media. It was heralded as a giant leap in solving the struggles of human productivity. Much the same was said about PCs in the early 1980s. Today, we see much the same hype and claims around AI.
Throughout history, we have attached mystical meanings to technologies, as if they themselves solve human problems. Except they don’t. They never have. A hammer does not build a house. A pen does not write a novel.
When we shift how we perceive and understand technologies, it opens up an amazing way of finding new opportunities and remarkable possibilities. Here’s why.
Technologies give humans extended capabilities. They are tools, born of the human imagination, that help us solve problems. We inherently know that our hands alone can’t fix a problem, that we need something to help us. This is why we are a tool-making species.
What we also find is that we frequently need to combine tools and evolve them in new ways, because using a tool to solve one problem tends to create, or reveal, new problems and challenges as a consequence of that advance.
Language begat the need for writing. But writing itself then presented the challenge of recording and preserving human memory. The real solutions to these problems came from creating and combining new technologies: books, libraries, education systems and, today, ways to share knowledge such as the internet.
Large Language Models (an AI tool) don’t solve the problem of understanding human language; they just provide humans with new tools to process and work with language. The real solutions come from how we apply these tools in specific ways and contexts, such as healthcare and education.
Technologies often appear to solve problems, but what they most influence is the reorganisation of human social relationships and power dynamics. Information technologies create new hierarchies of visibility and power, reshaping how we interact with one another and often adding sociocultural complexity. Just look at how complex bureaucracies in both business and government have become.
We’ve seen a flood of project management software tools enter the market in recent years. They are a response to the growing complexity of information systems and networks. But these tools, like Asana, Jira, Trello and Notion, are just new implementations of bureaucratic instruments like forms, schedules and approval processes. They don’t actually make teams work better together; they just standardise and regulate how teams collaborate.
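To make that concrete, here’s a minimal, purely hypothetical sketch (not any vendor’s actual data model) of what such a tool encodes underneath: a task is a form, and a workflow is a list of permitted state transitions plus the sign-offs they require.

```python
# Hypothetical sketch: a task tracker reduced to its bureaucratic essentials.
# The names and fields are illustrative, not any real product's schema.
from dataclasses import dataclass, field

# A "workflow" is the old approval process written down as data:
# which states exist, which moves are allowed, and who must sign off.
WORKFLOW = {
    "todo":        {"next": ["in_progress"],         "approver": None},
    "in_progress": {"next": ["in_review"],           "approver": None},
    "in_review":   {"next": ["done", "in_progress"], "approver": "team_lead"},
    "done":        {"next": [],                      "approver": None},
}

@dataclass
class Task:
    title: str
    assignee: str
    state: str = "todo"
    history: list = field(default_factory=list)  # the audit trail, i.e. the paperwork

def move(task: Task, new_state: str, approved_by: str | None = None) -> None:
    """Advance a task only along the permitted, pre-approved path."""
    rule = WORKFLOW[task.state]
    if new_state not in rule["next"]:
        raise ValueError(f"{task.state} -> {new_state} is not an allowed transition")
    if rule["approver"] and approved_by != rule["approver"]:
        raise PermissionError(f"transition requires sign-off from {rule['approver']}")
    task.history.append((task.state, new_state, approved_by))
    task.state = new_state

task = Task("Write launch post", assignee="sam")
move(task, "in_progress")
move(task, "in_review")
move(task, "done", approved_by="team_lead")
print(task.state, task.history)
```

Nothing in that sketch helps two people think together. It records and polices how work is allowed to move, which is exactly what forms and approval chains have always done.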
Technologies are physical manifestations of our inherent problem-solving abilities. They can’t solve problems independently; they are extensions of us as we solve problems. When we understand this, our whole approach to, and understanding of, technology changes. We can better see tools for what they really are, and just how amazing our brains are.
We have long tended to anthropomorphise technologies, often because new technologies stir cultural anxieties and hopes. During the Industrial Revolution, we described steam engines as having moods and temperaments. Factory workers would say machines were “angry” or “uncooperative”. Today we describe some AI tools like LLMs as “thinking” or “learning”, yet they do no such thing.
We’ve anthropomorphised computers so much that we have turned the tables and now speak of humans in terms like “that doesn’t compute” or “she’s like a computer”. An interesting plot twist. Early telephone switchboard operators were often seen as mystical, magical humans, deeply interfaced with the machine. Alexander Graham Bell referred to telephone networks as a “nervous system” for America.
Back in the 1950s and ’60s, books, media articles and companies talked about mainframe computers as if they were electronic brains, all-knowing oracles, much the same way ancient Egyptians consulted divine statues as oracles. See the patterns here?
Just as a steam engine never “wanted” to turn a wheel, no telephone system was ever “thinking” about connecting calls. Today’s AI tools like LLMs aren’t “understanding” or “thinking” about anything. These tools are, arguably, bureaucratic technologies that reorganise how we relate to information and to each other.
Social media platforms often claim that algorithms solve the problem of content distribution. They do not. They reorganise social relationships bureaucratically: they create new hierarchies of visibility (think influencers here), reshape how we interact, and impose complex rules around who sees what and when. This is in large part why social media doesn’t solve the problem of human connection: it creates new layers of mediation in how we relate to each other, disconnecting us from our real-world ways of forming social relationships.
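As a purely illustrative sketch, and with no claim that any real platform works this way, here is roughly what those “rules around who sees what and when” look like once they are written as code: a scoring function and a cut-off.

```python
# Illustrative sketch only: a toy feed-ranking rule, not any platform's real algorithm.
# It shows how "who sees what, and when" becomes an explicit hierarchy of visibility.
from datetime import datetime, timezone

def visibility_score(post: dict, viewer: dict, now: datetime) -> float:
    """Rank a post for one viewer using simple bureaucratic rules."""
    score = 0.0
    # Rule 1: accounts with more followers are granted more reach (the influencer hierarchy).
    score += min(post["author_followers"], 1_000_000) / 10_000
    # Rule 2: recent interaction history outranks real-world closeness.
    score += 5.0 if post["author_id"] in viewer["recently_engaged_with"] else 0.0
    # Rule 3: newer posts are allowed to exist; older ones quietly disappear.
    age_hours = (now - post["created_at"]).total_seconds() / 3600
    score -= age_hours * 0.5
    return score

def build_feed(posts: list[dict], viewer: dict, limit: int = 3) -> list[dict]:
    now = datetime.now(timezone.utc)
    ranked = sorted(posts, key=lambda p: visibility_score(p, viewer, now), reverse=True)
    return ranked[:limit]  # everything below the cut-off simply isn't seen
```

A close friend’s post can lose to a stranger’s simply because the stranger has more followers; the rules, not the relationship, decide what you see.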
Customer service chatbots aren’t really solving problems either. They’re reorganising how customers interact and engage with a brand, creating new social scripts and expectations. What chatbots primarily do is standardise and regulate human interaction rather than enhance it.
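Here’s a deliberately bare-bones, hypothetical sketch of such a social script; the intents and wording are made up, but the shape is typical: the customer is routed down a pre-written path rather than met with open conversation.

```python
# Hypothetical sketch of a scripted support chatbot: a decision tree of canned replies.
# The intents and wording are made up; the structure is what matters.
SCRIPT = {
    "greeting": "Hi! I can help with orders, refunds or account issues. Which one?",
    "orders":   "Please enter your order number.",
    "refunds":  "Refunds take 5-10 business days. Shall I raise a request? (yes/no)",
    "account":  "For account issues, please reset your password at example.com/reset.",
    "fallback": "Sorry, I didn't understand. Type 'agent' to wait for a human.",
}

def reply(user_message: str) -> str:
    """Route the customer down a pre-written path based on crude keyword matching."""
    text = user_message.lower()
    for intent in ("orders", "refunds", "account"):
        if intent.rstrip("s") in text:  # matches "order", "refund", "account"
            return SCRIPT[intent]
    return SCRIPT["fallback"]

print(reply("I want a refund for my broken kettle"))  # -> the refunds script
print(reply("Your product ruined my week"))           # -> the fallback script
```

Every enquiry gets squeezed into one of a handful of pre-approved lanes. The bot regulates the exchange; it doesn’t understand it.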
We perhaps put human attributes onto new and advanced technologies as a way to make them more comprehensible and less threatening.
The real innovation of technologies, especially digital ones, doesn’t rest in circuits, algorithms and interfaces, but in how we, as humans, use these tools to address the increasingly complex challenges and problems we face.
The future isn’t about what technology can do for us, but what we can do with technology.