Why Retail Automation Drove Customers Away
Self-checkouts have failed, and not because of theft. The reason is deeper and more human: micro-social interactions.

On a recent Saturday afternoon I had to dash down to my nearby grocery store, a national chain. I was cooking for friends coming over that evening. I only needed a few items, easy enough for self-checkout, but I chose the human cashier, even though there were a few people ahead of me. That turned out to be an unexpectedly good decision.
In front of me, the cashier was having what seemed an intense chat with a customer. When my turn came, she asked if I took a certain street home, and I said I did. She told me to take a different route: the fellow ahead of me had said there'd just been a car accident. This was a micro-social moment, a sharing of community information.
If I’d taken the self-checkout path, I’d have ended up sitting in traffic for ages. While self-checkouts can sometimes be convenient, they generally create social disconnects and often produce social micro-aggressions, such as failing to scan an item or venting frustration at the employee there to help. You’ve probably experienced this.
Today, even large retail chains such as Walmart are downsizing or removing their self-checkouts. The usual explanation is that they’ve led to higher “shrinkage”, retail-speak for shoplifting. But I think it goes much deeper than that: self-checkouts have become a metaphor for what happens when a culture rejects certain technological “advancements” at scale. So what’s really going on?
I call this the “automated efficiency paradox”, part of a broader “digital alienation cascade”: gas stations started with self-serve pumps, then retailers added self-checkout, banks shifted to digital-only services and shrinking teller counters, companies put chatbots between themselves and their customers, and HR departments began using AI to screen and sometimes even interview initial candidates.
Most of these industries, and retail especially, have found themselves stuck in this efficiency paradox. Some are beginning to understand the resulting problems, whether through increased theft or because the expected efficiencies simply haven’t arrived. It turns out the back-end costs of operating self-checkouts don’t save as much on labour as companies had hoped.
Running self-checkouts means software licenses, machine repairs, system maintenance, and employee turnover that drives up training costs.
What these forms of automation actually create are what sociologist Emile Durkheim would call “anomic spaces”: spaces where our normal social behaviours and ways of connecting with one another have been stripped out. Social norms break down, and when that happens, theft and angrier social interactions follow.
Consumer displeasure at self-checkouts is often expressed as people not wanting other people to lose their jobs. But that is just a surface reaction; as I argue, the issue is more socially complex than that.
Self-checkouts, HR departments’ AI tools, and brand chatbots all create societal gatekeepers. Brands spend heavily on building and maintaining a good reputation with their markets. At an economic moment of growing distrust and resentment towards brands, introducing digital gatekeepers has only reinforced the consumer bias that brands are out to get them, that shareholders come before customers.
Anthropologist David Graeber would call this a form of “structural violence”: these systems force a company to do business in a way opposite to how customers want to engage with a brand.
My story at the beginning of this article illustrates the hard-to-measure but significant societal value of micro-social interactions. These are far more valuable to consumers than most businesses understand. They serve a much deeper social function, one woven into a society’s fabric.
Much of the reason we return to certain businesses, from a garage to a bank, is those micro-social interactions. Seemingly inane banter that we think nothing of, yet it helps us feel part of a community: sometimes sharing critical information, at other times helping someone feel less lonely for a few moments, or reinforcing cultural norms. Humans are social creatures.
Business automation at the customer level can disrupt this social function. Implicitly, both consumers and businesses know the automation is designed to benefit the business, not the consumer. Yet some businesses do find a more balanced approach, one that tends to deliver value to both themselves and the consumer over the longer term.
Grocery stores, malls, and coffee shops are places of social currency, structural nodes in a society’s cultural fabric. Businesses that turn away from this create alienation, with less customer loyalty and a weaker emotional connection to the brand.
In Japan, this dynamic has long been understood through the concept of “omotenashi”: placing human interaction first and leveraging technology efficiencies behind the scenes. Trader Joe’s spends considerable effort training its employees to be knowledgeable and ready to help. Both are finding ways to strengthen, rather than erode, the social contract between business and society.
When consumers have regular interactions with employees, it creates what we call in anthropology “routine social anchoring”. It’s a significant reason consumers return to certain coffee shops and stores. These interactions often trigger a release of oxytocin, a feel-good chemical in our brains.
Starbucks long prided itself on being the “third place” for people, a model that eventually seemed to collapse. But rather than trying to understand why and redesign it, the company has increased automation, created greater divides with customers, and now regulates how long a customer can sit in the café. The results will take time to see, but breaking a brand’s social contract is a huge risk.
In the coming years, the businesses that push technology efficiencies behind the scenes and enhance human-to-human interactions will gain value over the longer term. There are enough examples to show this model works better. Those that don’t will remain stuck in the automated efficiency paradox and watch their value erode over time.
In my last article I wrote about why technology doesn’t solve problems; humans do. This is yet another example. As companies rush to put chatbots and AI tools between themselves and the customer, they might do well to consider the sociocultural system in which they operate, and to recognise that alienating humans often costs more than the perceived benefits of automation.