The astonishing progress in artificial intelligence has unsettled our self-conception. We humans have always carried a generous estimate of our own importance, and anything that threatens that position, anything that depicts us as less than sacred, tends to be met with vehement rejection.
I feel this explains much of the contemporary pushback against AI: many of us see it as threatening not just our position in the job market but also our special status among living (and artificial) beings.
In many ways, this has happened before. Consider some scientific turning points: Copernicus argued that we are not the centre of the cosmos. Darwin showed that we are just another animal among the living. Einstein revealed that, in the end, it’s highly questionable whether we understand reality at all. These revelations were all highly controversial in their time, and they share a simple common factor: each one was, in one way or another, an attack on us.
Many feel that artificial intelligence is rapidly serving us another such moment of humility, and the reception is repeating history. People are inclined to downplay its significance, often hunting for ways in which AI performs poorly and sharing these failures with glee.
This seems like a kind of coping mechanism: We hold on to the ways in which we will always remain special. Many of us search for something unmistakably human, something not merely difficult to automate but fundamentally beyond imitation. Something that would set us apart from anything AI could ever do.
Many have chosen creativity.
While it’s clear that AI models can create something, it’s easy to argue that this is not real creativity but a mere reflection of the training data. “Real” creativity, the argument goes, is something else, since the AI is merely recombining existing pieces.
I think this is an interesting question: Where do we draw the line? What counts as “real” creativity? I don’t think it’s really about originality. In art, almost everything is a reference to something that already exists. In fact, a work so utterly original that it connects to nothing that came before seems unlikely to resonate with anyone.
What I often notice is that AI has no sense of style. Most LLMs are surprisingly bad at producing anything with “character”. Do you know what I mean? The output is always so sterile somehow, plain and uninspired. It’s not easy to put one’s finger on, or even describe in precise terms; it’s more of a subjective experience I share with many people who use these models.
Maybe there is something more to this. Maybe this lack of character is a symptom of the model lacking the human experience, which, in turn, means it’s always merely imitating a connection with us.
Here’s a question: Is what we call “creativity” not, in fact, the ability to produce a given output, but the capacity to express one’s subjective experience (which these models fundamentally lack)?


