We Choose Our Delusions, Not Our Reality
illustration produced via DALL-E and ChatGPT

It's quite a popular thing these days to speak of "my reality" and "your reality" and the reality of others, but what we're effectively referring to are shared delusions. That's not to say there isn't something real out there that deeply encompasses all that exists in the universe. Far from it; each of us holds sufficient experience entangled within that reality to know it persists outside of us despite our inability to access the infinite order contained within it. Yet we humans can only construct for ourselves much simpler maps or models of reality, none of which result in our inner world producing a fully faithful rendering of the universe we share. In other words, the best we can possibly aim for is the construction of useful delusion, so why pretend otherwise?

Indeed, culture is the ongoing production of shared delusion. Through repetition, each of these delusions can form useful currencies which bind us together and allow us to collectively navigate that infinitely complex reality. To be able to read this, you first had to gain fluency in a symbolic language with many evolving variants tied to that culture. That fluency, although built on very useful delusion indeed, nevertheless grants you access to an internal narrative which you're attempting to re-assemble for yourself. Relying on word patterns internalised over many thousands of conversations since the time you were first able to process language, you don't even need to direct your conscious attention to individual letters anymore. Still, even if you find resonance here, it's virtually guaranteed that what you reproduce and re-interpret for yourself will have lost something in translation from the internal framing and emotional colouring within me that produced the words you read. Less would be lost if I were to read this out loud for you, but there's ultimately no way for me to capture the narrative flow and emotional subtext in its entirety, no matter how many bits of information I have recorded.

Who can deny there are gaps between our life as we experience it, the narratives we share with others about it, and whatever durable record of our existence might be entangled in the universe long afterwards? We can't even hope to measure the full size and extent of these gaps, no matter how optimistic we might feel about the possibilities of bridging them. It's long past time for us to admit to ourselves that we have no special claim on reality. Nobody else does either.

So when we find ourselves inspired by the hallucinations of machines which have been trained on a great deal of genuine human output moderated by an increasing amount of machine assistance, will we be capable of recognising this as a crescendo moment in this grand cycle of human delusion? Will we recognise that the real accomplishment of the current iteration of "artificial intelligence" was co-opting the imaginative power of our narrative-driven society that lets us trust so deeply in our own delusions? Will we finally recognise the artifice of all the imagined narratives that moderate our daily social intercourse with other humans?

Of course not, my friends.

Our world hasn't changed so much in the thousands of years of recorded time. We still build our social structures with layer upon layer of shared delusion, some of which proves to be useful. We still create artificial life forms from memes to nations and religions whose sole purpose is to capture human attention and concentrate power via authority figures who are no more acquainted with the ultimate nature of reality than you or I. We are still driven by our intentions, and remain reliant on the waves of luck to cross paths with opportunity.

We prefer our favourite delusions to reality, and we will ever treat them as if they were reality.

Paul Fidler

Technical Solutions Architect

5 months ago

If we start to believe the words of AI output as Gospel, then we are doomed. LLMs are trained on publicly available data, and when there are people writing that the world is flat and Elvis works down the chip shop, it kinda devalues the output. Let's not forget that Google had us believing we needed to eat rocks a few weeks ago, and that Amazon's "no checkout" in their stores was a bunch of people watching video remotely in India. We need to lay off the AI Kool-Aid a little.

Intriguing perspective on the interplay between perception and reality – it really highlights the complexity of our cognitive processes and social interactions.
