The Multiplicity of Meaning
This feels like a futile exercise at the outset. At some level, I feel like I am trying to explain colors to blind people. But, it has become increasingly clear to me that there are aspects of language people fundamentally do not understand, and they shed a lot of light on this whole AI situation.
I'm not going to try and reproduce the Course in General Linguistics. If you want to educate yourself on semiotics and epistemology, there are plenty of great texts and guides out there. Rather, I'm going to try and keep this as simple as possible so that people can follow it from first principles.
Our general conception of language is that it works kind of like a suitcase for meaning. I put together a string of words to make a statement; that statement is effective to the extent that it gets my message across. If I ask for eggs, and you give me fish, we are not communicating. This is functional language, transactional language. What matters at the end is that both parties understand what has been said.
There's a lot of evidence that says that language doesn't actually work this way, but we're going to put all of that aside, because the important thing is that this is the most common mode in which we use language. When you go to school, the model is that the teacher conveys information to the students who learn it. This is transactional. We all accept that language has this function. When the flight attendant gets up to give the safety address, we don't challenge the semantic stability of their statements.
This is, however, just the beginning. In addition to the functional layer of language, there are other layers. You know, like an onion. One of the earliest forms of narrative prose is the allegory. The structure of allegory is that the functional conveyance of meaning in the form of events and characters corresponds to a parallel structure of values and ideas.
The reality is that our language is full of subtext all the time. Dr. Gates talks about Signifying; various Eastern cultures are well-versed in indirect communication; we all know that passive aggressive person who doesn't mean what they say. Beyond the functionality of conveying meaning, the way in which that meaning is delivered can be its own sort of meaning, or completely change the functional meaning. Think about facetiousness (one of the few English words with all five vowels in order), wit, humor, sarcasm.
But, we're still dealing with intentional communication. Beyond the layers of conveying information and the manner in which that information is conveyed is the referentiality of that conveyance to previous conveyances. If I suddenly bust out a "Whatchu talkin' 'bout, Willis?", that statement has greater meaning through its referentiality than either its explicit or stylistic execution.
What you say isn't only meaningful in terms of what you're trying to convey. How you say it matters. How it has been said before matters. It matters enough that it can completely override that basic, functional, transactional conveyance of meaning. Whether you mean it to or not. Yep, referentiality can trump functionality. The whole N-word debate gets lost in this reality.
Now, here's where it gets funky. I feel a little weird even saying this on a professional site like this, because this is as woo as I get.
Whether you are aware of it or not, the language that you use everyday, in functional, transactional exchanges, also has all of the resonance of stylistics, of referentiality, and of another layer entirely which is the common field of human consciousness.
Some people are aware of this all the time. Some people see it in glimpses. Some people only have this experience on psychedelic drugs. Some people would call it "head tripping" because of the particular type of experience being aware of it creates for them. But some people never see it.
Some people honestly believe (because it is their experience) that language is only functional, only transactional. They don't understand the rhythms or the cadences. They don't hear the stylistics. They don't catch the references. And the timeless echoing of the questions of self and world doesn't resonate for them.
This is where the AI argument falls apart. You see, if you believe that language is only functional, transactional, practical, then of course a machine can generate language just as well (if not better!) than a human being. If all you care about, if all that matters is that layer, then sure, once you solve the hallucination problem, you've got an effective language-using AI. An AI can pack a suitcase just as well as a human.
But that's just the surface. Language isn't only functional. It's deeply connecting. We learn language from our families; its meanings are tied up in every way we understand the world. As Jean-François Lyotard argues, all of our truth systems are narrative in structure. We relate to the world through stories of experience.
Storytelling is more than just stringing words together to create a specific meaning or payoff. It's an art form that has been practiced for hundreds of thousands of years. It invokes our higher selves and understandings. It resonates beyond our individual experiences. It connects us as human beings, sharing in a common experience of existing in this world, as these weird embodied mind paradoxes, across times and cultures and even languages.
The way that humans interact with language is fundamentally different from how LLMs interact with language. No AI is capable of writing compelling allegory. It can't maintain multiple frames of reference; it doesn't actually even understand one.
No AI can relate to us compelling stories of being human, because they don't have that experience. They don't know what it is. The thing that we cherish most in our art is the way it moves us, not that it simply exists. The AI cannot be moved.
The idea that you can replace the humans in the creation of art is absolutely absurd. It's the humanity that makes it art. The AI cannot participate in the grand conversation that has been going on for generations because it doesn't know what it is to live, much less to be in dialogue, much less to be in love.
You can try to separate "high" art from "low" art, and there are always going to be some people who don't care. Blind people may not care what color their couch is (or maybe they do; I don't have that experience; I don't presume to understand it). But, at the end of the day, storytelling, whether you do it with words or pictures or games or films or poetry or dance or music, is a human art form.
Treating it like it's not is disrespectful to every artist who has ever struggled to make something great.
Treating artists like public goods because they sell their work on the marketplace is disgusting. You may be able to train a machine to puppet a model that looks like a human being, and you may be able to pack enough suitcases well enough to convince people that they got something interesting out of it, but you're never going to be able to capture what made that person a person.
It's fundamentally dehumanizing.
Don't tell me that your corpse-puppet AI is going to connect me with my family or my favorite artist. This kind of grotesque imitation of life is something I have absolutely no interest in. I don't care that your shit is freeze-dried; it's still shit. I don't want to consume it.
The thing that's driving this, of course, is the profit motive. If I can produce more of what you want more cheaply, then I can sell it to you and take the market away from other people.
Why else do you dehumanize people if not to justify your atrocities against them? Isn't that where genocide begins? Isn't that the heart of Fascism?
There are four parties in this dance. There are the people who don't know better and believe that AI is going to be amazing because it does things they can't do; there are the people who know that AI isn't actually doing anything amazing, but they don't care because they don't get it either; there are the people who get it and are trying their best to explain it to everyone who doesn't get it.
And then there are the people who get it but are doing it anyway because it helps them to profit, to gain prestige, to be credible in whatever circles. Those are the ones I really object to. People who don't know better don't know better. Ignorance is bliss. But, if you know, and you turn your back on your fellow human beings, what does that say about you?
There have always been hucksters. I'm not naïve enough to believe that we're going to move into some non-transactional utopia. But please stop trying to pretend that machines can do anything more than copy and remix.
"Generative" AI is a marketing phrase. It's not reality.