chatGPT is as dumb as a Goldfish
Hey, this is the 2023 A.I.: a stubborn goldfish that wants to sing. Most posts don't explain why chatGPT may not work for most startup ideas. One of the reasons is that chatGPT has a very short memory, like a goldfish. The title is mostly a joke, though I do want to show you at least one easy way to visualize chatGPT :)
Logistics: There are terms (Fine-tune, Foundation Model, Whisper, the various "x to x" model types) that may be unfamiliar to readers new to the topic. A terminology cheatsheet is included at the end as a helpful guide.
This article will explain why.
Here are a few cases where the foundation model would fail in real-world use cases.
chatGPT cannot summarize / rewrite long documents, or remember what you said 10 pages ago. Your startup idea will soon turn into “separate the input into multiple sections, and try again”, OR an experiment in how long chatGPT can keep track of what we talked about.
Both chatGPT and GPT-3 have a 4,000-token limit. Tokens are NOT words and NOT letters. They are sub-word units, and whitespace counts too. chatGPT and GPT-3 won’t even let you input a long article, let alone a 10-page story. The official tool from OpenAI explains this: https://beta.openai.com/tokenizer
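To make the limit concrete, here is a minimal sketch of counting tokens the way the model does, plus the dreaded "separate the input into multiple sections" step. It assumes the open-source tiktoken tokenizer; the encoding name and chunk size are illustrative, not a recommendation.

```python
# Minimal sketch: count tokens like the model does, then naively chunk a long text.
# Assumes the open-source `tiktoken` package; encoding name and sizes are illustrative.
import tiktoken

MAX_TOKENS = 4000  # rough prompt + completion budget discussed above

enc = tiktoken.get_encoding("cl100k_base")

def count_tokens(text: str) -> int:
    """Tokens are not words or letters -- whitespace and punctuation count too."""
    return len(enc.encode(text))

def split_into_chunks(text: str, chunk_tokens: int = 3000) -> list[str]:
    """The 'separate the input into multiple sections, and try again' workaround."""
    tokens = enc.encode(text)
    return [enc.decode(tokens[i:i + chunk_tokens])
            for i in range(0, len(tokens), chunk_tokens)]

if __name__ == "__main__":
    long_story = "Once upon a time... " * 2000   # stand-in for a 10-page story
    print(count_tokens(long_story))              # far above MAX_TOKENS
    print(len(split_into_chunks(long_story)))    # how many sections you'd have to feed separately
```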
chatGPT cannot extract multiple topics from your thoughts. It probably already hurts for most people even to think about "copy / pasting a long email thread into chatGPT". Humans are soooo smart that they instinctively try NOT to break a delicate tool such as chatGPT. Imagine an email thread between friends that keeps switching topics, then ask chatGPT: "can you tell me what exactly these people are talking about?". Chances are, even a human may not be able to do so, not to mention current A.I.
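If you want to try the experiment yourself, here is a minimal sketch of that exact question sent through the API. It assumes the OpenAI Python SDK (v1-style client); the model name and the toy email thread are placeholders, not the real thing.

```python
# Minimal sketch of the "what are these people talking about?" experiment.
# Assumes the OpenAI Python SDK; model name and the toy thread are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

email_thread = """
Alice: so... ski trip in February?
Bob: maybe, but first, did anyone update the budget spreadsheet?
Alice: oh also, we still need a plan for Carol's birthday.
"""  # stand-in for a long thread that keeps switching topics

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": "Can you tell me what exactly these people are talking about? "
                   "List each separate topic.\n\n" + email_thread,
    }],
)
print(response.choices[0].message.content)
```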
Transcribing a video with music is still not easy to do. When "Whisper" attempts to transcribe the audio before and after the music, it does not work perfectly. For example, someone started speaking ("Hi Everyone") while the music kept playing. In that run, the lyric "I can't wait for the weekend to begin" was repeated hundreds of times before the A.I. started listening to human speech. :shrug:
"I can't wait for the weekend to begin ....<repeating 500+ times>.... I can't wait for the weekend to begin I can't wait for the weekend to begin I was not arrested on my journey to California for it I was not arrested on my journey to California for it Augusta and Adeline lived their truth. They took risks. They fought for what they believed They fought for what they believed They fought for what they believed They fought for what they believed They believed in the power of diversity and inclusivity and growth"
Think about how modern “A.I.” thinks. Deep Learning models mostly condense relationships between inputs. Their quality is highly tied to the correlations humans want them to capture.
Wait, but "fine-tune" saves everyone!
You may have heard that "Fine-Tune" should solve EVERYTHING! Not really. I can write another article about this; let me know if you'd like to discuss more.
Foundation Models are Golden, but everyone needs an A.I. pipeline.
Don't get me wrong: embrace the foundation model! Foundation models are so valuable because they took a lot of human labeling, parameter experiments, and millions of GPU hours (and thus money) of training on public internet data to produce something everyone can easily use in practice. ChatGPT is a good starting point, but A.I. data engineering is still always needed.
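To show what "A.I. data engineering around the foundation model" can mean in practice, here is a minimal sketch of a long-document pipeline: chunk the input, summarize each chunk, then summarize the summaries. It assumes the OpenAI SDK and the tiktoken tokenizer; the model name, chunk size, and prompts are illustrative assumptions, not the one true pipeline.

```python
# Minimal sketch of "foundation model + pipeline": chunk, summarize each chunk,
# then summarize the summaries. Model name, chunk size, and prompts are illustrative.
from openai import OpenAI
import tiktoken

client = OpenAI()
enc = tiktoken.get_encoding("cl100k_base")

def ask(prompt: str) -> str:
    """One call to the foundation model."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def summarize_long_document(text: str, chunk_tokens: int = 3000) -> str:
    """The pipeline does the memory work the goldfish can't."""
    tokens = enc.encode(text)
    chunks = [enc.decode(tokens[i:i + chunk_tokens])
              for i in range(0, len(tokens), chunk_tokens)]
    partial = [ask("Summarize this section:\n\n" + c) for c in chunks]
    return ask("Combine these section summaries into one overall summary:\n\n"
               + "\n\n".join(partial))
```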
Terminology Cheatsheet
-- chatGPT: text-to-text
-- Stable Diffusion, Midjourney: text-to-image
-- video/audio transcription, YouTube transcription: audio-to-text
-- Tune-A-Video: text-to-video
-- CLIP: image-to-text
-- all text-to-image models above: they can also do image-to-image
-- * to *