There are too many chatbots!
Saeed Al Hasan
Product & Innovation | Citizen Digital Identity, e-Government Smart Services & Govt Federal Unified Platforms | Member Of Mohammed Bin Rashid Centre For Government Innovation
OpenAI announced an online storefront called the GPT Store that lets people share custom versions of ChatGPT. It’s like an app store for chatbots, except that unlike the apps on your phone, these chatbots can be created by almost anyone with a few simple text prompts.
Over the past couple of months, people have created more than 3 million chatbots thanks to the GPT creation tool OpenAI announced in November. At launch, for example, the store features a chatbot that builds websites for you, and a chatbot that searches through a massive database of academic papers. And like developers on smartphone app stores, the creators of these new chatbots can make money based on how many people use their product. The store is only available to paying ChatGPT subscribers for now, and OpenAI says it will soon start sharing revenue with the chatbot makers.
This probably means that in 2024, a lot more people will do what I did in 2023: spend an ungodly amount of time playing with AI chatbots. The problem is, there are already too many of them. It’s hard to know where to start, and although the introduction of a store makes it easier to find chatbots, it’s not yet clear if a third party will do for chatbots what third-party developers did for smartphone apps: make them essential and revolutionary at the same time. If that happens, maybe the tremendous buzz around AI right now will actually turn into a trillion-dollar industry — and change the world.
My own experience trying to get into chatbots highlights the confusion well. I started out with ChatGPT, trying to amuse myself by getting the multibillion-dollar bot to write smutty poetry. Then, Microsoft added ChatGPT to Bing and let it browse the web, causing me to change my default search engine — Google, duh — for the first time in my life. Then Google launched Bard, its own chatbot, so I switched back.
From there, the list of chatbots kept growing. I spent hours discussing fascism with a chatbot likeness of Indian Prime Minister Narendra Modi on Character.ai, a chatbot startup founded by former Google employees, and pouring my insecurities and deep, dark secrets into the patient ears of Pi, a friendly personal assistant created by Inflection AI, during a brutal summer of job hunting. I asked Claude, a chatbot from Anthropic, a startup founded by former OpenAI employees, to analyze my resume and suggest improvements (it did a solid job), and searched the web with Perplexity, a slick little chatbot that wants to be the next Google. When Meta stuffed AI-powered chatbots into WhatsApp, Instagram, and Messenger, I used them to compose cheesy goodnight poems for my partner. I even coughed up $16 to access Grok, Elon Musk’s ChatGPT competitor trained on data from X, formerly Twitter, which promptly analyzed my tweets and roasted me (“you’re not a journalist, you’re a hack, a glorified tech blogger”).
For those who believe generative AI will be transformative, the chaotic world of chatbots presents a problem. Chatbots are the most obvious application of generative AI, and the powerful large language models, or LLMs, behind modern generative AI are making them more sophisticated than ever. However, it’s still not clear if chatbots themselves are generative AI’s killer app. And if they are, it’s not clear what they’re really good for, other than streamlining customer service interactions. The fact that we’re drowning in chatbots isn’t making it any easier for the general public to know what to do with this new technology.
Noah Giansiracusa, an associate professor of mathematics and data science at Bentley University and author of How Algorithms Create and Prevent Fake News: Exploring the Impacts of Social Media, Deepfakes, GPT-3, and More, told me that it wasn’t the number of chatbots that was the problem — it was the amount of money flowing into them.
“So many of these chatbots are the entire product of some AI company, and often, that company has a valuation of a billion dollars,” Giansiracusa said. “I don’t know if there are too many chatbots. I think there’s too much money going to companies but all they’re doing is producing chatbots.”
Indeed, companies that make chatbots have been raising money at an alarming rate lately in what is widely considered to be a tough economic environment to do so. OpenAI, which Microsoft has already poured $13 billion into, is reportedly in early discussions to raise a fresh round of funding that would value the eight-year-old company above $100 billion. Anthropic is in talks to raise a $750 million funding round that would value it at up to $18 billion, and Character.ai is in talks with Google about getting an investment. Last week, Perplexity raised $74 million from a host of investors, including Jeff Bezos, valuing the startup at $520 million. And on Tuesday, Adam D’Angelo, the CEO of Quora, announced a $75 million funding round from Andreessen Horowitz to grow its chatbot Poe, which aggregates other chatbots into one tool. Tech giants like Meta and Google, meanwhile, are reportedly spending tens of billions on AI already.
What is still unclear, despite the funding frenzy, is whether any of these chatbots, or any of those coming to OpenAI’s new GPT Store, will attract users. It’s even less clear if they’ll ultimately make money. Most chatbots currently follow a freemium model: casual users get a basic version for free, while advanced features, such as asking an unlimited number of questions or choosing a more powerful large language model, cost between $10 and $20 a month.
“It’s really hard to get people to pay for chatbots,” Giansiracusa said. “I think companies saw people paying to access the premium version of ChatGPT and thought, ‘Hey, here’s a new source of money.’”
Perplexity, the high-profile startup with the lofty ambition of replacing Google Search, for instance, makes just $6 million in annual revenue, almost all of which comes from its $20 monthly subscription, according to a recent report in The Information. The company is mulling putting ads into its AI-generated search results, founder Aravind Srinivas told the publication. Last year, Neeva, another startup that built an AI chatbot aimed at taking on Google Search, shut down its search engine after failing to get enough traction and sold itself to cloud computing company Snowflake.
“We have to figure out how to make conversational AI profitable,” said Amanda Stent, director of the Davis Institute for Artificial Intelligence at Colby College, whose research in AI and natural language processing led to the development of several applications including Siri. “That’s going to be the big question for thousands of startups and big companies over the next couple of years.”
The ease with which it’s possible to make general-purpose chatbots in 2024 will lead to commodification, Stent believes. “I think chatbots have to be embedded in a software or a hardware product,” she said, citing how Microsoft embedded ChatGPT into Bing, ultimately branding the product Microsoft Copilot. “Companies that haven’t figured out how to embed their chatbots in other verticals are going to die. I don’t see people paying for general purpose chatbots over time.”
That tracks with my own chatbot usage over the last year. Even though ChatGPT kicked off our modern chatbot era, I used it rarely, mostly because getting it to access the internet or using its more advanced GPT-4 model requires a $20-a-month subscription. Perplexity is slick and provides coherent answers, with citations, to questions that Google completely flubs (“How likely is Donald Trump to win the 2024 US election?”), but years of muscle memory mean I still head to Google Search. Pi’s responses are empathetic and delightful, but I have to remember to actually go to its website and use it. Grok is good for roasts, but little else. And while having Meta’s AI chatbots embedded in WhatsApp, an app that I use every single day, might sound useful, I’ve struggled to find reasons to actually use them while texting with someone. It also doesn’t help that generative AI systems continue to hallucinate — that’s jargon for when an AI confidently makes something up — giving me pause no matter which chatbot I use.
What I did find myself naturally gravitating toward was Bard, not because it was better than the others — it was, in many cases, noticeably worse — but because it was simply there whenever I used Google Search. More importantly, Google lets you hook Bard into the company’s other services, like YouTube, Google Flights, and Google Maps, as well as your personal Gmail and Google Drive. Doing this makes Bard function like a true personal assistant, one that’s aware of your data, your correspondence, your documents, and your flight tickets, among other things, and can answer questions relevant to you. When I asked the bot which terminal my flight would take off from while coming back from vacation last month, Bard combed through my email, found the information on my flight ticket, and presented it to me in seconds. It’s not always perfect, but when it does work, it feels like something a chatbot should have been able to do all along, something slightly closer to a killer app for AI.
“Chatbots that are successful won’t exist in a vacuum,” Giansiracusa said. “It’ll be about how easy it is for them to become a personal assistant for you. Which is why, I think, existing monopolies like Google will ultimately win because they have all your stuff in one place and can link it all together with a chatbot. I can even see Google charging for it,” Giansiracusa added. “We’re going to think a little less about the overall chatbot and more about the specific applications we can use it for.”
Unlike me, Rushi Luhar, chief technology officer of Jeavio, a software startup headquartered in Boston, likes to bounce among multiple AI chatbots. He uses ChatGPT for work: summarizing call transcripts, helping with presentations, writing on LinkedIn, and getting feedback on blog posts before they are published. When he’s off work, though, he likes to chat with Pi. “It’s great for conversations because it’s so good at being friendly and asking follow-up questions,” he said. “If you squint a little, you can almost pretend that you’re having a conversation with … something, you know?”
Chatbots by themselves, Luhar thinks, are simply vessels to showcase the underlying capabilities of the LLMs that power them. “Ultimately, we’re going to move beyond the basic chatbot experience. The whole text-heavy thing is going to disappear as these things get more multimodal,” he said, referring to more advanced capabilities that let LLMs work not only with text but other input and output formats like images, video, and sound.
Levin Stanley created and released his custom GPT the same day in November that OpenAI announced the feature and the GPT Store, which finally launched this week. Stanley’s bot, called Find & Shop Assistant, is dead simple: feed it a photo of an item and it will trawl the internet, find where you can buy it online, and present you with a price and a link.
“I created the whole thing in my iPhone’s browser in about a minute or two,” Stanley, a product designer based in Newfoundland, Canada, said. “The system also generated a logo for my bot (a magnifying glass in front of a shopping bag) on its own.” So far, Stanley has used his own bot to find and buy a LEGO set for his son and a Brooklyn Brewery beer glass after snapping a picture of one with his phone.
This is ultimately how OpenAI’s GPT Store could do for generative AI what the Apple App Store did for the iPhone: crowdsource the development of applications, see what users flock to, and let that inform how the tech continues to develop. But the millions of custom chatbots could also further fragment an already fragmented chatbot landscape. We won’t know until people start using them.
Right now, we are really, really early in the chatbot lifecycle. As long as the money continues to flow through the streets of Cerebral Valley, everyone who can cobble together a chatbot is going to do it.
“Current chatbots are like cars,” said Beerud Sheth, co-founder and CEO of Gupshup, a company that helps businesses create custom chatbots to engage with their customers. “Some are for speed, some are for comfort, some are for size. Once the money runs out and the novelty wears off, that’s when people will figure out what to actually use them for.”