AI Is Eating Software: 4 Strategies to Build AI-Resistant Products in the Age of OpenAI and Big Tech Disruption

Introduction

What is a general-purpose technology? It is a technology that can be applied across many industries and sectors, has the potential to increase efficiency and productivity in existing industries, and can create entirely new ones. Examples include the internet, electricity, the computer, and the steam engine. In 2011, Marc Andreessen claimed in an essay that "Software Is Eating the World." He was right. The biggest direct-marketing platform is a software company: Google. The biggest bookseller is a software company: Amazon. The biggest recruiting platform is a software company: LinkedIn. You probably know the story of how Netflix, a software company, disrupted Blockbuster. We now have banks with no physical branches. So yes, software has eaten the world. AI is also a general-purpose technology. It can be integrated into a wide range of industries and sectors: healthcare, finance, transportation, manufacturing, retail, education, and more. What Marc Andreessen probably didn't envision was that, in 2023 and beyond, AI would start eating software. AI will radically change the way we build enduring software products.

ChatGPT is a killer app. Within a week of launch, it racked up about 1 million users. Two months later, it had reached over 100 million monthly active users. This is impressive! This growth is both a blessing and a curse: a blessing for OpenAI (millions of users and billions in revenue) and a curse for the tens or hundreds of startups ChatGPT made obsolete. There are three categories of startups or products that ChatGPT and Big Tech have made, or will eventually make, obsolete.

Three Categories of Products Vulnerable to Obsolescence

  1. Single Use Case Feature - The obvious example is Grammarly. It existed long before the launch of ChatGPT; the company was founded before OpenAI was. They found product-market fit. I use it. They have millions of users. Grammarly is everywhere: a web application, a mobile application (a keyboard), a Chrome extension, a Google Docs extension, and a desktop application. This is great, but there comes a time when disruption and obsolescence come knocking. It happened to Blockbuster. It happened to Kodak. I think it might eventually happen to Grammarly. Let's go back to the 1980s. Did you know there were companies that sold standalone spell-checker applications? One was Random House, which catered to the pre-Windows PC market. These applications ran alongside word processors. Demand for standalone spell checkers collapsed once word processing software like Microsoft Word integrated the feature. Before the rise of LLMs, a company like Grammarly would have relied on advanced natural language processing and machine learning techniques such as contextual word embeddings (BERT), sequence-to-sequence models with attention mechanisms, and smaller-scale transformer models. They probably combined rule-based systems with machine learning, used models pre-trained on large corpora, and employed advanced statistical parsing, with human and user feedback improving model performance over time. Now, with the rise of LLMs, startups like Notion have integrated these same features into their core product through an API from companies like OpenAI. Recently, Apple announced brand-new systemwide Writing Tools built into iOS 18, iPadOS 18, and macOS Sequoia: users can rewrite, proofread, and summarize text nearly everywhere they write, including Mail, Notes, Pages, and third-party apps. Basically, a free Grammarly integrated into the operating system.
Last year, Microsoft announced the integration of LLMs into all Microsoft Office applications, including Word, Excel, and PowerPoint. In March 2023, Google announced the integration of generative AI into their applications, including Gmail, Google Docs, and Google Maps. With startups integrating Grammarly-like features into their core products and Big Tech doing the same, where is the need for a standalone Grammarly? I love Grammarly, but selling a feature is no longer an enduring strategy in the age of AI.
  2. OpenAI Thin Wrappers - These are startups that built a product by leveraging OpenAI APIs. Startups like Jasper.ai and Copy.ai built copywriting and marketing tools, raised millions of dollars, hired many people, and spent a lot on marketing to acquire users. My sister used Jasper.ai, and she knew about Copy.ai. In November 2022, OpenAI launched ChatGPT. She found out about ChatGPT and stopped using Jasper. ChatGPT launched as a free product. How were these startups going to compete against free? I think the problem was that they didn't know OpenAI was going to launch a consumer product. To be honest, there was no differentiation: text input and text output. The last I heard, Jasper laid off some of their staff and Copy.ai pivoted (pivoting is not bad).
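To make concrete just how thin these wrappers were: in many cases the entire product reduced to a prompt template wrapped around a hosted model. The sketch below is hypothetical (the function names and the template are mine, not Jasper's or Copy.ai's), but it captures the shape of the architecture; the API call follows the current OpenAI Python SDK.

```python
def build_copy_prompt(product: str, audience: str, tone: str = "friendly") -> str:
    """The entire 'secret sauce' of many wrapper startups: a prompt template."""
    return (
        f"Write three short marketing headlines for {product}, "
        f"aimed at {audience}, in a {tone} tone."
    )


def generate_copy(client, product: str, audience: str) -> str:
    """Send the templated prompt to a hosted LLM via the OpenAI SDK.

    `client` is an instance of `openai.OpenAI`; this call requires an API key
    and network access, so it is defined but not invoked here.
    """
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": build_copy_prompt(product, audience)}],
    )
    return response.choices[0].message.content
```

Everything defensible lives in `build_copy_prompt`, and a ChatGPT user can reproduce that with a single typed message, which is exactly why these products struggled to compete against free.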
  3. OpenAI Wannabes - These are basically OpenAI clones. They raised hundreds of millions of dollars. They have impressive teams of research scientists and engineers. They built their own models (no wrapper), probably not as good as GPT-4o, Gemini, or Claude. Their business model is to sell access to their model through an API and a chatbot. I don't think this strategy is sustainable. A model is not a moat, and there is no differentiation. It is expensive to build, and less than a year after it ships, it becomes obsolete. They need to keep investing in new models, each of which becomes obsolete a year later. This is not the type of business that can be funded by venture capitalists: after raising money for a version one that nobody uses (users like me focus on the latest open-source and frontier models), they need to raise money for a version two that nobody will use either. Another exit would have been an acqui-hire by Big Tech, but how many teams will be acquired? Inflection raised over a billion dollars and built a chatbot called Pi. The traffic was ridiculously low; how many chatbots does one need? Eventually, Microsoft hired the CEO and the team and paid money back to the investors. It was not a normal acquisition. The same happened to Adept: they raised over 400 million dollars, probably couldn't raise enough to build new versions of their models, and Amazon hired the team. The same acquisition trick. The reason for the trick is that antitrust regulators in the US and Europe would have blocked a normal acquisition because bIG tEcH is bLoCkInG iNnOvAtIoN. Let's talk about clustering and winner-take-all markets. Since you are reading this article, you know Uber, the ride-sharing app. I researched, gathered, and analyzed data about Uber clones. The summary: 47 clones, of which 27 still exist, 13 collapsed, and 7 were acquired. There is a saying that history doesn't repeat itself, but it rhymes. I am here to break it to you: in the business of building AI models, history will not rhyme.
Let me explain why. The reason there are 27 existing Uber clones is the dynamics of the ride-sharing business. A startup raises money and focuses on its local or regional market. Riders in Lagos only care about drivers in Lagos. Riders in London only care about drivers in London. The clustering is local. A startup in Lagos can give Uber a run for its money because it only has to care about Lagos drivers and riders. Compare this with chatbots and LLMs: users from all over the world have access to ChatGPT. This is a winner-take-all market. OpenAI partnered with Bain and Company. Cohere partnered with McKinsey. Microsoft partnered with McKinsey. This is a great distribution strategy: the consulting firms deploy AI for their clients on behalf of the AI providers. If your startup just raised money to train the first or second version of your model, you are probably too late.



How to Build an Enduring Product

  1. User Experience - First of all, the core of your product should not be a chatbot. ChatGPT has a performant free tier. Claude has a performant free tier. Gemini has a performant free tier. Microsoft Copilot has a performant free tier. You are competing against free. Whatever product idea comes to mind, make sure its core is not a chatbot. Have you seen the use cases of Artifacts, the feature launched by Anthropic? When I started seeing the different and impressive use cases, I thought to myself, "God, I have to think harder." I don't have a choice but to think harder, because if I don't, these big startups will make a mess of me. So far, OpenAI, Anthropic (Claude), and Google (Gemini) all seem to be investing in chatbot interfaces that can do everything. Based on that insight, think and rethink how to build a product that can't be replicated inside a chatbot interface.
  2. Outcome-Driven Design - The output of an LLM is ephemeral. You give an LLM an input, and it returns an output. Yes, you can store the response (ChatGPT does), but you most likely won't need it again once the task is done. Build a product that focuses on the outcome, not just the output of an LLM. Compare the OpenAI GPT-4o language demo, or Google Translate, with Duolingo. I have a friend who speaks French (he is Nigerian, and our official language is English), and he uses ChatGPT as a conversation buddy. ChatGPT is for people who already know enough of a language and need a conversation partner, or a tool that helps with navigating a new country. Duolingo is for beginners who want to learn the language. Duolingo doesn't just provide the curriculum (something ChatGPT can provide); it also tracks your progress.
  3. Process Knowledge - If ChatGPT users can replicate your product through zero-shot or few-shot prompting, I don't think you have an enduring AI product. This was the problem with products like Jasper.ai and Copy.ai: the core of their products was the LLM. Build a product that requires multiple prompts (or fine-tuning), integrated through software engineering and unique process know-how. An input field and an output page do not constitute software engineering. In other words, AI should not be the only component of your product. Consider podscan.fm as an example. This service monitors podcasts for mentions of your product. The application consists of several components: one transcribes audio to text and uses natural language processing (including LLMs and named entity recognition) for content analysis. Only this component uses an LLM. Another component gathers podcast episodes from various sources, while a third sends customized alerts and notifications to users. The latter two components require robust software engineering.
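Podscan's internals aren't public, so the following is only a toy sketch of the multi-component pipeline described above, with the ingestion and transcription components stubbed out as comments. Only the keyword-matching step is implemented, and all names are illustrative.

```python
from dataclasses import dataclass


@dataclass
class Episode:
    title: str
    transcript: str  # produced upstream by a speech-to-text component


# Component 1 (not shown): ingest new episodes from RSS feeds and directories.
# Component 2 (not shown): transcribe audio, then run LLM / NER analysis.
# Component 3 (below): scan transcripts for a keyword the user tracks,
# feeding the alerting component.

def find_mentions(episodes: list[Episode], keyword: str) -> list[str]:
    """Return the titles of episodes whose transcripts mention the keyword."""
    kw = keyword.lower()
    return [ep.title for ep in episodes if kw in ep.transcript.lower()]
```

The point is that the LLM sits inside only one of these components; the ingestion, scheduling, matching, and alerting around it are ordinary, and defensible, software engineering.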
  4. Unique Data - The innovation of LLMs is that they drive the marginal cost of content creation toward zero, making personalization cheap. Their limitation is that they depend on data. Your moat is unique data that the frontier models don't have access to. If you can curate complex data, fine-tune an off-the-shelf open-source model on it, and build a product around the curated data, you will have built a unique product.
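As a minimal illustration of turning curated data into a fine-tuning asset, the sketch below packs question-answer pairs into the JSON Lines chat format that OpenAI's fine-tuning API and many open-source fine-tuning frameworks accept. The helper names are mine; the record shape is the standard `messages` format.

```python
import json


def to_chat_record(prompt: str, completion: str) -> dict:
    """One supervised fine-tuning example in the common 'messages' chat format."""
    return {
        "messages": [
            {"role": "user", "content": prompt},
            {"role": "assistant", "content": completion},
        ]
    }


def write_jsonl(records: list[dict], path: str) -> None:
    """Fine-tuning datasets are conventionally shipped as JSON Lines:
    one JSON object per line."""
    with open(path, "w", encoding="utf-8") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")
```

The hard part, of course, is not this packaging code but acquiring and curating data the frontier labs don't have.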



Conclusion

This article provides a framework for thinking about building an AI product in the face of competition from OpenAI and Big Tech. I am working on an AI product (you can subscribe to this newsletter and be informed when I launch), and this framework guides me on which features to build or if I need to think harder.

In the face of the disruption and obsolescence brought by ChatGPT, AI startups must confront the realization that their initial strategies might not suffice. ChatGPT has redefined the landscape, and these startups must rethink their approach. Instead of clinging to outdated paradigms, they should leverage open-source models (in LLaMA we trust) and frontier models, dismantle their obsolete product sense, and rebuild, creating innovative solutions with real potential and high user engagement. In this new era, AI startups can thrive not by resisting change, but by adapting and using the tools at their disposal to build products that meet the evolving needs of users.
