Consumer AI

I am devoting the next three editions to looking at the prospects for AI technology diffusing to the mass market. This is a good time to do so because all the main smartphone manufacturers have announced their plans for AI on their handsets, with Apple set to unveil its latest phones on 9th September.


In this first edition, I will share my thoughts on how the world of consumer-facing AI is evolving, what point of the technology diffusion cycle we are at, and who might be the big winners and losers in the short and long term. I will then look in more detail at how the handset manufacturers are packaging AI, specifically generative AI, for their consumers.


AI has been in your hand for years


Given the excitement about AI in the past 18 months, one would be forgiven for thinking this is a new development in consumer tech. The reality is that we have been using AI and Machine Learning for years now; it was just never marketed to us as AI. Smartphone manufacturers have been using AI to touch up photos for years, calling it things like computational photography or Magic Eraser. Similarly, streaming services serve recommendations using AI and ML techniques, but they call it something like “What others are watching” or “Similar to…”. And every time you call on your smart assistant, AI and ML algorithms are working to translate your speech into instructions the device can understand.


This raises the question of why these well-budgeted marketing departments never embraced the word AI. I think it comes down to the marketing ethos at Apple. Steve Jobs famously re-oriented Apple’s marketing to focus on how technology could help consumers solve problems, moving away from the performance aspects of the tech. So the iPod was about a thousand songs in your pocket, not about the size of its hard drive. Your iPhone has all-day battery life, not a battery capacity measured in mAh. This is classic Marketing or Sales 101: focus on benefits, not features.


This ethos has spread throughout much of tech, so that outside hard-core tech reviewer circles you rarely hear about the megapixels of your camera, the RAM in your phone or the mAh of your battery.


I think tech companies are also worried about public expectations around the term AI. At one extreme, people think of AI as a terrifying robot like the Terminator or the robots in the Alien franchise that turn their power against humans. At the other extreme, there is the bumbling home assistant; think of the memes and jokes about Siri or Alexa failing to understand basic commands. The public might also be less accepting of AI if they realised how much of their own data was used to train it.


And then came ChatGPT


The big shift came with OpenAI’s public release of ChatGPT, running on GPT-3.5, at the end of 2022. It became the fastest-adopted tech product of all time, hitting 100 million users in a matter of months. People were willing to overlook its hallucinations in favour of its ease of use and its ability to carry on what seemed to be almost human conversations. The consumer market now had a friendly face to pin onto AI.


ChatGPT’s launch ushered in the current AI boom. Companies such as Nvidia that were seen to have AI firmly in their business strategies saw their stock prices and valuations skyrocket. Companies that were seen, fairly or unfairly, as lagging in the AI race were punished by investors. Think of Google’s botched launches of Gemini or Microsoft’s unfortunately named Recall feature. Even Apple had to feature AI in its marketing, although it only did so after branding it Apple Intelligence.


Who are the real winners and losers?


At the moment, the big winner is Nvidia, with its technological lead in the high-end chips needed to train the large language models (LLMs) that power much of generative AI. This lead looks secure in the short and medium term given the considerable technological barriers to designing these chips.


But over time, we should see other chip companies either catching up or focusing on chip designs that are less resource-intensive, cheaper to run and good enough for certain specialised use cases. While not a foregone conclusion, Nvidia could find itself in the classic innovator’s dilemma, where its profits from high-end chips blind it to strategic incursions from the low end. It could also fall prey to changing investor appetites as AI investment moves away from the picks and shovels towards the killer app of AI, whatever that might be.


It is also clear that only a handful of companies will have the resources to build the foundational LLMs. These models require huge amounts of data and energy to train and run, and this is not a one-off effort, as the models have to keep being trained on current data. So far, it is unclear how much the AI companies have paid content creators for the use of their training data, but we should expect this to become a more significant part of their cost structure over time. The bottom line is that we won’t see startups growing in this space. In that regard, the foundational LLM market will not develop like the PC industry in the 80s or the internet in the 90s; it will be the established firms that dominate. Even OpenAI was well funded by Microsoft to train its GPT models.


There is still space for innovative startups that tap into these foundational models and better package the user experience or focus on particular market niches. The risk for these companies is that much of their cost structure depends on their foundational LLM partner. The ability of these startups to deliver enough customer value to pass on future price increases will be key. Expect most of them to do away with, or severely curtail, the “free” part of the freemium model.


And then there are the companies that control the distribution channels to the mass consumer market: the handset manufacturers and the ones that run the app stores. I will look at those companies in the next edition. Subscribe to this newsletter to catch it.
