How To Think About Generative AI? Part 3
Infinite reflections - a Probabilistic Mirror by MidJourney


Eight Long-Term Trends Amidst the Short-Term Hype Cycle — Part 3

This is part three of a four-part series, with new parts releasing every week here on LinkedIn and Twitter. Follow us to stay updated or share any feedback!

Last week, we covered how fine-tuned vertical LLMs will see massive adoption by businesses across the spectrum, and how this presents a unique opportunity for a long list of AI services startups to capitalise on. We also touched upon SaaS startups that incorporate AI into the apps they are building, and how they can create a moat in the industry. In case you missed any of this, read last week's blog here. Now onto the next two trends!

Trend 5: Marginal Cost of Intelligence will approach Zero

Remember how, in part 1, we compared today's AI revolution to the 18th century's Industrial Revolution, with the crux being that what the Industrial Revolution did for manufacturing, AI will do for services?

Well, for the sake of recap: for the first time ever, AI will allow services to scale disproportionately. A call centre with the right virtual assistants, trained on the right data, can handle 10x the calls without needing 10x the staff. A copywriting agency can write 10x the copy without scaling its staff 10x. The examples sound like a dream, but their second-order effect is even more unthinkable: the marginal cost of intelligence will rapidly approach zero.

To understand this properly, we need a deeper understanding of why the marginal cost of intelligence will approach zero and what chain of effects it will cause.

To Know Why, Understand This

People often call LLMs stochastic parrots. While this may be technically true, given the randomness of their outputs and their tendency to act strictly according to their training data, the term is mostly used in a critical context.

However, we feel the opposite: LLMs being stochastic parrots is not a negative but rather a positive. Let's redefine the way we look at them.

Probabilistic Mirrors, not Stochastic Parrots

Mirrors reflect exactly what's in front of them, right? Now imagine standing in front of the Mirror of Erised from Harry Potter, the mirror that shows you what you truly desire. AI is something like that: a probabilistic mirror that guesses what you want to know and reflects back what it thinks you are looking for.

Digital Art - The Erised Mirror

Picture this: you're standing in front of this special mirror, and instead of showing just one reflection, it shows multiple reflections, each slightly different but all related to you. That's where the "probabilistic" part comes in. In the context of AI, it means the model reflects back patterns based on the vast amounts of data it has been trained on, but with an element of randomness: it can offer multiple possible responses to the same prompt, mirroring a variety of outcomes.

At a philosophical level, this idea pushes us to think about the essence of AI: a creation that doesn't have its own thoughts or feelings, but mirrors back the linguistic world it has learned, with a dash of unpredictability. It is a reflection of our human language and interactions, coloured by the randomness that the vast expanse of data introduces. It's not just mimicry; it's a probabilistic echo of our human complexity, an intriguing dance of determinism and chance.
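To make the "probabilistic mirror" concrete, here is a minimal, hypothetical sketch of how an LLM picks its next word: it turns scores over candidate tokens into probabilities and samples from them, so the same prompt can yield different but related outputs. The word list and scores below are invented purely for illustration; real models do this over tens of thousands of tokens at every step.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample one token from a toy next-token distribution.

    Higher temperature flattens the distribution (more varied
    'reflections'); lower temperature sharpens it towards the
    single most likely token (more deterministic).
    """
    rng = rng or random.Random()
    tokens = list(logits.keys())
    # Softmax with temperature scaling (shifted by the max for stability).
    scaled = [logits[t] / temperature for t in tokens]
    max_s = max(scaled)
    exps = [math.exp(s - max_s) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token according to the resulting probabilities.
    return rng.choices(tokens, weights=probs, k=1)[0]

# Hypothetical scores for the word after "The mirror shows your ..."
logits = {"desire": 2.0, "fear": 1.0, "truth": 0.5, "nothing": 0.1}

# The same prompt yields varied, related outputs on repeated sampling.
print([sample_next_token(logits, temperature=1.2) for _ in range(5)])
```

Run it a few times: at temperature 1.2 the five samples differ between runs, while at a temperature near zero the output collapses to "desire" almost every time. That knob between variety and determinism is the randomness the article calls the mirror's "probabilistic" side.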

The fact that AI is a stochastic parrot, ahem, we mean a probabilistic mirror, means it can create several strata and classes of output within seconds, with only minor discrepancies arising from its randomness. Essentially, anything that can be expressed as an input and output of data or information could potentially be recreated by AI, fast and cheap. This is what brings the marginal cost of intelligence closer to zero. Mind you, we don't claim a complete zeroing of the cost of intelligence where every service becomes cheap. But the reduction in the cost of intelligence will definitely be visible in the years to come.



With vertical LLMs fine-tuned on domain-specific data, the quality of output will improve so much that cost reduction will be evident even in sensitive fields. Does this eliminate doctors, lawyers, teachers, accountants and consultants? No. But like Vinod Khosla's 20% Doctor thesis, we may see a 20% Lawyer, a 20% Teacher, and so on.

In fact, between now and 2030, the reducing cost of intelligence will unfold sector by sector on the back of Layer 4 (AI apps) built using Layer 3 (tooling and LLM optimisation services). Let's understand which sectors will see the impact first.

Adoption within an industry will also vary depending on the need for customisation versus the cost of going wrong for specific tasks.


A reduction in the marginal cost of intelligence will make knowledge services and coaching accessible to all, pushing the boundaries of what humans can achieve - forever.

Trend 6: The Gold Rush is a Shovel Seller's Market

Historically, wherever there has been a gold rush, it has made the town’s shovel sellers rich.

Here’s what we know:

  • Every company will use an LLM — either bought or built.
  • This applies to text-to-image and video models too. Eventually, all companies will use a model for their internal or client-facing use-cases.
  • This will create a major stream of opportunities for service providers and stakeholders in the AI value chain. The best of it will come to Layer 3 of the AI stack: AI Tooling.

Layer 3 is anything above an LLM/AI model and below a consumer or end-user AI app. Layer 3 helps companies win the AI gold rush. It's the modern-day shovel.
In the AI gold rush, Layer 3 is what provides the shovels.

Layer 3 of AI Stack is Well Positioned

  • Layer 3 is not as capital-intensive as Layer 1 (building GPUs or foundational hardware) or Layer 2 (building LLMs and AI models from absolute scratch, like GPT or LLaMA).
  • Layer 3 does not have a long incubation period laden with research and development costs, like Layers 1 and 2.
  • Layer 3 carries a lower technical-replicability risk than Layer 4 (AI apps), which may amount to simple API calls and is highly model-dependent.
  • Layer 3 does not face tough distribution challenges like Layer 4, which competes in a SaaS market surrounded by giant incumbents.
  • In fact, Layer 3 does not carry as much risk of being replicated by an incumbent as Layer 4 does. While Canva or Notion adding AI to their stacks may kill multiple AI design or writing startups, Layer 3 comprises tooling and service startups that cater specifically to the AI segment. Their market is new: a raw, untapped field, an undrawn canvas.
  • Layer 3 is driven by B2B enterprise sales, which helps keep CAC lower and revenues more predictable.

Layer 3 is already seeing wins. MosaicML's acquisition by Databricks for a whopping $1.3B is a major example. LangChain and Pinecone both seem poised to create huge shareholder value too. Other examples include McKinsey and Accenture buying Iguazio and Flutura respectively, both with the intent to build better AI applications for consulting clients seeking AI-specific interventions.

This is part 3 of a 4-part series on long-term trends in Generative AI. Stay tuned for the next post! If you are building in this space or want to bounce ideas around, we are happy to chat.

Thanks to Sibesh from Maya and Rohit and Harish from Segmind for helping proofread this!

Connect with the authors:

Kushal Bhagia (LinkedIn, Twitter, kb[at]allincapital.vc)

Sparsh Sehgal (LinkedIn, Twitter, sparsh[at]allincapital.vc)
