Don't Invest in AI Assets: How To Benefit From AI
ChatGPT's meteoric rise has shown that AI tools are ready for mass adoption, but how should you position your business to take advantage of them?
Don't Try To Own The AI Models
It can be tempting to try to own AI systems by building models in-house and keeping the code and parameters secret.
But doing so is extremely expensive. OpenAI, Meta, Google, AWS and a host of smaller start-ups are each spending hundreds of millions of dollars trying to build the best models in existence. These companies have the resources to hire the very best talent in the world, and access to vast datasets to train their models on. Competing against them will be costly.
On top of this, many organisations, including Meta and leading universities, often open source their models, typically making them free to use. Imagine spending hundreds of millions on a machine learning model, only to find that Meta has given away a better one for free on its website!
Owning a secret AI is a rapidly depreciating asset that can become worthless overnight.
Don't try to own the compute
Nvidia has been one of the biggest winners from the rise of AI.
Their GPU chips power huge data centres of AI systems, training models and crunching the numbers to create better, more powerful AIs. The chips are in such high demand that they are now very expensive and hard to come by.
Yet owning the compute is also a trap.
Unless you can design your own chips to rival Nvidia's, you will constantly need to buy newer and better ones. And if your business model is to rent those chips out to others, you'll be competing with big players like AWS, who already run extensive data centres and have figured out how to make it profitable.
But the biggest threat to owning the compute is the open source community.
As compute becomes scarcer and more expensive, it encourages open source contributors to invent models that don't need large compute clusters. Researchers with limited budgets are already experimenting with ways to train models that no longer need huge data centres.
If you invest in the compute you may end up with stranded assets: hardware that performs poorly compared to newer chips and may not even be needed to run the next generation of AI systems.
Own the Small Data and the Problem Instead
For all of their amazing capabilities, AI systems like ChatGPT can be pretty useless on their own.
A more profitable route is to apply AI to a particular niche. In this approach your business maintains a small dataset used to fine-tune an AI system, and that AI is applied to your specific problem domain.
A good example of this model is Canva. Canva is an online design tool that lets its users make images and videos for marketing. Canva has embedded an AI that lets users create images of anything they want just by describing it in text. This makes the Canva product even more useful, as users won't always find the perfect image in the existing catalogue.
Following this model, companies can leverage their existing market channels and expertise to supercharge their offerings with AI.
By aiming to control the market and their small datasets, these companies can avoid the huge costs of AI but still capture its benefits.
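The "small data" pattern above can be sketched in code. The example below is purely illustrative and uses invented names: a small proprietary dataset of niche examples steers a rented, general-purpose model (the model call is stubbed out so the sketch runs on its own; a real deployment would call a hosted API instead).

```python
# Hypothetical sketch of the "own the small data" pattern:
# a tiny proprietary dataset steers a generic, rented model.

# The business's small, niche dataset (the asset worth owning).
SMALL_DATA = [
    ("invoice overdue 30 days", "send_reminder"),
    ("invoice paid in full", "close_account"),
    ("invoice disputed by client", "escalate_to_manager"),
]

def build_prompt(query: str) -> str:
    """Embed the niche examples into a prompt for a general-purpose model."""
    examples = "\n".join(f"Input: {x}\nAction: {y}" for x, y in SMALL_DATA)
    return f"{examples}\nInput: {query}\nAction:"

def call_model(prompt: str) -> str:
    """Stand-in for a rented model endpoint; faked so the sketch runs offline."""
    query_line = prompt.splitlines()[-2]  # the user's "Input: ..." line
    # A real system would send `prompt` to the provider's API here.
    return "send_reminder" if "overdue" in query_line else "escalate_to_manager"

if __name__ == "__main__":
    prompt = build_prompt("invoice overdue 45 days")
    print(call_model(prompt))
```

The point of the sketch: the generic model is interchangeable, but the niche dataset and the problem framing stay with the business.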
Data Without Execution Is Worthless
Some business owners feel they should hoard their data because they think it's valuable.
If your company isn't a technology company (for example a mining or consulting company), don't try to hold onto your data. Instead, use it as a bargaining chip to partner with a technology company that can help you.
Having a big pile of data is like having a big pile of dirt: totally useless unless you know what to do with it.
-- This article was originally published in Business News WA