ChatGPT, Bard and Microsoft's vs Google's Product Strategy...
Open Games - Caspar Thomas

It's all about attention

The current attention (pun intended) on ChatGPT in the media and among the wider public is fascinating. There is a genuine buzz around the uses of Large Language Models and their expected impact on our lives in the near future. Equally fascinating is the wider market impact and some of the negative press concerning Google's initial release of the Bard model.

See here for an initial media comment following Bard's release.

It is important to put all this in context - this is an initial release in the public domain. Technology moves fast, often out of the public eye, and the media is always interested in sensationalist stories. So it is easy to forget that GPT-3 was actually released around two years ago.

GPT-3's initial performance was also error-strewn and unimpressive in certain cases, but there was not quite so much general attention back then... The big stories at the time were focused on the global impact of the Covid pandemic and whether blockchain-based cryptocurrencies would solve currency devaluation problems (spoiler alert...)

Timing is important

So, the important thing here is timing. Microsoft's OpenAI investment is a strategic move, but more from a business and branding point of view than one of technological innovation. Microsoft invested in a well-trained, well-performing model already proven in the public sphere. This puts pressure on Google to respond by releasing Bard under a level of industry and public scrutiny that GPT-3 did not have to contend with back in 2020.

Google has the means to withstand the initial backlash and reputational damage - over time, performance will improve and the technology gap can be closed.

However, some impact on brand perception may linger. The public sphere is not generally focused on the intricate details of how these models work, so initial results and image carry the day.

Product-based strategy pays dividends

All this is testimony to Microsoft's diversified and rejuvenated product-based strategy. Microsoft's dominance in the 90s was underpinned by a single-minded focus on the Windows OS, driving a quasi-monopolistic revenue stream covering business software, consumer products, development tools and gaming. Over-reliance on the 'Windows-centric' strategy that drove Microsoft's dominance became the company's greatest challenge when adapting to the changes brought by mobile and cloud computing, which eroded the desktop's dominance from different ends of the spectrum.

The dominance of Windows on the desktop, and Microsoft's reliance on it, was an impediment to the company's ability to innovate and adapt to newer trends - but subsequent diversification of the revenue stream into cloud and other areas has delivered a much broader ecosystem that can now be leveraged in a product-driven strategy across wider business and consumer markets.

In this context the OpenAI / ChatGPT investment is simply another piece of an overall vision - and Microsoft have a number of ways to use their latest tool. A Teams-enabled roll-out is underway; Bing search will be GPT-'enhanced' (whether this is actually good for search is debatable, but it will play well with consumers regardless and drive brand recognition); and ChatGPT will be integrated into Word and other Office products to provide on-demand content generation for everyday tasks on the desktop and in the cloud.

'Business' and 'enterprise' applications for ChatGPT are not so immediately obvious given data privacy and governance concerns, but Microsoft have some interesting options here too. Code generation for DevOps slots nicely into GitHub (Microsoft-owned) and associated code repositories, and is already a proven use case for GPT-based models. The business data and focused audience provided by LinkedIn (also Microsoft-owned) are another area where LLM-generated and curated content would be expected to deliver benefits.

In contrast, Google - for all its technical innovation - does not seem to have such a cohesive consumer product strategy.

See here for "Attention Is All You Need", the original paper on transformer-based language models from teams mainly at Google Research and Google Brain, and one of the initial milestones in the development of today's Large Language Models.
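
As an aside for readers curious about that title (and the pun running through this piece): the paper's core building block is 'scaled dot-product attention', which it defines as

Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V

In plain terms, each token in a sequence weighs the representations of every other token when building its own - and it is this mechanism, rather than the recurrence used in earlier language models, that helped make training at today's scale practical.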

A sizable majority of Google's income still comes from search and associated data. Despite forays into genuinely innovative 'consumer' products, uptake has sometimes been underwhelming - Google Glass, anyone?

Unified strategy, diverse initiatives

In many ways the challenge mirrors Microsoft's struggles with over-reliance on the Windows desktop market. Google drives plenty of innovation across widely adopted open-source technologies such as Kubernetes, and the business has made inroads into the smartphone market, online email, office products and geo-mapping - further supported by acquisitions such as Waze - but the overall strategy remains very monolithic.

From the outside at least, the focus seems to revolve around wider data collection and trend monetization, driving the core advertising and search business rather than a commitment to diversification into the consumer markets themselves.

There are a number of lessons here. One is the value of a diversified portfolio, which allows flexibility in product strategy. Over-reliance on a single revenue stream, no matter how profitable, is always a risk and a potential impediment to the agility and flexibility needed to take advantage of new trends and technologies - even when you are actually building those technologies yourself (!)

Another point is that it doesn't matter if you are initially behind the game from a technology standpoint, as long as you have a clear end goal, the ability to execute towards it, and an openness to all approaches, including external partnerships or acquisitions. Of course, having the resources to sustain the pursuit is a big factor, and a timely investment on the scale of Microsoft's stake in OpenAI is not an option available to everyone - but nor is Microsoft the only player with deep pockets. Managing market perception is definitely worth the investment.

Be adaptable, but keep your focus

A further lesson from Google's timing of the initial release of Bard is to carefully consider reactive responses to competitive pressure and avoid letting external factors dictate your release timelines. Conversely, it is important to appreciate the evolving expectations of the wider market and how these shape perceptions. The finer details of technology, and especially its limitations, are often not well understood.

What is clear is that LLMs and associated generative AI tools are here to stay and will make a significant impact, not only on how we work but also on our expectations and appreciation of the value of content in general. It is inevitable that ever-improving models will find their way into the ecosystems of all vendors - whether through internal development, investment and partnership (as in the case of Microsoft and OpenAI), acquisition, or third-party white labeling - all of which will further justify investment in the space.

All to play for

The current state of play is still highly volatile. LLMs will be but one of a number of technological breakthroughs in the near future, not only in the field of AI/ML but also in areas like quantum computing, which is fast becoming viable for 'production deployment'. Many of these developments have been long in the making and the march of progress is relentless. New opportunities arise constantly, and this reinforces the need for diversity in product lines and the nimbleness to adapt to a fast-changing landscape.

Significant challenges remain within the AI/ML space, despite the very visible success of the latest LLM developments. These models, for all their effectiveness, are very large (the clue is in the name) and hence very costly, both in terms of compute and environmental footprint. This will become critical given trends and pressures around sustainable solutions. Beyond the impressive results and the sophistication of the architecture, we are still dealing with what is, at its core, a very 'heavy-handed' approach built around huge volumes of data and model parameters that now run into the trillions. The limitations of being confined to huge server clusters in the cloud, and the compute cycles required to train these models, suggest interesting opportunities for novel approaches now that the wider public has acquired a taste for what these tools can provide.
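
A rough back-of-the-envelope illustration of why that scale matters (the figures are illustrative assumptions, not the specification of any particular model): storing the weights of a hypothetical one-trillion-parameter model at 16 bits (2 bytes) per parameter requires

1,000,000,000,000 parameters x 2 bytes/parameter = about 2 TB (weights alone)

That is far beyond the memory of any single accelerator, before even counting optimiser state, activations and the repeated training passes over huge datasets - which is why these models live in large server clusters rather than on a laptop or a phone.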

The game is just beginning - keep paying attention.
