A look at the OpenAI Developer Day announcements

OpenAI shows no sign of slowing down. At its recent developer conference, the company announced that ChatGPT has reached over 100 million weekly active users, and that over 2 million developers are currently building applications with the company's API, a number that is sure to grow given the announcements made at the conference.

There were announcements of significant improvements to the OpenAI models, along with an impressive raft of other updates.

The new GPT-4 Turbo was a significant announcement. It provides a context window of 128,000 tokens, four times the 32,000-token window of the largest GPT-4 variant. What does this mean? When writing substantially longer content such as essays, stories or research papers, ChatGPT will be able to retain focus and logical flow, and will be better able to recognize and understand references to earlier parts of the conversation.

As an example, the average novel contains around 90,000 to 120,000 words. GPT models use subword tokenization that can break a word into multiple tokens, and OpenAI's usual rule of thumb for English text is that 1 token is roughly 4 characters, or about 0.75 words. By that estimate, 128,000 tokens works out to roughly 96,000 words. So, GPT-4 Turbo's context window has capacity for close to a full-length novel's worth of information.
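The arithmetic above can be sketched as a quick back-of-envelope check, using the common (approximate) figure of 0.75 English words per token:

```python
# Rough conversion from a token budget to an English word count,
# using the usual ~0.75 words-per-token rule of thumb. Real token
# counts vary with the text; tiktoken gives exact numbers per model.
WORDS_PER_TOKEN = 0.75

def tokens_to_words(tokens: int) -> int:
    """Estimate how many English words fit in a given context window."""
    return int(tokens * WORDS_PER_TOKEN)

print(tokens_to_words(128_000))  # GPT-4 Turbo: ~96,000 words
print(tokens_to_words(32_000))   # GPT-4-32k, for comparison: ~24,000 words
```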

GPT-4 Turbo also has new modalities. As well as supporting text analysis, it can now interact with both text and images, with image creation handled by DALL-E (which has been upgraded to DALL-E 3). It also now has the ability to reply to text input with human-like voices (I can't help but think of HAL from 2001: A Space Odyssey as I write this!).

Whisper, OpenAI's open-source speech recognition model, was also announced as being updated to Whisper large-v3. Whisper v3 is built on a state-of-the-art Transformer sequence-to-sequence architecture, and the update features improved performance across languages, with a promise to support Whisper v3 in the OpenAI API in the near future.

Turbo also gives the game away with regard to speed. At the keynote, Sam Altman said GPT-4 Turbo was faster, much faster, which is good news because I always found GPT-4 somewhat sluggish compared to GPT-3.5.

Additionally, it was announced that the model's knowledge cutoff has been moved forward from September 2021 to April 2023. This roughly nineteen-month gain helps GPT-4 Turbo provide more relevant, accurate and current responses, as it can better understand context and respond more accurately to prompts that involve recent people, places and events. I also wonder whether the continuing advances in AI/ML since 2021 mean the training process itself was more robust and optimized, resulting in a more powerful model.

Probably the best news for companies that use it is pricing. GPT-4 Turbo's input tokens are priced at a third of GPT-4's ($0.01 versus $0.03 per 1,000 tokens), with output tokens at half the price. So not only has performance increased, but pricing has decreased.

OpenAI also introduced something they call Copyright Shield to protect customers from copyright infringement claims. The program promises to cover the legal costs if a business is sued for intellectual property violations over content generated by OpenAI's publicly available developer platform or ChatGPT Enterprise product. OpenAI says it will pay the legal fees for defending customers who use these tools as intended and are targeted with copyright lawsuits over the AI-produced material.

Microsoft was also front and center at the conference. Satya Nadella joined Sam Altman on stage during the keynote, announcing that Microsoft is committed to building the best systems and infrastructure for developers using OpenAI APIs. Nadella went on to say that OpenAI had led Microsoft to think differently about data infrastructure, particularly for the company's Azure systems. Both OpenAI and Microsoft seemed very pleased with their partnership, which looks to have a very solid future, with Microsoft empowering OpenAI and taking advantage of the technology for customers within its core Azure platform.

There were so many announcements that this post could run longer than you would be willing to read it, so let me cut to the chase with a couple of the announcements I personally found most interesting.

Function calling in OpenAI's GPT-4 API now allows multiple functions to be invoked in a single turn, with the new JSON mode guaranteeing valid JSON output, and no added latency. This is a big deal, particularly for enterprise use cases, as it makes it far easier to integrate with enterprise software applications, and I can see it being very useful for agent chaining (the new Assistants API also makes it easier to build assistive agents and integrate them into apps).
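To make the parallel function calling flow concrete, here is a minimal sketch of the client-side plumbing: a local function, the JSON schema advertised to the model, and a dispatcher that runs every call requested in one model turn. The `get_stock_price` function and its stubbed prices are hypothetical, and the tool calls are shown as plain dicts mirroring the API's JSON shape (a real response would arrive as SDK objects), so the sketch runs without an API key:

```python
import json

# Hypothetical local function the model may ask us to invoke;
# a real app would hit a market-data API here.
def get_stock_price(symbol: str) -> dict:
    prices = {"MSFT": 370.27, "AAPL": 189.71}
    return {"symbol": symbol, "price": prices.get(symbol)}

# Schema advertised to the model via the `tools` request parameter.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_stock_price",
        "description": "Get the latest price for a ticker symbol",
        "parameters": {
            "type": "object",
            "properties": {"symbol": {"type": "string"}},
            "required": ["symbol"],
        },
    },
}]

def dispatch_tool_calls(tool_calls) -> list[dict]:
    """Run every function the model requested in one turn and collect
    the results as `tool` role messages to send back to the model."""
    results = []
    for call in tool_calls:
        args = json.loads(call["function"]["arguments"])
        output = get_stock_price(**args)
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(output),
        })
    return results

# With parallel function calling, a single model turn can request
# several calls at once, e.g.:
model_turn = [
    {"id": "call_1", "function": {"name": "get_stock_price",
                                  "arguments": '{"symbol": "MSFT"}'}},
    {"id": "call_2", "function": {"name": "get_stock_price",
                                  "arguments": '{"symbol": "AAPL"}'}},
]
replies = dispatch_tool_calls(model_turn)
print(len(replies))  # one tool message per requested call
```

In a real integration, `replies` would be appended to the conversation and sent back in the next `chat.completions.create` request so the model can compose its final answer.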

GPT-4 can also now take advantage of the code interpreter, something that was previously only available to ChatGPT Plus users. This is a tool that can write and run Python code, potentially bringing data analysis into the realm of everyday users rather than just data scientists.
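For API users, the code interpreter is enabled as a tool when creating an assistant. The sketch below only builds the request parameters (names follow the v1 Python SDK's `client.beta.assistants.create`; the actual call is commented out so this runs without an API key, and the model name and instructions are illustrative):

```python
# Hypothetical parameters for an assistant with the code interpreter
# tool enabled; the model can then write and execute Python in a
# sandbox to answer data-analysis questions about uploaded files.
assistant_params = {
    "model": "gpt-4-1106-preview",
    "name": "Data analyst",
    "instructions": "Analyse uploaded CSV files and explain the results.",
    "tools": [{"type": "code_interpreter"}],
}
# client.beta.assistants.create(**assistant_params)  # requires an API key
print(assistant_params["tools"][0]["type"])
```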

Previously I have talked about Retrieval-Augmented Generation, or the RAG pattern, which can augment LLM knowledge without the need to fine-tune or retrain. Now, Retrieval in the API enables parsing of long-form documents and extracting information from them, effectively making it easier to use OpenAI with private data. This should make it a lot easier for enterprises to integrate external document/knowledge stores with OpenAI, but note that there will be some lock-in that solutions like LangChain and LlamaIndex avoid, as they work not only with OpenAI but also with other large language models.
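The client-side plumbing that hosted Retrieval replaces can be sketched in a few lines: score documents against the query, then prepend the best matches to the prompt. The keyword scoring and toy document store here are deliberately naive stand-ins (production RAG typically uses embeddings and a vector store):

```python
# Minimal, self-contained sketch of the RAG pattern: retrieve the
# most relevant documents, then stuff them into the prompt as context.
def score(query: str, doc: str) -> int:
    """Naive relevance: count query terms appearing in the document."""
    terms = set(query.lower().split())
    return sum(1 for word in doc.lower().split() if word in terms)

def build_prompt(query: str, docs: list[str], top_k: int = 2) -> str:
    """Rank docs by score and prepend the top_k as grounding context."""
    ranked = sorted(docs, key=lambda d: score(query, d), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Our refund policy allows returns within 30 days.",
    "Shipping takes five business days.",
    "Refund requests must include the original receipt.",
]
prompt = build_prompt("what is the refund policy", docs)
print(prompt)
```

With hosted Retrieval, OpenAI handles the chunking, indexing and ranking server-side; this is exactly the layer where frameworks like LangChain and LlamaIndex keep the choice of model open.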

Threads is another interesting new feature, as it enables access to conversation histories. It allows developers to hand off thread state management to OpenAI and work around context window constraints: with the Assistants API, you simply add each new message to an existing thread.
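A sketch of the Threads flow, using the v1 Python SDK method names (`client.beta.threads.messages.create`, `client.beta.threads.runs.create`); the API-calling function is defined but not executed here, so the example runs without an API key:

```python
def ask_on_thread(client, assistant_id: str, thread_id: str, text: str):
    """Append a user message to an existing thread and start a run.
    Prior messages on the thread are stored server-side by OpenAI,
    so they are not resent with each request."""
    client.beta.threads.messages.create(
        thread_id=thread_id, role="user", content=text
    )
    return client.beta.threads.runs.create(
        thread_id=thread_id, assistant_id=assistant_id
    )

# For contrast, this is the state you manage yourself with Chat
# Completions: a growing message list resent on every request.
history = []
for turn in ["Summarise Q3 revenue", "Now compare it to Q2"]:
    history.append({"role": "user", "content": turn})
print(len(history))
```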

Also announced were GPTs, essentially tailored versions of ChatGPT, which purport to allow users to easily build customized versions of ChatGPT for specific purposes and then publish them for others to use (possibly in the GPT Store, which was also newly announced, and where developers can list and share their GPT creations).

The rapid pace of AI progress is staggering, presenting both opportunities and challenges for enterprises. While digesting each new development is crucial, the key realization is that AI is here to stay, and although adoption requires care and consideration, the possibilities for enterprises are immense.
