GenAI Weekly — Edition 6

Your Weekly Dose of Gen AI: News, Trends, and Breakthroughs

Stay at the forefront of the Gen AI revolution with Gen AI Weekly! Each week, we curate the most noteworthy news, insights, and breakthroughs in the field, equipping you with the knowledge you need to stay ahead of the curve.

Click subscribe to be notified of future editions


Unstract just released as open source!

One of the most common uses of LLMs is to go beyond what traditional RPA or IDP can do when it comes to structuring unstructured documents. However, there are plenty of challenges in getting this done right, from extracting text from PDFs, scanned images, and other formats, to prompt engineering, evaluation, and integration with existing systems.

This very specific use case is where Unstract can help teams move fast with LLMs. By doing the heavy lifting in this fast-changing ecosystem, it lets engineers concentrate on implementing core business workflow automation. GitHub repo: https://github.com/Zipstack/unstract

Here's what's special about Unstract:

- Unstract is available under the AGPL, which is friendly for personal or commercial use.

- LLMWhisperer: Results from LLMs are only as good as the input they receive. LLMWhisperer is a cloud service that, given PDFs (native text or scanned), returns text data in a form LLMs are best able to "understand".

- Unstract supports a variety of providers for LLMs, Vector Databases, Embeddings, Cloud File Storage systems, and databases/data warehouses.

- Unstract features Prompt Studio, a purpose-built, no-code environment that makes it easy to develop, run, and debug prompts while referring to sample documents side by side.

- Unstract supports launching APIs and ETL pipelines that take in unstructured documents and produce JSON data or sync data to destinations like Snowflake, BigQuery, Redshift, and PostgreSQL (see the sketch after this list).
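
To make the API deployment idea concrete, here is a minimal sketch of what calling such a deployed endpoint could look like from Python. The URL, auth header, multipart field name, and response shape are hypothetical placeholders for illustration only, not Unstract's actual API contract; refer to the documentation linked below for the real request format.

    # Hypothetical sketch: POST a PDF to a deployed document-structuring API
    # and read back structured JSON. Endpoint, auth header, and field names
    # are placeholders -- consult the Unstract docs for the actual contract.
    import requests

    API_URL = "https://unstract.example.com/api/v1/credit-card-parser"  # placeholder
    API_KEY = "your-api-key"  # placeholder

    with open("statement.pdf", "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"files": ("statement.pdf", f, "application/pdf")},
            timeout=120,
        )

    response.raise_for_status()
    print(response.json())  # structured fields extracted from the statement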

Get started now:

See our documentation for a quick-start walkthrough of a project that structures standard credit card statements from a couple of different banks.

Or watch Unstract in action with this 3-minute video: turning unstructured data into structured JSON.

Enterprise-only features: Unstract has a couple of enterprise/cloud/managed features that are not part of the open-source offering. LLMEval uses more than one LLM to arrive at a consensus on extracted fields; if there is no consensus, the field is set to null (a null value is always better than a wrong value), and a human can then review the extraction results.
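
The consensus idea itself is easy to sketch in isolation. The snippet below is a simplified illustration of the technique as described (agreement across multiple extractors, null on disagreement), not Unstract's actual LLMEval implementation; the extractor callables are hypothetical stand-ins for whatever model clients you use.

    # Simplified illustration of multi-LLM consensus on one extracted field.
    # Each extractor is a hypothetical callable wrapping a different LLM;
    # Unstract's real LLMEval logic may differ in the details.
    from collections import Counter
    from typing import Callable, Optional

    def consensus_extract(
        extractors: list[Callable[[str, str], str]],
        document_text: str,
        field: str,
        min_agreement: int = 2,
    ) -> Optional[str]:
        """Return the field value only if enough extractors agree; otherwise None."""
        answers = [extract(document_text, field) for extract in extractors]
        normalized = [a.strip().lower() for a in answers if a]
        if not normalized:
            return None
        value, count = Counter(normalized).most_common(1)[0]
        # A null value is always better than a wrong value.
        return value if count >= min_agreement else None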

There are also a couple of advanced Prompt Studio features (Single Pass and Summary-based Extraction) that can cut token usage (and thus costs) by up to 6x. All other features are available in our open-source version.


Databricks introduces the DBRX LLM

From the Databricks blog:

Today, we are excited to introduce DBRX, an open, general-purpose LLM created by Databricks. Across a range of standard benchmarks, DBRX sets a new state-of-the-art for established open LLMs. Moreover, it provides the open community and enterprises building their own LLMs with capabilities that were previously limited to closed model APIs; according to our measurements, it surpasses GPT-3.5, and it is competitive with Gemini 1.0 Pro. It is an especially capable code model, surpassing specialized models like CodeLLaMA-70B on programming, in addition to its strength as a general-purpose LLM.

What is DBRX?

DBRX is a transformer-based decoder-only large language model (LLM) that was trained using next-token prediction. It uses a fine-grained mixture-of-experts (MoE) architecture with 132B total parameters of which 36B parameters are active on any input. It was pre-trained on 12T tokens of text and code data. Compared to other open MoE models like Mixtral and Grok-1, DBRX is fine-grained, meaning it uses a larger number of smaller experts. DBRX has 16 experts and chooses 4, while Mixtral and Grok-1 have 8 experts and choose 2. This provides 65x more possible combinations of experts and we found that this improves model quality. DBRX uses rotary position encodings (RoPE), gated linear units (GLU), and grouped query attention (GQA). It uses the GPT-4 tokenizer as provided in the tiktoken repository. We made these choices based on exhaustive evaluation and scaling experiments.

How it was built:

DBRX was trained on 3072 NVIDIA H100s connected by 3.2Tbps Infiniband. The main process of building DBRX - including pretraining, post-training, evaluation, red-teaming, and refining - took place over the course of three months.
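
A quick editorial check of the "65x" figure quoted above: it is simply the ratio of possible expert subsets, C(16,4) = 1820 versus C(8,2) = 28.

    # Verifying the "65x more possible combinations" claim:
    # DBRX routes each token to 4 of 16 experts, Mixtral/Grok-1 to 2 of 8.
    import math

    dbrx = math.comb(16, 4)     # 1820 possible expert subsets
    mixtral = math.comb(8, 2)   # 28 possible expert subsets
    print(dbrx, mixtral, dbrx / mixtral)  # 1820 28 65.0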

“The king is dead”—Claude 3 surpasses GPT-4 on Chatbot Arena for the first time

Benj Edwards writing for ArsTechnica:

On Tuesday, Anthropic's Claude 3 Opus large language model (LLM) surpassed OpenAI's GPT-4 (which powers ChatGPT) for the first time on Chatbot Arena, a popular crowdsourced leaderboard used by AI researchers to gauge the relative capabilities of AI language models. "The king is dead," tweeted software developer Nick Dobos in a post comparing GPT-4 Turbo and Claude 3 Opus that has been making the rounds on social media. "RIP GPT-4."

While we all wanted both performance and diversity, that combination was always elusive. Until now.


Apple could bring Google Gemini to the iPhone for AI

Ryan Morrison writing for Tom’s Guide:

Apple is reportedly in talks with Google to bring its Gemini artificial intelligence models to the iPhone and other devices. This would likely build on the existing multi-billion dollar search deal, with Google’s AI running alongside Apple’s own models.

Keep in mind that this is still largely speculation, and things could head in any direction.


AI21 Labs Unveils Jamba: The First Production-Grade Mamba-Based AI Model

Chris McKay writing for Maginative:

AI21 Labs has just released Jamba, the world's first production-grade AI model based on the innovative Mamba architecture. Most models today (like GPT, Gemini, and Llama) are based on the Transformer architecture. Jamba combines the strengths of both the Mamba Structured State Space model (SSM) and the traditional Transformer architecture, delivering impressive performance and efficiency gains.

OpenAI Heading To Hollywood To Pitch Revolutionary “Sora”

Bruce Haring writing for Deadline:

“Being told that it can do all of these things is one thing, but actually seeing the capabilities, it was mind-blowing,” he said in an earlier interview. While the businessman in him sees the opportunity, he also expressed worries about the people who work in the business. “There’s got to be some sort of regulations in order to protect us. If not, I just don’t see how we survive.”

This was the logical next step, of course.


Strategy In The Era Of AI

Zoe Scaman writing for her newsletter:

I aim to master this field, not be outpaced by it.
Through my exploration of AI, its influence on my work and my approach has been profound; it has not just streamlined my efficiency but expanded my thinking, evolving my (ever-changing) methods and endowing me with superpowers of sorts; allowing me to delve deeper, forge unexpected links, bring my ideas to life, and much more. Put more succinctly, it’s the most powerful catalyst for creativity I've encountered, and that's no overstatement. So, I decided to compile a practical mini guide to share how I'm navigating and leveraging AI, in case it might be helpful to others.

A lot of productivity tips, all in one place.


How deep is Nvidia’s Software Moat?

From the Digits to Dollars blog:

Put simply, the defensibility of Nvidia’s position right now rests on the inherent inertia of software ecosystems. Companies invest in software – writing the code, testing it, optimizing it, educating their workforce on its use, etc. – and once that investment is made they are going to be deeply reluctant to switch. We saw this with the Arm ecosystem’s attempt to move into the data center over the last ten years. Even as Arm-based chips started to demonstrate real power and performance advantages over x86, it still took years for the software companies and their customers to move, a transition that is still underway. Nvidia appears to be in the early days of building up exactly that form of software advantage. And if they can achieve it across a wide swathe of enterprises, they are likely to hold onto it for many years. This more than anything else is what positions Nvidia best for the future.

Betting the future on AI was one thing, but for a hardware company to do it with software was something else. But how deep is that moat, really?


Can Demis Hassabis Save Google?

Alex Kantrowitz writing for Big Technology:

Go is a board game with more playable combinations than atoms in the universe, an “Everest” of AI as Hassabis calls it. In March 2016, DeepMind’s AlphaGo — a program that combined reinforcement learning and deep learning (another AI method) — beat Go grandmaster Lee Sedol, four games to one, over seven days. It was a watershed moment for AI, showing that with enough computing power and the right algorithm, an AI could learn, get a feel for its environment, plan, reason, and even be creative. To those involved, the win made achieving artificial general intelligence — AI on par with human intelligence — feel tangible for the first time.

If you ignore the clickbaity title, this is a good interview with someone very smart and influential.


For the extra curious
