How AI Works

AI For Procurement Series - Part 1

Learn the basics of ChatGPT, LLMs, and prompts. We'll accomplish this by explaining the meaning of common terms you may have heard and how they all fit together.

Artificial Intelligence - This is now a catch-all phrase that can refer to anything from machine learning to human-level intelligence. For our purposes, we will focus on AI as applied to ChatGPT.

ChatGPT - An interface to a large language model (LLM). Released to the public in November 2022 by the company OpenAI, ChatGPT exploded in popularity in the first quarter of 2023. ChatGPT itself is simply an interface, a way of communicating with an LLM. Importantly, ChatGPT itself is not the LLM, nor any sort of 'AI'; it is simply a user interface.

Large Language Model (LLM) - The LLM is the actual model trained to interact with human language at large scale. There are many types of LLMs, but the most well known is GPT from OpenAI. There are also BERT (Google), LLaMA (Meta, formerly Facebook), and many others, but this article will focus on GPT. LLMs are 'trained' on large corpora of data comprising millions and even billions of data points. It is this large body of information that gives LLMs the ability to seemingly reason and converse in natural language. Training at large scale is time-consuming, costly, and, most importantly, a one-time event. This means that an LLM only 'knows' information up to its training date (September 2021 for GPT3.5, and December 2023 for GPT4). This is called the 'knowledge cutoff date'.

GPT - Generative Pre-trained Transformers are LLMs that generate text by predicting the next word in a sequence. The two most popular GPTs are GPT3.5 and GPT4 from OpenAI. Currently, GPT3.5 powers the free version of ChatGPT, and the paid version allows access to both GPT3.5 and GPT4. While GPT3.5 is good for basic tasks like lookup or summarization, complex reasoning requires the much more capable GPT4.
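The next-word mechanism can be sketched with a toy example. Note that the probability table below is entirely invented for illustration; a real GPT computes these probabilities with a neural network over tokens, not a lookup table over words:

```python
# Toy illustration (not a real LLM): a GPT-style model repeatedly picks
# the most likely next word given the text so far. Here the "model" is
# a hand-made probability table instead of a trained network.

NEXT_WORD_PROBS = {
    "the": {"quote": 0.6, "order": 0.4},
    "quote": {"is": 0.7, "was": 0.3},
    "is": {"ready": 0.9, "late": 0.1},
}

def generate(prompt_words, steps):
    """Greedily append the highest-probability next word at each step."""
    words = list(prompt_words)
    for _ in range(steps):
        choices = NEXT_WORD_PROBS.get(words[-1])
        if not choices:
            break  # no continuation known for this word
        words.append(max(choices, key=choices.get))
    return words

print(" ".join(generate(["the"], 3)))  # the quote is ready
```

Real models also sample from these probabilities rather than always taking the top choice, which is why the same prompt can produce different responses.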

Before introducing more terms, let's recap what we've covered so far:


So we have the ChatGPT interface communicating with the GPT3.5 or GPT4 LLMs. But how does it communicate? Let's introduce some new terms.

Prompt - Text passed to the LLM. The prompt is commonly thought of as just the question entered into ChatGPT, but it actually has three main components: System Instruction, Query, and Context. These components are combined to create the prompt.

System Instruction - Instructions to be followed by the LLM. Think of these as global instructions that are included with every prompt. System Instructions are created by the developer of the interface, so ChatGPT has System Instructions that are unseen by the typical user (there are ways to discover the ChatGPT System Instructions). OpenAI also allows paid users to create Custom Instructions, which essentially act as additional System Instructions that are applied to every prompt.

Query - This is the actual input composed by the user. Most queries are simple, but they can become very complex for advanced users. The structure and language used in the query are crucial to the response received from the LLM; the art of crafting queries is commonly called 'prompt engineering', though this term can also have broader meanings.

Context - The information LLMs can answer questions about has two main constraints: time and proprietary content. Time is determined by the knowledge cutoff date discussed above. The more important constraint is proprietary content. For example, there is no way for LLMs to know information that sits behind company firewalls. For procurement purposes, LLMs do not know current price and delivery; they know some component data, but not enough to be consistently useful; and they do not know your IPNs or ERP information. It is possible, however, to provide the LLM with current or proprietary data in the prompt. This is called 'context'.
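As a concrete sketch, context can be as simple as pasting current data into the prompt text ahead of the question. The part numbers, prices, and lead times below are invented for illustration:

```python
# Minimal sketch of providing 'context': proprietary data the model
# could never know on its own (here, made-up distributor quotes) is
# pasted into the prompt alongside the user's question.

def build_prompt_with_context(query, context_rows):
    """Combine a user query with proprietary data rows into one prompt."""
    context = "\n".join(
        f"- {row['ipn']}: ${row['price']:.2f}, lead time {row['lead_weeks']} wk"
        for row in context_rows
    )
    return (
        "Use ONLY the data below to answer.\n"
        f"Current quotes:\n{context}\n\n"
        f"Question: {query}"
    )

prompt = build_prompt_with_context(
    "Which part has the shortest lead time?",
    [
        {"ipn": "CAP-0100", "price": 0.12, "lead_weeks": 6},
        {"ipn": "RES-0220", "price": 0.03, "lead_weeks": 2},
    ],
)
print(prompt)
```

The LLM can then answer from the supplied quotes rather than its (outdated, non-proprietary) training data.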

Once again let's review what we've covered:


ChatGPT combines its System Instructions with the User Query and optionally any Context data to create the Prompt, which it then sends to the LLM. The answer is called the 'Response'.
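The OpenAI chat API represents these pieces as a list of 'messages', which gives a concrete picture of how an interface assembles a prompt. The instruction and query text below are placeholders, not ChatGPT's actual System Instructions:

```python
# Sketch of how an interface like ChatGPT might assemble a prompt for
# the OpenAI chat API: a system message (System Instructions) plus a
# user message (Query, with optional Context prepended).

def assemble_prompt(system_instructions, query, context=None):
    """Return the messages list that would be sent to the LLM."""
    user_content = query if context is None else f"{context}\n\n{query}"
    return [
        {"role": "system", "content": system_instructions},
        {"role": "user", "content": user_content},
    ]

messages = assemble_prompt(
    system_instructions="You are a helpful procurement assistant.",
    query="Summarize the risks of single-sourcing.",
)

# With the real API, this list would then be sent along the lines of:
#   client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
```

Custom Instructions for paid users work the same way: extra text merged into the system side of every prompt.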

Response - The return from an LLM in response to a prompt. This may sound simple, but it is actually one of the most powerful aspects of LLMs, because the type and format of the response is highly customizable. Responses can be plain text, PDFs, images, tables, data visualizations (graphs, charts, etc.), JSON, XML, and more. Plain text can be formatted with bullets and lists, tables can be sorted and columns labeled any way imaginable, data visualizations can be customized any number of ways, and images are a whole deep subject all their own.
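In the simplest case, the response format is controlled just by asking for it in the query. The sketch below requests JSON; the reply shown is a plausible example of such output, not an actual model response:

```python
import json

# Sketch: requesting a machine-readable response format in the query,
# then parsing the reply into structured data.

query = (
    "List two well-known LLMs and their vendors. "
    'Respond only with JSON: [{"model": ..., "vendor": ...}]'
)

# A reply of the kind such a query tends to produce (example, not a
# captured model output):
reply = (
    '[{"model": "GPT-4", "vendor": "OpenAI"},'
    ' {"model": "LLaMA", "vendor": "Meta"}]'
)

data = json.loads(reply)  # structured data, ready for a table or ERP import
print(data[0]["vendor"])
```

Once the response is structured data instead of prose, it can feed spreadsheets, dashboards, or downstream systems directly.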

To summarize: AI works by using an interface (like ChatGPT) to construct a prompt (system instructions + query + context) that is sent to an LLM (GPT3.5 or GPT4), which returns a response (in any of many formats).

To learn more and see how AI works for procurement visit SnapChip.ai.



SnapChip Homepage

