Course: Introduction to Large Language Models
What are context lengths?
- [Instructor] When having a conversation with a large language model, how can you figure out how much of the conversation it remembers? That's what context windows are all about. Now, if you remember, a prompt is the text you input into the model, and it's made up of a number of tokens. The completion is the text the model outputs, which is also made up of tokens. The sum of the tokens in the prompt and the completion is known as the "context window" or "context length". The longer the context length, the more background information the model has for generating a response. For a language model to produce a meaningful and relevant response, it needs to be able to take the entire conversation into consideration. Now, different large language models have different context lengths. ChatGPT currently has a context length of around 4,000 tokens. GPT-4 has 8,000 tokens. GPT-4-32k has 32,000…
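To make that arithmetic concrete, here is a minimal sketch that counts the tokens in a sample prompt and completion and sums them against a 4,096-token window. It uses OpenAI's tiktoken library with the cl100k_base encoding; the course doesn't name a tokenizer, so that choice, the sample strings, and the 4,096 limit are assumptions for illustration, and exact counts vary by tokenizer.

```python
# Minimal sketch: prompt tokens + completion tokens = context window usage.
# Assumes the tiktoken library and the cl100k_base encoding (not specified
# in the course); the 4,096-token limit is an illustrative example.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

prompt = "Explain what a context window is in one sentence."
completion = (
    "A context window is the maximum number of tokens a model can "
    "consider at once, covering both your prompt and its reply."
)

# Encode each string into tokens and count them.
prompt_tokens = len(enc.encode(prompt))
completion_tokens = len(enc.encode(completion))

# Both sides of the exchange count against the same context window.
total = prompt_tokens + completion_tokens

print(f"prompt:     {prompt_tokens} tokens")
print(f"completion: {completion_tokens} tokens")
print(f"context used: {total} of 4096 tokens")
```

In practice, once a conversation's accumulated tokens approach the model's context length, the oldest turns are typically truncated or dropped, which is why the model seems to "forget" the start of a long chat.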