Course: Introduction to Large Language Models


What are parameters

- [Instructor] When talking about large language models, we almost always reference the size of the model, or the parameter count. GPT-3 is a 175 billion parameter model. Meta's largest Llama 2 model has 70 billion parameters. But what do we mean by parameters? Now, the parameters in a neural network are the variables that the model learns during the training process. They get adjusted during training because, for a given input, you want to minimize the difference between the predicted output and the actual output. Let me give you an example. This is a visual representation of a neural network, and you can see that the architecture has layers. A node is represented by a circle in this graphic. It receives input from other nodes, processes it, and then passes the output on to other nodes. So nodes represent neurons, and a collection of nodes or neurons is known as a neural network. The input…
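To make the idea concrete, here is a minimal Python sketch (not part of the course) of how a parameter count follows from a network's layer sizes: every connection between nodes carries a learnable weight, and every node in a layer has a learnable bias. The layer sizes used below are hypothetical and chosen only for illustration.

```python
# Count the learnable parameters (weights + biases) of a small
# fully connected feed-forward network.
# Hypothetical layer sizes: 4 inputs, two hidden layers of 8 nodes, 2 outputs.
layer_sizes = [4, 8, 8, 2]

total_params = 0
for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
    weights = n_in * n_out   # one weight per connection between adjacent layers
    biases = n_out           # one bias per node in the receiving layer
    total_params += weights + biases

print(total_params)  # (4*8 + 8) + (8*8 + 8) + (8*2 + 2) = 130 parameters
```

The same bookkeeping scales up to models like GPT-3: the 175 billion figure is simply the total number of these learnable weights and biases across all of its layers.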
