Course: Introduction to Large Language Models
LLM trends
- [Instructor] Over this course, we've looked at scaling laws and at the lessons from the Chinchilla models. Let's bring this all together as we look at the current trends for large language models. When the training of large language models first kicked off, most of the focus was on improving the models, and the scaling laws were the guiding principle. The scaling laws suggested that you would get the biggest improvement by increasing the size of the model, which you do by scaling up the model's architecture: the number of layers, the number of attention heads, and so on. But that was only one dimension. As model providers created ever larger models, the associated training cost became an important consideration, so the second dimension became training cost, because being able to train a model effectively within a given budget mattered. And the learnings from the Chinchilla paper earlier…
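The Chinchilla learnings the instructor alludes to boil down to a compute-budget trade-off, which a short sketch can make concrete. This is a minimal Python illustration, not anything from the course itself: it assumes the widely used approximation that training compute is C ≈ 6·N·D FLOPs for a model with N parameters trained on D tokens, together with the Chinchilla paper's finding that roughly 20 training tokens per parameter is compute-optimal. The chinchilla_optimal helper is a hypothetical name for this sketch.

```python
# Sketch of the Chinchilla-style compute-optimal trade-off.
# Assumptions (not from the course): training FLOPs C ~= 6 * N * D,
# and ~20 tokens per parameter is compute-optimal, per the Chinchilla paper.

def chinchilla_optimal(flop_budget: float, tokens_per_param: float = 20.0):
    """Return (params, tokens) that spend flop_budget compute-optimally."""
    # With C = 6 * N * D and D = tokens_per_param * N:
    #   C = 6 * tokens_per_param * N**2
    #   N = sqrt(C / (6 * tokens_per_param))
    params = (flop_budget / (6.0 * tokens_per_param)) ** 0.5
    tokens = tokens_per_param * params
    return params, tokens

if __name__ == "__main__":
    # Roughly Chinchilla's own training budget: ~5.8e23 FLOPs.
    n, d = chinchilla_optimal(5.8e23)
    print(f"~{n / 1e9:.0f}B parameters trained on ~{d / 1e12:.1f}T tokens")
```

Run with Chinchilla's own budget of about 5.8e23 FLOPs, this recovers the paper's headline configuration of a ~70B-parameter model trained on ~1.4T tokens, in contrast to the earlier size-first trend exemplified by GPT-3 (175B parameters on roughly 300B tokens).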
Contents
- BERT (3:16)
- Scaling laws (3:30)
- GPT-3 (7:41)
- Chinchilla (7:54)
- PaLM and PaLM 2 (3:59)
- ChatGPT and GPT-4 (5:47)
- Open LLMs (5:40)
- Comparing LLMs (3:35)
- GitHub Models: Comparing LLMs (2:52)
- Accessing large language models using an API (6:25)
- LLM trends (4:06)