Course: Introduction to Large Language Models
Open LLMs
- [Narrator] We've looked at models that have all come from big tech firms like Google and OpenAI. But what about open community models? Now, although OpenAI made GPT-3 available via an API or the Playground, which is what we've seen so far, no access was given to the actual weights of the model. If you had access to the weights of the model, you could tweak and experiment with it to create new variations that might be better suited for a specific task. For example, you could try to create smaller versions of the model. Meta released OPT, or Open Pre-trained Transformers. This is a collection of decoder-only pre-trained transformers ranging from 125 million to 66 billion parameters, which they shared with anyone. Interested researchers could also apply for access to the 175 billion parameter model. This effectively gave researchers access to a large language model that was very similar to GPT-3. The Meta team also…
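A minimal sketch of what that weight access makes possible, assuming the Hugging Face Transformers library is installed: loading the smallest released OPT checkpoint (the real model id `facebook/opt-125m`) and running local greedy generation. The prompt is purely illustrative.

```python
# Sketch: load Meta's 125M-parameter OPT checkpoint locally and generate text.
# Because the weights are openly released, the model runs on your own machine
# and can be inspected, fine-tuned, or shrunk, unlike API-only models.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"  # smallest of the publicly shared OPT sizes
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Open large language models let researchers"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

The same two `from_pretrained` calls work for the larger OPT sizes; only the checkpoint name and the hardware requirements change.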
Contents
- BERT (3 min 16 sec)
- Scaling laws (3 min 30 sec)
- GPT-3 (7 min 41 sec)
- Chinchilla (7 min 54 sec)
- PaLM and PaLM 2 (3 min 59 sec)
- ChatGPT and GPT-4 (5 min 47 sec)
- Open LLMs (5 min 40 sec)
- Comparing LLMs (3 min 35 sec)
- GitHub Models: Comparing LLMs (2 min 52 sec)
- Accessing large language models using an API (6 min 25 sec)
- LLM trends (4 min 6 sec)