AI2 Incubator Technology Newsletter - February 2022
Welcome to the February 2022 edition of the AI2 Incubator newsletter. In this edition, we cover updates on incubator companies and recent news on large self-supervised models from Microsoft, Google, Meta, AI2, and OpenAI.
What's new with AI2 incubator companies
We started the new year with a trio of companies graduating from the AI2 Incubator, raising a total of $11M in seed funding. All three were covered by GeekWire.
Large models: new and noteworthy
GPT-3's release in the summer of 2020 was a watershed moment for the AI/ML community. One of the key findings is that size matters, a lot. Training large models on lots of data requires huge capex investment. Last month, Meta announced the ongoing construction of its AI supercomputer, the Research SuperCluster (RSC). It currently comprises 760 Nvidia DGX A100 systems, growing to a total of 2,000 by the end of the year. Since each DGX A100 houses eight A100 GPUs, that's 16,000 A100 GPUs!
We are still in the early innings of developing large self-supervised models (LSMs). Research on LSMs falls into two broad categories: 1) methods that achieve the same or better performance with smaller models and/or lower training/inference cost, and 2) methods that get LSMs to do new things: fine-tune on custom data, follow instructions, learn from audio/images, search the Web for information, generate code, etc.
Do more with less
Research in this category has the flavor of achieving better performance than GPT-3 using a smaller model.
New capabilities
OpenAI continues to be THE pioneer in this category. We can now fine-tune GPT-3 with our own data, instruct it with a technique called reinforcement learning from human feedback (RLHF), command it to surf the Web to find more accurate answers, and even train it to solve math problems from the IMO (International Mathematical Olympiad). If all you want are embeddings, OpenAI has you covered as well.
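As a concrete illustration of what you do with embeddings once you have them, here is a minimal sketch of ranking documents against a query by cosine similarity. The vectors below are mock stand-ins for what an embeddings API would return (the real vectors have hundreds or thousands of dimensions); no live API call is made.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of the
    # vector magnitudes. 1.0 means the vectors point the same way.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Mock embeddings standing in for vectors returned by an embeddings API.
query = [0.1, 0.9, 0.2]
docs = {
    "doc_a": [0.1, 0.8, 0.3],  # semantically close to the query
    "doc_b": [0.9, 0.1, 0.0],  # semantically distant
}

# Rank documents by similarity to the query vector, most similar first.
ranked = sorted(docs, key=lambda d: cosine_similarity(query, docs[d]), reverse=True)
print(ranked[0])  # doc_a
```

This is the core of semantic search: embed the query and the documents with the same model, then retrieve by vector similarity rather than keyword overlap.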
Others
There were several 2021 retrospectives that we found interesting: