Do-BERT

BERT (Bidirectional Encoder Representations from Transformers) has taken the world of NLP (Natural Language Processing) by storm.

Language text is essentially a sequence of words, so sequential models like RNNs (Recurrent Neural Networks) and LSTMs (Long Short-Term Memory networks) used to be ubiquitous in language modeling (predicting the next word; remember typing an SMS?). But they struggled to remember words far back in the sequence. Then came the paper 'Attention Is All You Need' and its architecture, the Transformer.
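The core idea behind the Transformer is scaled dot-product attention: each token's query is compared against every token's key, and the resulting weights mix the value vectors, so distant words can influence each other directly. A minimal pure-Python sketch (toy vectors, not a real implementation):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: softmax(Q.K^T / sqrt(d)) . V."""
    d = len(keys[0])
    out = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        weights = softmax(scores)
        # Output is the attention-weighted sum of value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# Toy example: 2 tokens with 2-dim vectors; each query matches one key.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
print(attention(Q, K, V))
```

Each output row is a convex combination of the value rows, so the first token's output leans toward the first value vector, the second toward the second.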

BERT is a Transformer-based machine learning technique for NLP pre-training, developed in 2018 by Jacob Devlin and his colleagues at Google.
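BERT's pre-training is built on a masked language modeling objective: roughly 15% of the input tokens are hidden, and the model learns to predict them from both left and right context (hence "bidirectional"). A simplified sketch of just the masking step (the real procedure also replaces some selected tokens with random words or leaves them unchanged; this version only substitutes `[MASK]`):

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Randomly hide ~15% of tokens; return masked sequence and prediction targets.

    Simplified: real BERT replaces 80% of selected positions with [MASK],
    10% with a random token, and keeps 10% unchanged.
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(MASK)
            targets[i] = tok  # the model must recover these from context
        else:
            masked.append(tok)
    return masked, targets

sentence = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(sentence)
print(masked)
print(targets)
```

During pre-training, the model is scored only on the positions recorded in `targets`, which is what forces it to learn contextual representations of words.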

The following sketchnote gives an overview of BERT.

[Sketchnote: an overview of BERT]

References

  • "Transformer: A Novel Neural Network Architecture for Language Understanding" - Google AI Blog
  • "A Visual Guide to Using BERT for the First Time" - Jay Alammar
  • "The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)" - Jay Alammar
  • "The Illustrated Transformer" - Jay Alammar
  • "Explaining BERT Simply Using Sketches" - Rahul Agarwal
  • "Attention Is All You Need" - Ashish Vaswani et al.

Shobhit Chugh

I help Ambitious Product Managers transform into Highly Respected Product Leaders making 50%+ Income in a Dream Role | Free Lesson -> ipmworkshop.com

2y

I prefer ERNIE. I am also clearly not in NLP


Short but sweet. Thank you for the input, Yogesh Kulkarni

Nishikant Gurav

Python | Data Science | Machine Learning | Artificial Intelligence | Generative AI | NLP | Let's connect

2y

Yogesh Kulkarni awesome one I came across in recent days... Short and easy to understand.

Sachin Gupta

Doctoral Generative AI Researcher | AI Leader | Hands-on Guy | AI Architect | AI Product Manager | Generative AI Expert | AI Product Strategy Maker | M.Tech in Data Science from BITS

2y

Yogesh Kulkarni love your hand-drawn sketchnote way of explaining #1pagertech

sainath pawar

Sr. Data Scientist at Globant | IIM Kozhikode |Generative AI | Youtuber

2y

Short and sweet. Do-BERT


More articles by Yogesh Haribhau Kulkarni

  • Intro to Neo4j

    Graphs are inherently present in many domains such as logistics, social networks, etc. Graphs are nothing but nodes and…

    7 comments
  • ????? ???? ???? ???? (Might is Right)

    Imagine a group of friends is deciding about which movie to watch, a horror or a comedy!!. Usual way is by majority…

    1 comment
  • I believe I can fly

    Humans always wanted to fly. Like birds.

    7 comments
  • Cred-ibility

    You would expect a successful Indian entrepreneur to be from IITs, IIMs or if not that at least from STEM (Science…

    4 comments
  • Mathematics Can Be Fun

    My childhood days, apart from playing cricket and doing paintings were also filled with reading wonderful books from a…

    3 comments
  • AI, generally speaking ...

    Artificial Intelligence (AI) covers wide range of technologies, right from ruled based expert systems to latest…

    5 comments
  • I'm Feeling Lucky

    You don't have to be lucky anymore if you wish to have a precise search result. We all are familiar with keyword based…

    3 comments
  • Transformation by Hugging Face

    Are you lost in the storm of these BERTs ie ALBERT, DistilBERT, RoBERTa etc? And these GPTs (1-2-3)? Don't understand…

    3 comments
  • Reversing the Interview

    One of the popular topics in the interviews for Software Engineering roles, is 'Data Structures and Algorithms' (DSA)…

    17 comments
  • We are the world, we are the sensors.

    Typical Artificial Intelligence (AI) approaches model the data they are fed with. Data is what we can measure and store.
