NLP Topic - Language Modeling

A language model predicts the probability of the next word in a sequence, given the words already observed in that sequence.
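
To make this concrete, here is a minimal count-based sketch of next-word probability in Python (the toy corpus and function name are invented purely for illustration; the n-gram models covered later in this article formalize the same idea):

```python
from collections import Counter, defaultdict

# Toy corpus, invented for illustration only.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigrams to estimate P(next word | previous word).
bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def next_word_probability(prev, nxt):
    """Maximum-likelihood estimate of P(nxt | prev) from bigram counts."""
    counts = bigram_counts[prev]
    total = sum(counts.values())
    return counts[nxt] / total if total else 0.0

print(next_word_probability("the", "cat"))  # 0.5: "the" is followed by "cat" in 2 of its 4 occurrences
```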

Neural network models are a preferred method for developing statistical language models because they use a distributed representation, in which different words with similar meanings have similar representations, and because they can use a large context of recently observed words when making predictions.
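
As a rough sketch of what such a neural model can look like, here is a minimal fixed-window language model in PyTorch (one of the topics in the outline below); the vocabulary size, embedding size, window size and class name are all toy assumptions for illustration, not the exact architecture from the slides:

```python
import torch
import torch.nn as nn

# Toy sizes, chosen only for illustration.
vocab_size, embed_dim, window, hidden_dim = 10000, 64, 3, 128

class FixedWindowLM(nn.Module):
    def __init__(self):
        super().__init__()
        # Distributed representation: each word id maps to a dense vector,
        # so words used in similar contexts can end up with similar vectors.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.hidden = nn.Linear(window * embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context_ids):             # context_ids: (batch, window)
        e = self.embed(context_ids)              # (batch, window, embed_dim)
        h = torch.tanh(self.hidden(e.flatten(1)))
        return self.out(h)                       # logits over the vocabulary

model = FixedWindowLM()
context = torch.randint(0, vocab_size, (2, window))       # two dummy contexts
next_word_probs = torch.softmax(model(context), dim=-1)   # P(next word | window)
```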

Language modeling is applied in the following areas:

- Predictive typing
- Speech recognition
- Handwriting recognition
- Spelling/grammar correction
- Authorship identification
- Machine translation
- Summarization
- Dialogue

In this article I try to explain the language modelling technique, covering the following points:

  1. Introduction to Language Modeling
  2. n-gram Language Models
  3. Sparsity Problems with n-gram Language Models
  4. Generating text with an n-gram Language Model
  5. How to build a neural Language Model?
  6. A fixed-window neural Language Model
  7. Recurrent Neural Networks (RNN)
  8. Training an RNN Language Model
  9. Backpropagation for RNNs
  10. Generating text with an RNN Language Model
  11. Evaluating Language Models (see the perplexity sketch after this list)
  12. How RNNs improved perplexity and Why should we care about Language Modeling?
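
Since perplexity comes up in points 11 and 12, here is a minimal sketch of how it is computed: perplexity is the exponential of the average negative log-probability the model assigns to each observed word, so lower is better (the per-word probabilities below are made up for illustration):

```python
import math

# Hypothetical probabilities a language model assigns to the words of a
# held-out sequence (values invented for illustration).
word_probs = [0.20, 0.05, 0.10, 0.30]

# Perplexity = exp(average negative log-probability); lower is better.
avg_nll = -sum(math.log(p) for p in word_probs) / len(word_probs)
perplexity = math.exp(avg_nll)
print(round(perplexity, 2))  # ~7.6 for these made-up probabilities
```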

Kindly follow the slides below for a more detailed description of language modelling. I have taken the reference from the Stanford Engineering lecture notes on NLP and Deep Learning. I hope it will help students learn.

In the next article I will try to demonstrate the same with Python code, so please stay tuned.

Happy learning! Please do like the article if you find it useful.

