NLP Topic: Language Modeling
A language model predicts the probability of the next word in a sequence, based on the words already observed.
Neural network models are a preferred method for building statistical language models for two reasons: they use a distributed representation, in which words with similar meanings have similar representations, and they can use a large context of recently observed words when making predictions.
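Before turning to neural models, the idea of "predicting the next word from observed words" can be sketched with a simple count-based bigram model. This is a toy illustration over a made-up corpus, not code from the slides referenced below:

```python
from collections import Counter, defaultdict

# Toy corpus (made up for illustration); a real language model
# is estimated from far more text.
corpus = "the cat sat on the mat . the cat ate the fish .".split()

# Count bigrams: how often word w2 follows word w1.
bigram_counts = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigram_counts[w1][w2] += 1

def next_word_probs(word):
    """Estimate P(next word | word) from bigram counts."""
    counts = bigram_counts[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # "cat" is the most likely continuation
```

Here "cat" follows "the" twice in the toy corpus, so the model assigns it probability 0.5; this count-and-normalize idea is exactly the n-gram approach covered in the outline below.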
Language modeling is applied in the following areas:
- Predictive typing
- Speech recognition
- Handwriting recognition
- Spelling/grammar correction
- Authorship identification
- Machine translation
- Summarization
- Dialogue
In this article I explain language modeling, covering the following points:
- Introduction to Language Modeling
- n-gram Language Models
- Sparsity Problems with n-gram Language Models
- Generating text with an n-gram Language Model
- How to build a neural Language Model?
- A fixed-window neural Language Model
- Recurrent Neural Networks (RNN)
- Training an RNN Language Model
- Backpropagation for RNNs
- Generating text with an RNN Language Model
- Evaluating Language Models
- How RNNs improved perplexity, and why we should care about Language Modeling
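Since the outline ends with evaluation, here is a minimal sketch of how perplexity, the standard evaluation metric for language models, is computed. The per-word probabilities below are made-up values for illustration:

```python
import math

# Hypothetical probabilities a language model assigns to each word
# of a held-out sentence (made-up values for illustration).
word_probs = [0.2, 0.1, 0.25, 0.05]

# Perplexity is the inverse probability of the text, normalized
# by the number of words: PP = exp(-(1/N) * sum(log p_i)).
n = len(word_probs)
perplexity = math.exp(-sum(math.log(p) for p in word_probs) / n)
print(perplexity)  # lower perplexity = better model
```

A model that assigned probability 1 to every word would reach the minimum perplexity of 1; better language models, including the RNNs discussed above, lower perplexity by assigning higher probability to the words that actually occur.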
Please follow the slides below for a more detailed description of language modeling. I have taken the material from the Stanford Engineering lecture notes on NLP and Deep Learning. I hope it helps students learn.
In the next article I will demonstrate the same ideas with Python code, so please stay tuned.
Happy learning! Please like the article if you find it useful.