Deep Learning - Long Short Term Memory (LSTM)
INTRODUCTION
We know that traditional RNNs are not good at capturing long-range dependencies. When we train multiple RNN layers on a huge dataset, we run into the vanishing gradient problem. Long Short Term Memory (LSTM) was introduced to overcome these issues. An LSTM is an artificial recurrent neural network used in the field of deep learning. Unlike standard feedforward neural networks, an LSTM has feedback connections, so it can process not only single data points but also entire sequences of data.
LSTM
LSTM is capable of capturing long-range dependencies and of retaining information about its inputs over a very long period of time. It can store previous inputs for a very extended duration. LSTM does this by using three gates: the forget gate, which decides what to discard from the cell state; the input gate, which decides what new information to write into it; and the output gate, which decides what part of the cell state to expose as the hidden state.
This gating mechanism allows the network to learn when to remove, ignore, or keep information in the memory cell.
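To make the gating mechanism concrete, here is a minimal sketch of a single LSTM time step in NumPy. The function name `lstm_step`, the weight layout (one stacked matrix `W` for all four gates), and the toy sizes are illustrative choices, not part of the article:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step; W maps [h_prev; x] to the four gate pre-activations."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([h_prev, x]) + b
    f = sigmoid(z[0:H])          # forget gate: what to erase from the cell state
    i = sigmoid(z[H:2*H])        # input gate: what new information to write
    o = sigmoid(z[2*H:3*H])      # output gate: what to expose as the hidden state
    g = np.tanh(z[3*H:4*H])      # candidate cell values
    c = f * c_prev + i * g       # new cell state: keep some old, add some new
    h = o * np.tanh(c)           # new hidden state
    return h, c

# Toy run: hidden size 4, input size 3, small random weights
rng = np.random.default_rng(0)
H, X = 4, 3
W = rng.standard_normal((4 * H, H + X)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(X), h, c, W, b)
```

Because the forget gate multiplies the previous cell state by a value near 1 when the network wants to remember, gradients can flow through `c` across many time steps without vanishing.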
LSTM Use Cases
Apple was the first major tech company to integrate a smart assistant into its operating system. Siri was actually a by-product of another company: it was Apple's adaptation of a standalone app it had purchased along with its creators.
Google implemented LSTM in Google Voice Search. Compared to plain deep neural networks, LSTM RNNs have additional recurrent connections and a memory cell that allow them to remember previous data.
Real-Time applications of LSTM
Named Entity Recognition: an NLP task that seeks to locate and classify named entities mentioned in unstructured text into predefined categories.
Sentiment Analysis: the contextual mining of text to identify and extract subjective information from the source material.
Machine Translation: the task of automatically converting text from one natural language to another while preserving its meaning.