
How I started with Deep Learning

Note: In this post, I talk about how I learned deep learning, the courses I took to understand it, and the widely used Python modules for coding it. This is also published on my blog.

Deep learning has been a hot topic in the data science community since 2012, and interest keeps growing day by day. A quick search on Google Trends confirms this.

In the chart above, searches for deep learning have been trending upwards since 2012 (machine learning is still much more common). One of my resolutions this year was to familiarize myself with neural networks and deep neural networks (deep learning) and the recent advances in the field. I had tried to do the same last year as well, but the lack of good tutorials/lectures/MOOCs slowed my progress. I specifically focused on learning the following:

Fully Connected Deep Neural Network: This can be applied to problems where the size of the input matrix is fixed. Although really popular, one of the limitations of this network is that it has too many parameters to optimize, which requires heavy computational power. One advantage is that there are many out-of-the-box modules, such as TensorFlow and Keras, that implement it, and it can be applied to a wide range of problems such as image classification, sentiment prediction, and translation, with varying accuracy.
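
To make this concrete, here is a minimal sketch of a fully connected network in Keras; the layer sizes, the random dummy data, and the binary-classification setup are illustrative assumptions, not taken from any specific problem.

```python
# A minimal fully connected network in Keras (illustrative sketch).
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Dummy data: 1000 samples with a fixed input size of 20 features, binary labels.
X = np.random.rand(1000, 20)
y = np.random.randint(2, size=(1000, 1))

model = Sequential([
    Dense(64, activation='relu', input_shape=(20,)),  # first hidden layer
    Dense(64, activation='relu'),                     # second hidden layer
    Dense(1, activation='sigmoid'),                   # binary output
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=5, batch_size=32)
```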

Convolutional Neural Networks (CNNs): These neural networks have a specific application: image classification. In many ways, a CNN is very similar to the network above; however, it is a type of feed-forward artificial neural network in which the connectivity pattern between its neurons is inspired by the organization of the animal visual cortex.
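
As a rough illustration, here is a minimal CNN in Keras; the 28x28 grayscale input shape and the 10-class output are assumptions in the style of MNIST, not part of the original discussion.

```python
# A minimal convolutional network in Keras (illustrative sketch).
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),  # learn local image filters
    MaxPooling2D((2, 2)),                                            # downsample feature maps
    Conv2D(64, (3, 3), activation='relu'),
    MaxPooling2D((2, 2)),
    Flatten(),                                                       # feed into a dense classifier
    Dense(10, activation='softmax'),                                 # 10 output classes
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()
```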

Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRUs): In all the neural networks above, data flows in a forward direction; however, that is not how the brain (on which neural networks are loosely based) works. A recurrent neural network (RNN) is a class of artificial neural network where connections between units form a directed cycle. This creates an internal state in the network, which allows it to exhibit dynamic temporal behavior. This way, information can be forgotten, kept, or passed on to the next step of the network. This type of network is mostly used in natural language or speech tasks.
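
For a sense of how this looks in code, here is a minimal LSTM sketch in Keras for a text classification task; the vocabulary size, sequence length, and sentiment-style binary output are illustrative assumptions.

```python
# A minimal LSTM network in Keras for sequence (text) input (illustrative sketch).
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

vocab_size = 10000   # assumed vocabulary size
max_len = 100        # assumed (padded) sequence length

model = Sequential([
    Embedding(vocab_size, 128, input_length=max_len),  # word indices -> dense vectors
    LSTM(64),                                          # recurrent layer carries state across time steps
    Dense(1, activation='sigmoid'),                    # e.g. positive/negative sentiment
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.summary()
```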

The purpose of this post is to give the list of resources I used to learn.

Here are the resources to understand the concepts:

Geoffrey Hinton's Neural Networks for Machine Learning on Coursera: This was one of the first good, freely available courses on the Internet that I laid my hands on. It's a little long, but really helpful in understanding the concepts of neural networks such as layers, backpropagation, gradient calculation, the vanishing gradient problem, etc.

Stanford's CS231n - Convolutional Neural Networks for Visual Recognition: The instructors are Andrej Karpathy and Justin Johnson (I believe they are doing/already did their PhDs under Professor Fei-Fei Li). This playlist of 15 videos is simply awesome; you'll like all of it, and it ends with Jeff Dean (of Google) giving the last lecture. The university, the professor, and the instructors are kind enough to share the lectures with the outside world, and they helped me understand the basics of CNNs, RNNs, and LSTMs. The lectures are each about an hour long, and you can work through the accompanying problem statements.

Nando de Freitas's Deep Learning lectures at Oxford University: Professor de Freitas's lectures have also been really helpful to me in understanding deep learning concepts.

Adit Deshpande's blog: Adit is a CS undergrad at UCLA and I found his post on CNNs really useful and detailed. Here is the link.

Apart from these series of lectures, one can quickly understand the basics of CNNs and RNNs (provided the basics of neural networks are clear) by watching these videos.

Deep Learning Modules in Python

I mostly use Keras and TensorFlow for all my work related to deep learning.

TensorFlow: It is an open-source library by Google, mainly for machine/deep learning applications. It is the most widely used library among the available options in Python and has really good documentation.
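
As a tiny example of what TensorFlow handles under the hood, the sketch below computes a gradient with automatic differentiation; it assumes TensorFlow 2's eager execution (older 1.x code would build a graph and run it inside a Session instead).

```python
# A tiny TensorFlow sketch: automatic differentiation (assumes TensorFlow 2).
import tensorflow as tf

w = tf.Variable([[1.0, 2.0], [3.0, 4.0]])
x = tf.constant([[1.0], [1.0]])

with tf.GradientTape() as tape:
    y = tf.matmul(w, x)           # a simple linear operation
    loss = tf.reduce_sum(y ** 2)  # a toy scalar loss

grad = tape.gradient(loss, w)     # d(loss)/d(w), computed automatically
print(grad.numpy())
```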

Keras: It is another widely used library: a high-level neural networks API, written in Python and capable of running on top of either TensorFlow or Theano. Resources for learning Keras can be found here.
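
As a quick check, the snippet below (assuming the standalone keras package) prints which backend Keras is currently configured to use; the backend can be switched in the ~/.keras/keras.json config file.

```python
# Check which backend Keras is running on (standalone keras package).
from keras import backend as K

print(K.backend())  # e.g. 'tensorflow' or 'theano'
```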

If there are other good resources, please feel free to share them in the comments.

