Online Learning (by Machines)
Cover image: Sachin Tendulkar's form, a moving average of batting performance

Most ML methods are applied in batch mode. For example, suppose in a simple ML model we want to learn word frequencies (perhaps for word prediction): a corpus is assembled and the frequencies are counted over it.

This is not computationally efficient, since the whole learning process has to be rerun even for incremental updates of the data. Alternatively, we can use a model that stores a count for each word and the total word count separately. When a new word comes to the machine, we simply increment that word's count and the total count; the frequency of any word is then its count divided by the total.
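As a minimal sketch (the class and variable names here are illustrative, not from the original article), the streaming word counter can look like this in Python:

```python
from collections import defaultdict

class OnlineWordCounter:
    """Learns word frequencies incrementally, one word at a time."""

    def __init__(self):
        self.counts = defaultdict(int)  # per-word counts
        self.total = 0                  # total words seen so far

    def update(self, word):
        # Online update: two increments, no pass over the whole corpus
        self.counts[word] += 1
        self.total += 1

    def frequency(self, word):
        return self.counts[word] / self.total if self.total else 0.0

counter = OnlineWordCounter()
for w in "the cat sat on the mat".split():
    counter.update(w)
print(counter.frequency("the"))  # "the" is 2 of the 6 words seen, i.e. 1/3
```

Each update is O(1), regardless of how much data has already been processed.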

This is a simplistic example, but the approach can be extended to many types of ML algorithms, including:

1. Exponentially weighted moving averages (EWMA)

2. Gradient descent

3. Deep learning, etc.
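Taking the gradient-descent case as an example: its online variant is stochastic gradient descent, which updates the parameters after each observation instead of after a full pass over the data. A minimal sketch (the learning rate, names, and toy data below are illustrative):

```python
class OnlineLinearModel:
    """Fits y ~ w*x + b one observation at a time (stochastic gradient descent)."""

    def __init__(self, lr=0.05):
        self.w, self.b, self.lr = 0.0, 0.0, lr

    def update(self, x, y):
        # Gradient of the squared error ((w*x + b) - y)**2, up to a factor of 2
        err = (self.w * x + self.b) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

    def predict(self, x):
        return self.w * x + self.b

model = OnlineLinearModel(lr=0.05)
for _ in range(500):                    # stream the observations repeatedly
    for x in [0.0, 1.0, 2.0, 3.0]:
        model.update(x, 2.0 * x + 1.0)  # observations from y = 2x + 1
```

The model never sees the dataset as a whole; it converges toward w = 2, b = 1 purely from the stream of single observations.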

The obvious advantage is computational efficiency. But beyond that, if we incorporate a memory window, for example as in an exponentially weighted moving average (EWMA), we can see much more interesting outcomes. For the word-counting case, the EWMA online update becomes:

f_t(w) = (1 − α) · f_{t−1}(w) + α · 1{new word = w}

where f_t(w) is the frequency estimate of word w after t observations, and the indicator 1{·} is 1 when the incoming word is w and 0 otherwise.

Here α is the "memory" parameter, lying between 0 and 1. A high α gives the model a very short-term memory; a low α gives it a very long-term memory. This kind of behaviour arises naturally in online learning and may not be present in regular batch-processing algorithms. An application of this idea in the deep-learning space is the famous LSTM (long short-term memory) network. Another very popular application of online learning is the Kalman filter.
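A minimal sketch of the EWMA word counter in Python (names are illustrative): each stored frequency decays by (1 − α) on every step, and the observed word gets a boost of α. Decaying every entry per update is O(vocabulary size); a production version would decay lazily using timestamps, but this keeps the idea visible.

```python
from collections import defaultdict

class EWMAWordFrequency:
    """Word frequencies with exponential forgetting (memory parameter alpha)."""

    def __init__(self, alpha=0.1):
        self.alpha = alpha              # high alpha -> short memory
        self.freq = defaultdict(float)

    def update(self, word):
        # Decay everything seen so far, then boost the incoming word:
        # f(w) <- (1 - alpha) * f(w) + alpha * [w == word]
        for w in self.freq:
            self.freq[w] *= (1.0 - self.alpha)
        self.freq[word] += self.alpha

    def frequency(self, word):
        return self.freq[word]
```

With α close to 1, the estimates track only the most recent words; with α close to 0, they approach the long-run batch frequencies.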

More articles by Gopi Krishna Suvanam