The Perceptron Convergence Theorem

Day 12 of the ML: Teach by Doing Project is the final part of our three-part series on the Perceptron.

In this lecture, we learn about:

(a) Linear Separability of a dataset

(b) Margin of a point and margin of a dataset

(c) The Perceptron Convergence Theorem

How cool is it that, under certain conditions, the perceptron is guaranteed to converge?

Not only that: we can also bound in advance the maximum number of mistakes it makes before converging.
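The bound can be illustrated numerically. The sketch below (my own illustration, not code from the lecture) trains a perceptron on a tiny linearly separable dataset and compares the number of mistakes it makes with the convergence-theorem bound (R/γ)², where R is the largest point norm and γ is the dataset's margin under a known unit-norm separator w*:

```python
import numpy as np

# Two classes separated by the line x1 = x2 (labels are +1 / -1).
X = np.array([[2.0, 1.0], [3.0, 1.5], [1.0, 2.0], [1.5, 3.0]])
y = np.array([1, 1, -1, -1])

# A unit-norm separator we know works, used only to compute the margin.
w_star = np.array([1.0, -1.0]) / np.sqrt(2.0)
gamma = np.min(y * (X @ w_star))       # margin of the dataset
R = np.max(np.linalg.norm(X, axis=1))  # radius of the data

bound = (R / gamma) ** 2               # theorem: at most this many mistakes

# Standard perceptron updates: on a mistake, add y_i * x_i to w.
w = np.zeros(2)
mistakes = 0
converged = False
while not converged:
    converged = True
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:         # misclassified (or on the boundary)
            w += yi * xi
            mistakes += 1
            converged = False

print(mistakes, bound)  # mistakes never exceeds the bound
```

On this toy dataset the perceptron converges after only a couple of updates, well under the bound; the theorem guarantees this gap can never go the other way for any linearly separable data.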

I wish such awesome mathematical behaviour were displayed by modern ML algorithms.

Learn all about margins and the perceptron convergence theorem here:


My lecture notes for this lecture: Link

Stay tuned for Day 13!
