Math-free, Parameter-free Gradient Descent in Python
Contour lines, orthogonal trajectories, and convergence paths to the optimum, starting from the yellow point


I discuss techniques related to the gradient descent method in two dimensions. The goal is to find the minima of a target function, called the cost function. The values of the function are computed at evenly spaced locations on a grid and stored in memory. Because of this, the approach is not directly based on derivatives, and no calculus is involved: it implicitly uses discrete derivatives, but above all it is a simple geometric algorithm. The learning parameter typically attached to gradient descent is fixed here: it is equal to the granularity of the mesh and does not need fine-tuning. In addition to gradient descent and ascent, I also show how to build contour lines and orthogonal trajectories with the exact same algorithm.
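Here is a minimal sketch of the grid-based idea, not the exact code from the PDF below: the cost function, grid resolution, and starting cell are illustrative choices of my own. Each step simply moves to the lowest-valued neighboring grid point, so the step size is the mesh granularity and there is no learning rate to tune.

```python
import numpy as np

def make_grid(cost, xmin, xmax, ymin, ymax, n=101):
    # Evaluate the cost function once on an evenly spaced n x n grid.
    xs = np.linspace(xmin, xmax, n)
    ys = np.linspace(ymin, ymax, n)
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    return xs, ys, cost(X, Y)

def descend(Z, i, j, max_steps=10_000):
    # Walk downhill on the precomputed grid Z, starting from cell (i, j).
    # Each step moves to the lowest of the 8 neighboring cells, so the
    # step length is always one mesh unit: no learning rate to tune.
    path = [(i, j)]
    n, m = Z.shape
    for _ in range(max_steps):
        best = (i, j)
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                a, b = i + di, j + dj
                if 0 <= a < n and 0 <= b < m and Z[a, b] < Z[best]:
                    best = (a, b)
        if best == (i, j):   # no lower neighbor: local minimum reached
            break
        i, j = best
        path.append(best)
    return path

# Illustrative example: f(x, y) = x^2 + 2*y^2, starting near a corner of the grid.
xs, ys, Z = make_grid(lambda x, y: x**2 + 2 * y**2, -2, 2, -2, 2)
path = descend(Z, 5, 5)
print("reached cell", path[-1], "with value", Z[path[-1]])
```

Gradient ascent is the same walk with the comparison reversed; the PDF shows how contour lines and orthogonal trajectories are built with the same algorithm.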

Convergence paths to optimum, starting from 100 random locations

To learn more and download the free 14-page PDF document with Python code (with links to the GitHub source and cool videos), follow this link

Mykel G. Larson

I create. I build.

2y

What, no derivatives? Haha. Well, I suppose I'm curious whether computationally it ends up being equivalent in terms of "hairy operations." Sometimes you can save yourself some grief doing a lot of small calcs versus trying to shove everything through one master number cruncher. Is that what's going on here?
