That's one small step for man, one giant leap for machine learning
"Parodia da obra renascentista "A cria??o de Ad?o" por Michelangelo (1511)"

Artificial neural networks were developed to simulate the functioning of a network of biological neurons. The artificial neuron, or node, is a logical unit fed by inputs from which it computes an output. Building on the artificial neuron model proposed by McCulloch and Pitts in 1943, Frank Rosenblatt developed a new mathematical model in 1957, known as the perceptron (see Fig 1). In 1969, computer scientists Marvin Lee Minsky and Seymour Aubrey Papert published an influential analysis of the perceptron's capabilities and limitations.

The perceptron receives the input signals x and returns an output y = σ(w · x + b). The weight vector w and the bias b are responsible for storing the neural network's information. Finally, the activation function σ determines the behavior of the perceptron.


Fig 1. Perceptron model
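
To make the computation concrete, here is a minimal Python sketch of the forward pass y = σ(w · x + b). The Heaviside step activation and the hand-picked weights for a logical AND gate are illustrative assumptions, not part of the original figure.

import numpy as np

def perceptron_output(x, w, b):
    # Weighted sum of the inputs plus the bias
    z = np.dot(w, x) + b
    # Heaviside step activation: outputs 1 when z is non-negative
    return 1 if z >= 0 else 0

# Illustrative example: weights chosen by hand so the unit computes logical AND
w = np.array([1.0, 1.0])
b = -1.5
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, '->', perceptron_output(np.array(x), w, b))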

So how is the perceptron trained? The perceptron training algorithm proposed by Rosenblatt* was largely inspired by Hebb's rule. The perceptron learning rule strongly resembles Stochastic Gradient Descent, with the weights updated as follows:

w_{i,j} ← w_{i,j} + η (y_j − ŷ_j) x_i

where η is the learning rate, y_j the target output, ŷ_j the perceptron's actual output, and x_i the i-th input value.
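
As a rough Python sketch of this update rule (the step activation, the fixed learning rate, and the logical OR task are assumptions for illustration, not part of the original article):

import numpy as np

def train_perceptron(X, y, eta=0.1, epochs=10):
    # One weight per input; the bias is updated separately
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            # Current prediction with a step activation
            y_hat = 1 if np.dot(w, x_i) + b >= 0 else 0
            # Perceptron learning rule: w <- w + eta * (y - y_hat) * x
            error = y_i - y_hat
            w += eta * error * x_i
            b += eta * error
    return w, b

# Illustrative example: learn the (linearly separable) logical OR function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])
w, b = train_perceptron(X, y)
print('weights:', w, 'bias:', b)

Note that the update only fires when the prediction is wrong (y − ŷ ≠ 0), which is why the rule converges on linearly separable problems such as OR.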

*For more information, see https://penta.ufrgs.br/edu/telelab/3/hebbian_.htm.


REFERENCES

[1] Géron, Aurélien. Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow: Concepts, tools, and techniques to build intelligent systems. O'Reilly Media, 2019.

[2] McCulloch, Warren S., and Walter Pitts. "A logical calculus of the ideas immanent in nervous activity." The bulletin of mathematical biophysics 5.4 (1943): 115-133.

[3] Minsky, Marvin, and Seymour Papert. Perceptrons: An Introduction to Computational Geometry. MIT Press, 1969.
