No use denying it: artificial intelligence and machine learning are one and the same thing
Reinaldo Lepsch Neto
Graduate Student and MSc candidate at PPGA-FEA-USP | Data & Analytics & Artificial Intelligence & Language Models | Proud father and caregiving husband | 50+
Just admit it. You have already tried to understand the machine learning hype as a door to the wonderful world of Artificial Intelligence. You thought about robots, terminators, autonomous killer drones, driverless cars, highly skilled game players. And then you had the nightmare of your life in ML 101, with an endless sequence of statistics lessons, equations and more equations. Seemingly every letter of the Greek alphabet passed before your eyes. You even noticed that your distant Calculus and Linear Algebra lessons had a purpose: helping you here. Remember learning about eigenvalues and eigenvectors in the very last LinAlg class, the one after the final exam (the topic had to be taught, the teacher said)? Here and now, you have met them again. You use matrices which are not just m x n, but m1 x m2 x m3 x … x mn. You learn they are now called tensors, and wonder where they have been all those years, since not even in LinAlg 999 had anyone told you about their existence.
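If you want to see those eigenvalues, eigenvectors and tensors outside a slide, here is a minimal NumPy sketch (a toy example of my own, not from any particular course):

```python
import numpy as np

# Eigen-decomposition: A @ v = lambda * v for each eigenpair
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # the lambdas of A

# Check the defining property for the first eigenpair
v = eigenvectors[:, 0]
assert np.allclose(A @ v, eigenvalues[0] * v)

# A "tensor" in the ML sense is simply an n-dimensional array,
# here with shape m1 x m2 x m3 = 4 x 3 x 2
T = np.zeros((4, 3, 2))
print(T.ndim, T.shape)
```

Nothing mystical: the tensor of deep learning frameworks is the same multi-index object your LinAlg class never got around to, stored as an n-dimensional array.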
Well, first book, first MOOC, first class done with, and you say you have had enough. You didn’t want an MSc or PhD in math or statistics anyway. There’s a long way to go until the Jetsons-like future, and you don’t wanna get a math indigestion. Your kids will learn it at college, maybe. Done for a while. Back to your bizarre conversations with your Siri or Google assistant. Never mind that the math inside your mobile’s circuitry is as crazy as their rambling answers when you ask simple questions. You don’t wanna know about it.
But, wait a minute.
The whole mathematical schooling you got at college brought you right here. Don’t waste it. It will be worth going through what looks like a nightmare.
The human brain is a mathematical machine, whether you like it or not. Okay, a truly perfect simulation of it is far away – or not that far, as some might say. But the way there is paved with lots of numbers, equations, Greek letters, indexes, subscripts, superscripts, graphs, theorems, proofs… Your brain, yes, the one you have inside there, works with all of that and can be explained this way. Maybe some extra math will have to be developed (what? Is math developed or discovered?). But the whole building which is machine learning, and its more AI-focused department, deep learning, stands on ground already covered. When you look at a PowerPoint slide completely filled with mind-bending equations describing an LSTM, or at the mathematical meaning of the convolutions which power computer vision, you see beauty, you see poetry. Complex structures require new models, and new models require new mathematical and computing tools. The whole of science as we know it will benefit from it all.
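Those convolutions that power computer vision are, underneath, a very plain sum. As a hedged one-dimensional sketch (my own toy signal and kernel, not any network’s actual weights):

```python
import numpy as np

# Discrete convolution: (x * k)[n] = sum over m of x[m] * k[n - m].
# The 2-D version of this same operation is what "convolutional"
# layers slide across an image in computer-vision networks.
x = np.array([1.0, 2.0, 3.0, 4.0])   # a toy signal
k = np.array([1.0, 0.0, -1.0])       # a tiny edge-detecting kernel
y = np.convolve(x, k, mode="valid")  # only positions of full overlap
print(y)  # [2. 2.]
```

A learned convolutional layer does exactly this, except the kernel values are parameters adjusted during training rather than chosen by hand.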
So, don’t give up. Artificial Intelligence is sexy because it tells us about machines which (not yet who) seem to think like us. And Machine Learning is sexy too, because it tells us how that happens, and how it can all be. We all know there is a long way ahead. Models, models, we need more of them. Surely lots of brilliant minds are working on lots of things that still look far too weird. Meta-languages, formal systems. The unfolding of GANs. New discoveries (or inventions?) in LinAlg and Calculus – ones that will be in basic textbooks in a couple of years. Brand-new programming languages. New packages for the hype languages like Python, R, Java, C++ and others. While you read this, researchers, self-taught people, graduate and undergraduate students are almost burning their neurons to understand how neurons themselves work and how their operation can be simulated. And how this can be done in order to reach self-awareness, or what is already known as AGI – Artificial General Intelligence. But that is indeed a subject for another day.