AI - The software evolution?
Ian Ferreira
x-AMD, x-Microsoft, x-CoreScientific, Dad, Perpetual Student (Hiring ML & Software Engineers, TPMs, UX)
When I was a teenager, I dreamt of working at the intersection of medicine and engineering, a field called biomedical engineering. Inspired by the “Six Million Dollar Man”, I was captivated by the potential of technology to supplement life, making us “stronger, faster and better”.
After all, who isn’t moved by images of a baby hearing her mother’s voice for the first time, or a paralyzed man who “stood tall”, both enabled with the help of technology?
That said, having worked in ad tech for almost two decades, I took a slight detour. In hindsight, however, this set in place the background needed to work with highly algorithmic, data-driven distributed systems; arguably the first major commercial application of machine learning was click and fraud prediction. Having spent a couple of months speaking to companies in the AI space, I have decided to start on the ground floor, so to speak.
Joining Core Scientific allows me to help build a “made for AI” cloud that starts from the hardware level up. In contrast to general software workloads, which have been well served by Moore's law and falling storage costs, machine learning and AI are still very much bound by compute and power consumption. To really unlock the potential of AI, you need to start from the ground up.
So back to that teenage dream. I believe that AI is in fact the very essence of using technology to make our lives better, but with the added twist that much of what constitutes AI, and specifically deep learning, has its roots traced back to life itself.
Having read Kurzweil's “How to Create a Mind”, Tegmark's “Life 3.0”, Lane’s “The Vital Question” and Siddhartha Mukherjee's “The Gene” almost in chronological order, it is remarkable how it all ties together.
Evolving through Natural Selection
“I have called this principle, by which each slight variation, if useful, is preserved, by the term of Natural Selection.” – Charles Darwin
Nick Lane’s “The Vital Question” covers the miraculous process of how life was created through billions of epochs of trial-and-error experiments with an overarching goal to sustain life.
What struck me about this book was how mechanical the processes at the cellular level are. They aren’t necessarily mysterious; they are just miniature. Take ATP synthase: this molecule looks just like the impeller on a water pump, driven by a flow of H+ protons. The mitochondria, the batteries of our cells, operate very similarly to batteries today, creating gradients that encourage electrons to move in a particular direction. Proton pumps create these gradients by pumping protons across membranes.
As Siddhartha Mukherjee so eloquently articulates in “The Gene”, the ATCG codes punched into certain parts of the DNA are nothing more than “computer code” that enzymes read and use to encode proteins, moving up and down the sequence just like a Turing machine.
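To make the analogy concrete, here is a toy sketch of my own (not anything from the book): translation is essentially a read head walking a tape of symbols, looking each three-letter codon up in a table.

# Toy sketch of the "DNA as code" analogy: a read head walks the sequence
# one codon (three letters) at a time and looks up the amino acid it encodes.
# Real translation is far more involved; the table below is a tiny subset.
CODON_TABLE = {"ATG": "Met", "TTT": "Phe", "GGC": "Gly", "TAA": "STOP"}

def translate(dna):
    protein = []
    for i in range(0, len(dna) - 2, 3):          # move the "read head" codon by codon
        amino = CODON_TABLE.get(dna[i:i+3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("ATGTTTGGCTAA"))   # ['Met', 'Phe', 'Gly']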
In “Life 3.0”, Tegmark does a great job articulating the different ways this information has been passed on through the various stages of evolution: starting with biological transference (hereditary gene encoding), followed by language and social transference (books, storytelling), through to digital transference (bits and bytes). Each stage of the knowledge-transfer process is broader and faster than the last.
For example, with biological transference, the act of transferring a bit of information from one generation to the next took years of evolution and was limited to lineage-based transfer; the information stayed within the family, so to speak. From a data-scale point of view, the amount of information encoded in DNA is limited to around six gigabits, or roughly 1 GB.
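The back-of-the-envelope arithmetic behind that figure, assuming roughly three billion base pairs and two bits per A/T/C/G base:

# Back-of-the-envelope estimate of the information content of the genome
# (assumed figures: ~3 billion base pairs, 2 bits per base).
base_pairs = 3e9
bits = base_pairs * 2
print(bits / 1e9)        # ~6 gigabits
print(bits / 8 / 1e9)    # ~0.75 gigabytes, i.e. on the order of 1 GB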
With the emergence of language, information sharing accelerated: we became able to capture and share information across lineages and at a much faster rate. One account tallied the number of published books at 130 million.
Once we entered the digital age, this knowledge sharing really exploded. Using the size of a search engine index as a proxy for the amount of digitized information, Eric Schmidt in 2004 estimated the corpus at five billion gigabytes. That is five billion times more than DNA.
Today, not only is information being shared at record speed, researchers and data scientists also share pre-trained models. In a process called transfer learning, one can take a trained model and reuse it as part of a new network architecture; the biological equivalent would be transferring part of one’s brain. The ONNX standard is an attempt to facilitate this cross-pollination and composition of models.
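As a rough illustration of what that looks like in practice, here is a minimal transfer-learning sketch using PyTorch and torchvision (my choice of tools for the example, not something prescribed by ONNX): reuse a network pre-trained on ImageNet, swap its final layer for a new task, and export the result to ONNX so others can build on it.

# Minimal transfer-learning sketch (assumes torch and torchvision are installed).
import torch
import torchvision.models as models

model = models.resnet18(weights="DEFAULT")             # reuse pre-trained weights
for p in model.parameters():
    p.requires_grad = False                            # freeze the "borrowed brain"
model.fc = torch.nn.Linear(model.fc.in_features, 10)   # new head for a 10-class task

model.eval()
dummy = torch.randn(1, 3, 224, 224)                    # example input for tracing
torch.onnx.export(model, dummy, "resnet18_transfer.onnx")

The exported file can then be loaded by any ONNX-compatible runtime and composed into a larger system.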
Creating Artificial Intelligence
“Biology is a software process. Our bodies are made up of trillions of cells, each governed by this process. You and I are walking around with outdated software running in our bodies, which evolved in a very different era.” – Ray Kurzweil
In his book “How to Create a Mind”, Kurzweil documents the progress engineers have made in replicating complex, albeit very narrow, tasks by emulating the structure of neurons in the brain, with dendrites, axons and activation functions.
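Stripped to its essentials, that artificial neuron is remarkably small. A sketch in plain Python/NumPy, purely illustrative:

# A single artificial neuron: weighted inputs ("dendrites"), a bias,
# and a non-linear activation deciding how strongly it "fires" ("axon").
import numpy as np

def neuron(inputs, weights, bias):
    z = np.dot(inputs, weights) + bias   # weighted sum of incoming signals
    return 1.0 / (1.0 + np.exp(-z))      # sigmoid activation: output in (0, 1)

print(neuron(np.array([0.5, 0.8]), np.array([0.9, -0.3]), 0.1))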
In 2015, DeepMind's AlphaGo surprised the world when it exceeded human capabilities at the game of Go, a success that has since been reproduced in other, similar gaming scenarios. Google's DeepMind had figured out how to codify game playing using a branch of AI called reinforcement learning, which is not so different from being parented or coached in a community: you get rewarded for "good things" and "punished" for bad things. Over time you learn what to do and what not to do. You also learn cause and effect and the concept of deferred rewards.
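A minimal sketch of that reward-and-punishment loop, using tabular Q-learning as a stand-in (AlphaGo's actual training pipeline is far more elaborate): the immediate reward is the praise or punishment, and the discounted value of the next state captures the idea of deferred rewards.

# Tabular Q-learning update, purely illustrative.
import numpy as np

n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma = 0.1, 0.9      # learning rate and discount on future reward

def update(state, action, reward, next_state):
    target = reward + gamma * np.max(Q[next_state])     # value of what comes later
    Q[state, action] += alpha * (target - Q[state, action])

update(state=0, action=1, reward=1.0, next_state=2)     # "good thing": reinforce
update(state=0, action=0, reward=-1.0, next_state=3)    # "bad thing": discourage
print(Q[0])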
The adjacent machine learning field referred to as "deep learning" (deep neural networks, or DNNs) involves bombarding these neural networks with data, measuring the outcomes and adapting the network to bring the outcomes closer to what we want. It can be thought of as fast-forwarding a video tape of history and learning from what you observed.
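That predict-measure-adapt cycle is essentially what the training code does. A minimal PyTorch sketch with made-up data and an arbitrary little network, just to show the shape of the loop:

# The essence of deep learning in one loop: push data through the network,
# measure how wrong the output is, and nudge every weight to be less wrong.
import torch

x = torch.randn(64, 10)               # hypothetical inputs
y = torch.randn(64, 1)                # hypothetical targets
net = torch.nn.Sequential(torch.nn.Linear(10, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1))
opt = torch.optim.SGD(net.parameters(), lr=0.01)

for epoch in range(100):              # each pass over the data is one "epoch"
    pred = net(x)                     # bombard the network with data
    loss = torch.nn.functional.mse_loss(pred, y)   # measure the outcome
    opt.zero_grad()
    loss.backward()                   # work out how each weight contributed to the error
    opt.step()                        # adapt the weights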
The borrowing from mother nature doesn’t end with basic neurons. Modeled after neuroscience research on how the biological brains of cats and monkeys segment visual understanding, a breakthrough in computer vision came from decomposing the problem into detecting lower-level features first, then building up to more complex objects. The convolutional neural network (CNN) architecture was born.
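A toy example of that layered decomposition, with layer sizes chosen arbitrarily and only for illustration:

# Early convolutional layers pick out low-level features (edges, textures);
# later layers compose them into higher-level objects.
import torch

cnn = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, kernel_size=3, padding=1),    # edges and simple textures
    torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Conv2d(16, 32, kernel_size=3, padding=1),   # parts built from those edges
    torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Flatten(),
    torch.nn.Linear(32 * 56 * 56, 10),                   # whole-object classification
)
print(cnn(torch.randn(1, 3, 224, 224)).shape)            # torch.Size([1, 10])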
Another example exists in natural language understanding. To augment a network's ability to remember, like when you are reading this sentence, configurations such as recurrent neural networks (RNNs) have been designed.
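A minimal sketch of that idea: a recurrent layer carries a hidden state forward across the sequence, which acts as its working memory (dimensions here are arbitrary, purely illustrative).

# The hidden state is updated token by token, the network's stand-in for
# "remembering the start of the sentence".
import torch

rnn = torch.nn.RNN(input_size=8, hidden_size=16, batch_first=True)
sentence = torch.randn(1, 5, 8)          # a sequence of 5 tokens, each an 8-dim vector
outputs, hidden = rnn(sentence)
print(outputs.shape, hidden.shape)       # torch.Size([1, 5, 16]) torch.Size([1, 1, 16])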
While we have made significant progress on point solutions, and while new architectures and models are continuously being proposed and tested (much like the variety of neurons in the brain), we still have a ways to go to compete with the brain's diversity and ability to generalize.
From a raw numbers point of view, we have roughly 100 billion neurons, each connected to thousands of other neurons. That is on the order of 100 trillion connections that all must be trained. Contrast that with the largest DNN trained to date, which has reached the one-billion mark for parameters.
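The rough arithmetic behind that comparison:

# Order-of-magnitude comparison only, using the figures above.
neurons = 100e9                    # ~100 billion neurons
synapses = neurons * 1000          # thousands of connections each
print(synapses)                    # 1e14, i.e. roughly 100 trillion
print(synapses / 1e9)              # ~100,000x a billion-parameter DNN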
Once we reach the trillion-parameter mark, it is foreseeable that we can get close to approximating much of what the brain does... at least academically. I say academically because recent research has demonstrated that the communication between neurons extends beyond dendrites and axons; they seem to also have an electromagnetic wave component, something that remains to be modeled.
The AI Opportunity
“Artificial intelligence is the new electricity” – Andrew Ng
I tend to agree with Prof. Ng. Just as electricity is ubiquitous and at the heart of innovation, including AI itself, I can see a world where AI is woven into all products and services, making everything just that much smarter and more useful.
However, if we were to use Geoffrey Moore’s technology adoption life cycle, I think it would be safe to say we are in the early adopters phase.
Outside of the work done in a select few areas, the real-world application of AI remains tenuous and far from democratized. The most ubiquitous AI feature in use today is probably the one in cameras and phones that draws green bounding boxes around faces, followed presumably by the natural language processing used in digital assistants like Google Home, Alexa and Siri.
However, I do believe that as AI capabilities become more established, applications based on those capabilities will start to emerge at scale and begin to deliver on the promise of AI. Industries like automotive, healthcare, finance, public safety and telecoms are all great examples of this acceleration. We are already seeing it come to fruition in self-driving cars as more and more companies introduce some form of autonomous driving. In healthcare, Google‘s AI is showing amazing results in diagnosing health conditions. Skype’s ability to do speech-to-text and real-time translation is a great example of transcending language barriers in telecom.
But this is still just the beginning of a world where almost every facet of what we do is enhanced with AI.
Conclusion
“Education is the most powerful weapon which you can use to change the world” – Nelson Mandela
AI is both ethereal and strangely familiar. On the one hand, AI covers a large scope of highly scientific algorithms, math and a touch of art. Yet on the other hand, the process is largely no different from the process that yielded life as we know it today.
As I watch a deep learning algorithm crunch away epoch by epoch, I can’t help being struck by the similarity: billions of years of natural selection replayed in minutes and hours; a very simple process of trial, error and adaptation fast-forwarded through time.
Regardless of where you stand on the philosophical debate about whether superintelligence will or can be achieved, or whether the singularity will occur, one thing is certain: the ability to “learn”, and therefore become more educated, is no longer constrained by biological boundaries.
If you believe in the virtues of education then you must believe in the potential of AI.