Why Deep Learning Excites Me
Michael Spencer
A.I. Writer, researcher and curator - full-time Newsletter publication manager.
I'm not a Big Data expert, but I sense "deep learning" is part of a critical advancement in the realization of artificial intelligence (AI). This trend in machine learning has been a Big Data keyword for quite some time, and its promise of more powerful, faster and more intuitive learning draws on several threads:
- Theoretical arguments from circuit theory
- Reverse engineering of neuroscience
- Learning motivated by intuitive inference
- Multi-layered neural networks
- Favourable results in ML applications
- Effective use of knowledge extracted from unlabeled data
- Improved extraction of significant structure from a data set
Think of DL as the "instincts" and foundational logic of systems thinking. For example, in chess, "deep" concepts might be position-independent notions such as "capture" or "being in check"; these add additional layers of learning.
One way to see deep learning is as increased "hacking" of machine-learning algorithms. It's not as flat as a normal algorithm, and it is one step closer to the symbolic ideas from the earlier days of AI that were sidelined as the more numerically driven machine-learning field took precedence.
Attempting to model high-level abstractions in data is, well, more intelligent! This kind of deep structural learning is what could allow machine learning to make "jumps" in its learning towards a more autonomous, AI-like state. Training neural networks to the point where they can train themselves across many layers is obviously one of the end games.
Features of deep learning could give algorithms predictive capabilities, because they let you find connections between variables and package them into a single new, unified variable, a process known as "feature engineering".
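To make "feature engineering" concrete, here is a minimal, hand-rolled sketch in Python; the variables and formula are hypothetical, chosen only to show what packaging two raw variables into one unified variable looks like when a human does it by hand. The promise of deep learning is that hidden layers learn such combinations on their own.

```python
# Hypothetical raw variables, purely for illustration.
heights_m = [1.60, 1.75, 1.82]
weights_kg = [55.0, 72.0, 95.0]

# Manual feature engineering: fold the two raw variables into a single new,
# unified variable (here, body-mass index).
bmi = [w / (h ** 2) for h, w in zip(heights_m, weights_kg)]
print(bmi)  # one engineered feature per example
```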
This is all about scaling learning algorithms towards AI. The "hidden layers" add depth to the feature hierarchy, with the potential for increased abstraction: each subsequent layer acts as a filter for more and more complex features that combine those of the previous layer. In theory, there could be an "exponential" increase in the intelligence of these systems, with an ever-improving ability to handle unlabeled data and perform data-win conversions.
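To picture the hidden layers as a feature hierarchy, here is a minimal sketch in plain NumPy. The layer sizes, random weights and random data are illustrative assumptions (nothing is trained); the point is only the structure: each layer recombines the previous layer's outputs into fewer, more abstract features.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Non-linear activation applied between layers.
    return np.maximum(0, x)

def dense_layer(inputs, n_outputs):
    # One "hidden layer": each output feature is a (here untrained, random)
    # combination of all the input features.
    weights = rng.normal(size=(inputs.shape[1], n_outputs))
    return relu(inputs @ weights)

# Illustrative raw data: 5 examples, each described by 10 raw variables.
raw_data = rng.normal(size=(5, 10))

# Each successive layer filters and combines the previous layer's features.
layer1 = dense_layer(raw_data, 8)   # low-level combinations of raw variables
layer2 = dense_layer(layer1, 4)     # features of features
layer3 = dense_layer(layer2, 2)     # high-level abstractions

print(layer3.shape)  # (5, 2): ten raw variables distilled into two deep features
```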
Deep learning could afford "real-time" processing, as multi-layered internal structures could give immediate interpretations and increased "agency" to the machine-learning "black box". Where machine learning today is a black box whose "reality" is the data you put inside, deep learning will one day become a self-managing black box. In a sense, that's what the human brain is and does.
The contemporary winners of deep learning are those corporations and institutions with the processing power and GPU support to handle long training phases: Google, well-funded universities and the like.
Deep learning, as a concept, captures human learning systems in a more human way of thinking about and processing data. In 2015, we have the computational power in terms of processing speed, but the success rate of machine-learning calculations still needs to catch up with the brain's. The non-linear processing units (the intermediary layers between input and output) are essential to the composition of deep learning.
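Why those non-linear units matter can be shown in a few lines of NumPy (again with illustrative, untrained random weights): without a non-linearity between them, two stacked layers collapse into a single linear map, so no real depth is gained; with a ReLU in between, the composition is genuinely deeper.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(1, 3))    # one example with three input variables
W1 = rng.normal(size=(3, 4))   # first-layer weights (untrained, illustrative)
W2 = rng.normal(size=(4, 2))   # second-layer weights

# Without a non-linear unit, two linear layers equal one linear layer.
print(np.allclose((x @ W1) @ W2, x @ (W1 @ W2)))  # True: no depth gained

# With a non-linear unit (ReLU) between the layers, the result differs (in general).
print(np.allclose(np.maximum(0, x @ W1) @ W2, x @ (W1 @ W2)))  # False
```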
The ramifications for robotics, AI, SaaS startups and smarter apps are nearly endless. Deep learning represents an interface point between machine learning and Big Data like nothing else currently on the scene. Google invested $400 million in DeepMind in 2014. You know, Skynet is coming. All jokes aside, look up "Deep Learning" on TED Talks and stay tuned.