The Future of Computing
Beyond Moore's Law into a New Era
Moore's Law has now reigned for fifty years, and, just as predicted, the number of transistors per chip has grown exponentially, powering an information revolution. But we are approaching physical constraints under which that exponential scaling cannot continue: soon it will no longer be possible to shrink transistors further, and Moore's Law as we know it will end. Microprocessor clock speeds already saturated around 2005, and while Moore's Law kept chugging along despite that performance plateau, it became clear that a new paradigm would be needed to keep computing technology advancing at an exponential pace.
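As a rough illustration of what that exponential trajectory means in numbers, here is a back-of-the-envelope sketch, not a precise model. It assumes the classic formulation of a doubling roughly every two years and uses the 1971 Intel 4004 (about 2,300 transistors) as a starting point; both figures are illustrative assumptions, not data from the report discussed below.

```python
# Back-of-the-envelope sketch of Moore's Law-style doubling (illustrative only).
# Assumptions: ~2,300 transistors in 1971 (Intel 4004) and a doubling period
# of roughly two years.

BASE_YEAR = 1971
BASE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2.0

def projected_transistors(year: int) -> float:
    """Project transistor count per chip assuming steady doubling."""
    elapsed = year - BASE_YEAR
    return BASE_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

for year in (1971, 1985, 2000, 2016):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Run forward to 2016, this simple doubling rule lands in the tens of billions of transistors per chip, which is why even small deviations from the doubling period compound into enormous differences over a decade.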
This doesn't mean the end of the information revolution. The price-performance of computing will continue its exponential advance as new forms of processing are developed. Humans have a knack for devising just-in-time solutions, and such solutions have sustained the exponential growth of information technologies for more than a century.
In "Rebooting Computing: Developing a roadmap for the Future of the Computer Industry" the IEEE Computer Society suggests that the world technical community must rethink computing completely in light of the end of Moore's law as we know it. A holistic approach is considered, taking a kaleidoscopic view of the industry through multiple lenses and covering all aspects of computing. The paper suggests that the next decade will see a rebirth of throughout the computer industry through a complete redesign from top to bottom of both hardware and software. This reformulation will allow a continuation of the exponential growth of data processing capacity while maintaining an industrial and information revolution. At the center of the IEEE effort is the convergence of two complementary initiatives, these are the:
- the IEEE Rebooting Computing Initiative (RCI), and
- the International Roadmap for Devices and Systems (IRDS)
The term "Rebooting Computing" was coined by IEEE Life Fellow Peter Denning as part of his National Science Foundation-sponsored initiative to revamp computing education. Tom Conte also independently began thinking in these terms, and this became the inspiration for a 2012 IEEE Future Directions working group to rethink the computer. After getting agreement from Peter Denning, the working group took the name IEEE Rebooting Computing Initiative (RCI).
The IRDS represents the next phase of work that began with the partnership between the IEEE RCI and the International Technology Roadmap for Semiconductors 2.0 (ITRS 2.0). Now with the launch of the IRDS program, IEEE is taking the lead in building a comprehensive, end-to-end view of the computing ecosystem, including devices, components, systems, architecture, and software.
It was decided that any change had to be fundamental and incorporate changes all the way up the computing stack: from the device level, to circuits, to architecture, on up to algorithms and the applications themselves. The von Neumann architecture could no longer simply be propped up; everything in the computer needed to be rethought from top to bottom. To realize this vision, the RCI identified three pillars of future computing: Security, Energy Efficiency, and the Human-Computer Interface/Applications.
IEEE created the International Conference on Rebooting Computing (ICRC), building on a series of four RCI Summits. The first, dubbed RCS 1, was held in Washington, DC in December 2013, and included thought leaders from major government agencies, the White House Office of Science and Technology Policy, industry giants, and accomplished academics. That exercise produced the three pillars. RCS 2 and RCS 3 followed in the Silicon Valley area.
RCS 2 looked at new engines of computing, while RCS 3 dove into security and algorithms. RCS 4 focused on several alternative tracks, including Extending Moore's Law through novel devices (such as tunneling FETs, memristors, spintronic elements, and carbon nanotubes), Approximate/Probabilistic Computing, Neuromorphic Computing, and Superconducting Computing. Finally, the first IEEE International Conference on Rebooting Computing (ICRC) was held in San Diego, California, on October 17-19, 2016.
This work culminated in the Rebooting Computing report and the realization that a new way of computing is needed to continue the advance of technology, and that to keep improving performance we will need to reboot computer technology itself at every level. This does not require discarding everything we have developed, but rather taking a second look at a variety of evolutionary and even revolutionary approaches that offer ways through and around the current limitations.
In May 2016, IEEE launched the IRDS, sponsored by the Rebooting Computing Initiative in consultation with the IEEE Computer Society, building on the partnership formed between the IEEE RCI and ITRS 2.0 in early 2015.
The IRDS plans to deliver a fifteen-year vision encompassing systems and devices, setting a new direction for the future of the semiconductor, communications, networking, and computer industries. Ultimately this may create a new kind of Moore's Law around computing performance and help accelerate the arrival of new computing technologies in the market.
The Future of Computing
For the past few years, like many other organizations, the IEEE Computer Society has made predictions about technology trends for the coming year. After reviewing its 2016 predictions, the IEEE Computer Society has laid out nine technology trends that it expects to reach adoption during 2017. They are:
1. Industrial IoT
With many millions of IoT sensors deployed in dozens of industrial-strength, real-world applications, this is one of the largest and most impactful arenas for big data analytics in 2017.
2. Self-driving Cars
In Silicon Valley, one can easily see up to three self-driving cars on the same street. While adoption for general use is less likely in the near term, broader adoption will probably occur first in constrained environments such as airports and factories.
3. Artificial Intelligence, Machine Learning, Cognitive Computing
These overlapping areas are a fundamental requirement for big data analytics and for other areas of control and management. Machine learning, and deep learning in particular, are quickly transitioning from research lab to commodity products. On the software side, advanced engines and libraries from industry leaders, such as Facebook and Google, are making it to open source. On the hardware side, we see continually improving performance and scalability from existing technologies (CPUs and GPUs), as well as emerging accelerators. Consequently, writing domain-specific applications that can learn, adapt, and process complex and noisy inputs in near real time is easier than ever and a wide range of new applications is emerging.
4. 5G
While it is unlikely that 5G will see immediate adoption in the next year, its roadmaps and standards are being developed, influencing the applications that will eventually evolve. Some early deployment use cases are also being pursued.
5. Accelerators
While in the long term the end of Moore's Law is being addressed by the novel technologies covered by Rebooting Computing, in the near term heterogeneous computing built on accelerators makes it possible to stretch the performance boundaries of today's technologies.
6. Disaggregated Memory – Fabric-attached Nonvolatile Memory (NVM)
While NVM has achieved mixed success in productization over the past year, the number of companies working in this arena, whether on materials, architecture, or software, makes it a strong candidate for imminent adoption. Fast, nonvolatile storage bridges the gap between RAM and SSDs, with a performance-cost ratio lying somewhere in between. It will initially be configured either as "a disk," accessed by the OS like any other permanent storage device, or as "RAM" in DIMM slots, accessed by the OS as memory (a minimal sketch of both access modes appears after this list). Once the hardware and OS support is fully worked out, this technology will open the door to applications that aren't possible today.
7. Sensors Everywhere and Edge Compute
From smart transportation and smart homes, to retail innovations, surveillance, sports and entertainment, and industrial IoT, we are starting to see intelligence being aggressively deployed at the edge. With intelligence comes the need to compute at the edge, and a variety of edge compute offerings are opening up new disruptive opportunities.
8. Blockchain (beyond just Bitcoin)
While commonly known as the technology behind Bitcoin, blockchain has far more disruptive uses, potentially changing the way we implement processes such as voting, financial transactions, title and ownership, anti-counterfeiting, and digital rights management, securing these processes without the need for (and bottleneck of) a central authority; a toy hash-chain sketch after this list shows where that tamper evidence comes from.
9. Hyper-converged Systems
Also known as “software-defined everything,” hyper-converged systems are bundles of hardware and software that contain elements of compute, storage and networking together with an orchestration system that lets IT administrators manage them using cloud tools and dev/ops practices. While they have been on the roadmap for major IT players for the last three to five years, there are major adoption trends that may cause their growth to explode in 2017.
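Following up on trend 6, here is a minimal sketch of the two access modes for fast nonvolatile memory described above. It assumes a Linux system on which an administrator has already configured a persistent-memory region; the file paths are hypothetical examples, not real defaults.

```python
# Minimal sketch of the two ways fabric-attached NVM is typically exposed.
# Assumptions: Linux, a persistent-memory region already set up, and target
# files that already exist with sufficient size. Paths are hypothetical.

import mmap
import os

# Mode 1: "a disk" -- the OS presents NVM as a block device; applications use
# ordinary file I/O through the filesystem layered on top of it.
def write_as_storage(path: str, payload: bytes) -> None:
    with open(path, "wb") as f:
        f.write(payload)
        f.flush()
        os.fsync(f.fileno())  # ask the OS to push the data to persistent media

# Mode 2: "RAM" -- NVM in DIMM slots is mapped directly into the address space
# (for example, a file on a DAX-mounted filesystem), so plain loads and stores
# bypass the block layer entirely.
def write_as_memory(path: str, payload: bytes) -> None:
    fd = os.open(path, os.O_RDWR)
    try:
        with mmap.mmap(fd, len(payload)) as region:
            region[: len(payload)] = payload  # ordinary memory writes
            region.flush()                    # msync so the stores reach NVM
    finally:
        os.close(fd)

# Hypothetical usage:
# write_as_storage("/mnt/pmem/log.bin", b"record")
# write_as_memory("/mnt/pmem/table.bin", b"record")
```

The interesting design point is exactly the one the trend calls out: the first mode needs no application changes at all, while the second removes the I/O stack from the critical path but forces applications and the OS to reason about persistence of individual memory writes.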
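And following up on trend 8, here is a toy hash-chain in Python illustrating why a blockchain is tamper-evident without a central authority. This is purely illustrative; production blockchains add consensus protocols, digital signatures, and peer-to-peer replication on top of this basic idea.

```python
# Toy hash-chain sketch: each block commits to the hash of the previous block,
# so altering any past record invalidates every hash that follows it.

import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    index: int
    data: str
    prev_hash: str

    def hash(self) -> str:
        payload = json.dumps(
            {"index": self.index, "data": self.data, "prev": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list, data: str) -> None:
    prev = chain[-1].hash() if chain else "0" * 64
    chain.append(Block(index=len(chain), data=data, prev_hash=prev))

def is_valid(chain: list) -> bool:
    # Every block must reference the recomputed hash of its predecessor.
    return all(
        chain[i].prev_hash == chain[i - 1].hash() for i in range(1, len(chain))
    )

chain: list = []
for record in ("vote: Alice", "title transfer: lot 7", "payment: 10 units"):
    append(chain, record)

print(is_valid(chain))           # True
chain[0].data = "vote: Mallory"  # tamper with history...
print(is_valid(chain))           # False -- the chain no longer verifies
```

Because any participant can recompute the hashes, verification does not depend on trusting a single record keeper, which is the property that makes the use cases listed in trend 8 plausible.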
I think they have done a very good job of laying out the technologies that are likely to achieve widespread adoption this year, and this will create a strong foundation for the infrastructure that will launch us into a new era. Over the next three years all of technology will be affected as computing is radically redesigned. The exponential growth of technology will continue, and we will see amazing possibilities come to fruition. And quite possibly, as Ray Kurzweil predicts, in the five-to-ten-year time frame we will see a convergence of technology and biology alongside the creation of human-level artificial intelligence.
The talks below were given at the January 2017 Asilomar conference (also called Beneficial AI) organized by the Future of Life Institute. It was there that the Asilomar AI Principles were developed, ranging from research strategies to data rights to future issues including potential super-intelligence, and signed by over 2,500 leaders in the field.
Ray Kurzweil gives a historical perspective on the advance of computing technology over the past fifty years. He credits deep neural nets, together with the law of accelerating returns and the exponential progress of information technology, for the recent advances in artificial intelligence. As the price-performance of computing continues to grow exponentially, we will see new paradigms emerge.
Yann LeCun proposes that supervised (deep) learning is insufficient for real AI; we build models of the world through predictive unsupervised learning, which means predicting any part of past, present, or future percepts from whatever information is available. Generative Adversarial Networks (GANs) show promise for learning such predictive models.
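To make the GAN idea concrete, here is a minimal sketch, assuming PyTorch is available. The generator learns to produce samples resembling a simple one-dimensional Gaussian by trying to fool a discriminator; the network sizes, data distribution, and hyperparameters are arbitrary illustrative choices, not anything LeCun prescribed.

```python
# Minimal GAN sketch (illustrative only): a generator learns to mimic a 1-D
# Gaussian data distribution; a discriminator learns to tell real from fake.

import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n: int) -> torch.Tensor:
    # "Real" data: samples from N(mean=3.0, std=0.5)
    return 3.0 + 0.5 * torch.randn(n, 1)

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # noise -> sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # sample -> logit

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()
batch = 64

for step in range(2000):
    # Discriminator step: push real samples toward label 1, fakes toward 0.
    real = real_batch(batch)
    fake = G(torch.randn(batch, 8)).detach()
    loss_d = bce(D(real), torch.ones(batch, 1)) + bce(D(fake), torch.zeros(batch, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: try to make the discriminator label fakes as real.
    fake = G(torch.randn(batch, 8))
    loss_g = bce(D(fake), torch.ones(batch, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

samples = G(torch.randn(1000, 8))
print(f"generated mean ~ {samples.mean().item():.2f}, std ~ {samples.std().item():.2f}")
```

The adversarial setup is the point: neither network is given labeled answers about what the data "should" look like, yet the generator ends up modeling the data distribution, which is the sense in which GANs are a candidate tool for the predictive, unsupervised learning LeCun describes.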
Taking into consideration the exponential growth in the price/performance of information technologies, along with the new infrastructure made available by advancing computing hardware, we are going to see a very different world three years from now. The biggest obstacles are cultural, not technological. Society will need to adapt to a new reality in which machines play a larger role in everything. This is going to change the way we live, work, and play. Do not be afraid of the future; embrace it.