History of Computing - how major breakthroughs happened at the confluence of hardware and software

I have always marveled at how closely intertwined hardware and software systems are. As a professional who holds a degree in Electronics and Communication Engineering, started his career in embedded systems, and later moved to Information Technology as a whole with a focus on Intelligent Systems, I have had a ringside view of software-only aspects, hardware-only aspects and combined hardware-software aspects. The general trend in the non-embedded space is to abstract away the hardware in the form of operating systems and frameworks, so that software developers need not worry about the underlying hardware. In the same way, hardware designers need not think too deeply about all the possible software applications that their hardware can enable. This abstraction usually works well for non-embedded systems. However, anyone who has worked on embedded systems will recognize the need for hardware-software co-design, or for designing hardware-aware software, to create efficient systems.
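To make "hardware-aware software" concrete, here is a minimal sketch in C (the matrix size N and the two function names are my own illustrative choices, not from any specific system) showing how the same computation can run several times faster when the code respects the CPU's cache hierarchy: summing a matrix in the order it is laid out in memory versus against that order.

```c
#include <stdio.h>

#define N 4096
static float m[N][N]; /* C stores this row-major: m[i][0..N-1] are contiguous */

/* Cache-friendly: walks memory in layout order, so each cache line
 * fetched from RAM is fully used before it is evicted. */
float sum_row_major(void) {
    float s = 0.0f;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += m[i][j];
    return s;
}

/* Cache-hostile: jumps N * sizeof(float) bytes between accesses,
 * touching a new cache line almost every iteration; on typical
 * hardware this is several times slower despite identical results. */
float sum_col_major(void) {
    float s = 0.0f;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += m[i][j];
    return s;
}

int main(void) {
    printf("%f %f\n", sum_row_major(), sum_col_major());
    return 0;
}
```

The two functions compute the identical result; only the memory access pattern differs. That performance gap is exactly the kind of detail that hardware abstraction hides and hardware-aware design exploits.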

This got me thinking: how important is this paradigm of hardware-software co-design, or hardware-aware software design, even in the non-embedded space? Is it always prudent to separate hardware and software design and development, especially for impactful innovations?

If we look carefully into the history of computing, we see again and again that major breakthroughs happened at the confluence of path-breaking work in both hardware and software. While the individual innovations in hardware and software may have happened at different times, very much like a successful marriage, the two came together to create the right utility and impact at scale. Here are some examples.

1) George Boole, Ada Lovelace and Charles Babbage - all three worked in the first half of the 19th century. George Boole developed Boolean algebra, the very basis of digital systems; Charles Babbage designed the first programmable mechanical computer; and Ada Lovelace, working with Babbage, wrote what is widely regarded as the first computer program, for Babbage's Analytical Engine. It is really fascinating how these seemingly unrelated works in hardware and software came together to create one of the most impactful inventions in human history - the digital computer.

2) But we had to wait almost another century, until the mid-20th century, for computer technology to become implementable and create impact. This again happened through two seemingly unrelated hardware and software breakthroughs: William Shockley, together with John Bardeen and Walter Brattain at Bell Labs, invented the transistor, which made it possible to create miniaturized compute elements, while Alan Turing proposed the Turing machine, the first attempt to formalize software and algorithm development.

3) But this still did not democratize computing. It took a series of seemingly unrelated developments in hardware and software over the next three to four decades to create a scalable model of democratized, affordable computing - the personal computer, or PC. On the hardware side, Intel, led by Robert Noyce and Gordon Moore, created the 808x series of microprocessors, and Texas Instruments, where Jack Kilby had invented the integrated circuit, created the IC-based digital calculator. But it took parallel innovations in the operating systems space - Unix, the Macintosh from Apple and DOS/Windows from Microsoft - along with the subsequent standardization of the hardware by IBM, to really bring computing into everybody's home.

4) Next we saw the evolution of the Internet, followed by mobile phones as combined computing and communication devices. Both the Internet (routers, gateways, switches and base stations running a diverse set of networking and communication protocols) and mobile phones (compute and communication hardware running Android or iOS) are great examples of the same hardware-software co-design / hardware-aware software design paradigm.

5) Today we see history repeat itself in the context of Artificial Intelligence (AI) and Machine Learning (ML). Most of the software and algorithm theory of AI/ML was developed in the 1990s and early 2000s, but AI never got traction at that time because the compute hardware of the day was not capable enough for performance at scale. Only in the last decade did hardware innovations like GPUs and TPUs match the software requirements, and some great hardware-software co-design efforts in this space gave us working, deployable AI systems.

6) Going forward, we see the same hardware-software co-design / hardware-aware software design story repeating as we work on today's AIoT and Edge Computing systems and on futuristic Neuromorphic Computing and Quantum Computing systems.

Each of the examples cited above deserves a deep dive into the actual hardware-software co-design / hardware-aware software design principles employed. Maybe I will write about some of them at a later time. But there is no doubt that each of these remarkable milestones in computing history is a fascinating story of joint innovation in hardware and software. It also reminds us not to ignore the underlying hardware while writing software, and not to lose sight of the potential software applications while designing a specific piece of hardware. Do you have any other such examples? Do leave your thoughts in the comments section.

Agreed that hardware and software evolve at different speeds; Moore's law is well known. Some time back I listened to an interesting keynote (ACM97) by Nathan Myhrvold on the laws of software. He formulated four laws (on a lighter note) for software evolution. Although it was presented a long time back, they still hold true:

Law 1: Software is a gas - it expands to fit the container it is in.
Law 2: Software grows until it becomes limited by Moore's law.
Law 3: Software growth makes Moore's law possible.
Law 4: Software is only limited by human ambition and expectation.

I feel the two are complementary: without good hardware, software may not be effective, and vice versa. The key is finding the right balance.

Shibasis Ganguly

Technology consultant

2y

Very nice article. I would also like to mention, after point 4, that in 1985 ARM launched its successful RISC architecture. The associated SW tools, the open-source SW community and single-board computers have really made learning in recent times very exciting, with engineers graduating with more practical experience. It has made a significant impact on embedded computing.

Arpan Pal

Distinguished Chief Scientist and Research Area Head, Embedded Devices and Intelligent Systems, TCS Research | Senior Member, IEEE and ACM | Associate Editor, ACM TECS and Springer SNCS

2y
