Evolution of Computers
This article takes you on an intriguing journey through the history of computers and some of the groundbreaking discoveries that ultimately led to the Wintel machine (a personal computer running Windows on an Intel processor) that may be sitting on your desk or lap right now. By tracing these milestones, we can appreciate how quickly computers have evolved and how rapidly computing technology has advanced, from 5,000 years ago to the present.
Computers are the lifeblood of today's world. If they were all to suddenly turn off at once, the power grid would shut down, planes would crash, much scientific research would grind to a halt, the stock market would freeze, trucks carrying food wouldn't know where to deliver it and employees wouldn't get paid. Even many non-computer objects, like the chair or desk you are sitting at right now, are made in factories run by computers. Computing has transformed nearly every aspect of our lives, and we are living in a time likely to be remembered as the electronic age. With billions of transistors in your smartphone, computers can seem complicated, but they are simple machines that perform complex actions through many layers of abstraction. Although electronic computers are relatively new, the need for computation is not.
Early Computing
The earliest recognized computing device was the abacus, invented in *Mesopotamia around 2500 BCE. It is essentially a hand-operated calculator that helps with adding and subtracting many numbers. The abacus was created because the scale of society had grown: a village might hold thousands of people or tens of thousands of cattle. Fast forward to 1642, and the abacus evolved into the first mechanical adding machine, the Pascaline, built by the mathematician and scientist Blaise Pascal. Many other computing devices then took shape, such as the astrolabe, which enabled ships to calculate their latitude at sea, and the slide rule, which assisted with multiplication and division. There were also literally hundreds of types of clocks used to calculate sunrise, tides and even just the time.

However, none of these devices were called computers. The first use of the word 'computer' was as a job title: a computer was a person who did calculations at a workplace. The job title persisted until the late 1800s, when the meaning of the word started shifting to refer to devices. Notable among these devices was the Step Reckoner, built by Gottfried Leibniz in 1694. Leibniz said, "It is beneath the dignity of excellent men to waste their time in calculation when any peasant could do the work just as accurately with the aid of a machine." Inspired by Pascal, he created his own calculating machine able to perform all four arithmetic operations. He was also the first to lay down the concepts of binary arithmetic.

From childhood, we are taught how to do arithmetic in base 10. However, there are countless other ways to represent numbers, such as octal (base 8), hexadecimal (base 16, often used to represent colors), base 256 (used in some encodings), and so on. Binary is base 2, represented by 0s and 1s; a short sketch after the image below shows the same value written in several of these bases, and we'll later explore how binary does the job inside a PC. Unfortunately, even with mechanical calculators, most real-world problems required many steps of computation before the final answer was derived. These machines were also expensive and inaccessible to most people. So, before the 20th century, most people experienced computing through pre-computed tables compiled by those amazing 'human computers' mentioned above.
*The exact origin of the abacus is still unknown.
Leibniz's Step Reckoner
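To make the idea of number bases concrete, here is a minimal Python sketch (my own illustration, not from the original article) that prints one and the same value in several of the bases mentioned above; the value 2023 is an arbitrary choice.

```python
# The same integer written in several bases. The value is arbitrary.
value = 2023

print(bin(value))   # base 2  (binary)       -> 0b11111100111
print(oct(value))   # base 8  (octal)        -> 0o3747
print(value)        # base 10 (decimal)      -> 2023
print(hex(value))   # base 16 (hexadecimal)  -> 0x7e7

# Converting back: int() takes a string and the base it is written in.
assert int("11111100111", 2) == value
assert int("7e7", 16) == value
```

Whatever the base, the quantity itself is unchanged; only the notation differs, which is why a machine built around two-state devices can happily work in base 2.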
Speed and accuracy are important on the battlefront, so militaries were among the first to apply computing to complex problems. One difficult problem was firing artillery shells accurately over long distances under varying wind conditions and atmospheric pressure. Range tables were created that allowed gunners to look up the environmental conditions and the distance they wanted to fire; the tables would tell them the angle at which to set the cannon. The problem was that if the design of the cannon changed, a whole new table had to be computed, which was tedious and error-prone. Charles Babbage recognized this and in 1822 proposed a new mechanical device called the Difference Engine, a much more complex machine that could approximate polynomials, powered by steam and able to print its results as a table. Polynomials describe the relationship between several variables, such as range and air pressure, and can also be used to approximate logarithmic and trigonometric functions (a short sketch after the image below illustrates the finite-difference idea the engine relied on). Construction of the engine dragged on for about two decades, with an assembly of around 25,000 components weighing almost 15 tons. The project was eventually abandoned, but a machine built from Babbage's drawings worked fine when historians finished constructing it in 1991.

While the Difference Engine was under construction, Babbage imagined an even more complex machine, the Analytical Engine. Unlike the Difference Engine, this machine would be able to execute operations with conditional control, store values in memory and read instructions from *punched cards, making it a programmable mechanical computer. Also worth mentioning is Ada Lovelace, who worked very closely with Babbage. She is considered the world's first programmer and came up with an algorithm, designed to run on Babbage's machine, that could calculate the Bernoulli numbers. She also outlined ideas about data analysis, looping and memory addressing. The Analytical Engine inspired the first generation of computer scientists, who incorporated many of Babbage's ideas into their machines. This is why Babbage is considered the 'father of the computer'. You can learn more about Babbage's machine from this TEDx talk: 'The Greatest Machine That Never Was: John Graham-Cumming at TEDxImperialCollege'.
*Punched cards - cards with holes and notches cut into them, used especially for storing data, that can be sorted according to which combinations of holes are present or absent. A card with around 80 columns and 12 rows was used in early mainframe computers. It is also called a Hollerith card, after Herman Hollerith. It acts as a storage unit for information and as a way to feed data into a processing machine. A detailed explanation of punched cards is given in this video on YouTube: PUNCHED CARD DATA PROCESSING INTRODUCTION IBM 029 COMPUTER 62454.
The Analytical Engine. Image Credits: Reddit
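To give a flavor of how the Difference Engine could tabulate a polynomial using only additions, here is a small Python sketch of the finite-difference method (my own illustration, not Babbage's actual design); the polynomial 2x² + 3x + 5 is an arbitrary example.

```python
# The finite-difference idea behind the Difference Engine: once the initial
# differences of a polynomial are known, every further table entry can be
# produced using additions alone.

def difference_table(initial_differences, steps):
    diffs = list(initial_differences)   # [p(0), first difference, second difference, ...]
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Add each difference into the one above it: additions only.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# For p(x) = 2x^2 + 3x + 5: p(0) = 5, first difference = 5, second difference = 4.
print(difference_table([5, 5, 4], 6))       # [5, 10, 19, 32, 49, 70]
print([2*x*x + 3*x + 5 for x in range(6)])  # same values, computed directly
```

Once the machine holds the initial value and its first and second differences, every further table entry falls out of repeated addition alone, which is exactly the kind of operation mechanical gear wheels are good at.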
By the end of the 19th century, computers were being used for various purposes in science and engineering, but rarely for business or government tasks. However, the US government faced serious trouble with the 1890 census, which is taken once every 10 years. The population was booming due to immigration, and the 1890 census was projected to take 13 years to complete. This is when the Census Bureau turned to Herman Hollerith, who had built an electro-mechanical tabulating machine. It used mechanical systems for counting, like Leibniz's Step Reckoner, but coupled them with electrically powered components. His machine used the punched cards mentioned just above (a tiny model of such a card follows the image below) and proved to be roughly 10 times faster than manual tabulation. The census was completed in a mere two and a half years, saving the bureau a great deal of resources. At this stage, businesses too began to notice the power of computing and started using it to boost their profits. To meet this demand, Hollerith founded the Tabulating Machine Company, which later merged with other machine makers in 1924 to become the International Business Machines Corporation, or IBM, which you have probably heard of.
Top: Herman Hollerith's tabulating machine. Bottom: a punched card.
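As a rough illustration (my own sketch, not taken from the article), a card of the format described in the footnote above can be modeled as a grid of 12 rows by 80 columns in which each position either has a hole punched or not:

```python
# A toy model of an 80-column, 12-row punched card:
# each position is True (hole punched) or False (no hole).
ROWS, COLUMNS = 12, 80
card = [[False] * COLUMNS for _ in range(ROWS)]

card[3][10] = True          # punch a hole at row 3, column 10 (arbitrary example)

holes = sum(row.count(True) for row in card)
print(f"{ROWS * COLUMNS} positions, {holes} hole(s) punched")
```

In practice each column typically encoded a single character, so one card held at most 80 characters of data, which is also why 80-column text displays lingered for decades afterwards.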
The early 1900s set the stage for some big innovations in the timeline of computers, especially in the 1930s and 1940s. In 1936, Alan Turing proposed the concept of a universal machine, now known as the Turing machine, capable of computing anything that is computable. Up to this point, machines could only perform the specific tasks they were built for; the concept of the modern computer is largely based on Turing's ideas. Around the same time, the German engineer Konrad Zuse built the world's first programmable computer. This device read instructions from punched tape and used *Boolean logic and binary to make its decisions (his later machines implemented this with electromechanical relays). Zuse also used punched cards to encode information in binary, making them something like the first memory storage devices. Jumping to 1942, Zuse began work on the Z4, which would go on to become the world's first commercial computer.
*Boolean logic - a system of logic in which every value is either true or false, or, in binary terms, a one or a zero.
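As a quick illustration of the footnote above (my own sketch, not from the article), the three basic Boolean operations can be written directly in Python, where True and False play the roles of 1 and 0:

```python
# Truth table for the three basic Boolean operations.
for a in (False, True):
    for b in (False, True):
        print(f"a={int(a)} b={int(b)}  AND={int(a and b)}  OR={int(a or b)}  NOT a={int(not a)}")

# A relay machine realizes the same table physically:
# contacts wired in series behave like AND, contacts in parallel like OR.
```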
Konrad Zuse stands next to a replica of his Z3 computer at the Deutsches Museum in Munich. Zuse himself helped recreate the Z3 replica, the original having been destroyed during WWII in 1943. The museum has several of Zuse’s pioneering computers on display. Image credits: Deutsches Museum
Modern Computing
In the first half of the 20th century, the scale of human systems began to increase at an unprecedented rate. Global trade and the need for computation reached new heights, and there were even ambitions of visiting other planets. As a result, the computers of the day also grew in size and became more prone to errors, since they were still electro-mechanical. In fact, this brief period was completely dominated by relays, which were considerably slow and unreliable. As a side note, these huge, warm machines also attracted insects. In September 1947, Grace Hopper, one of the operators of a calculating machine named the Harvard Mark II, found a dead moth in a faulty relay. Ever since, when anything goes wrong with a computer, we say it has *bugs in it.
*This is the origin of the term 'computer bug'. Hopper is also credited with popularizing the word 'debugging'.
The Vacuum Tube Era
This is where modern computing really began. *Vacuum tubes are fully electronic: unlike relays, they have no moving parts, and they can implement Boolean logic far faster and more reliably. The first electronic digital computer, the ABC (Atanasoff-Berry Computer), was completed in 1942. It used vacuum tubes (unlike the relay machines built by Zuse) and could solve up to 30 equations at a time. During World War II, the Colossus, the first programmable digital electronic computer, was built with the assistance of Alan Turing to break the German *crypto codes. In 1946, the Electronic Numerical Integrator and Computer, or ENIAC, was built out of nearly 18,000 vacuum tubes and was large enough to fill a room; it is considered the first successful high-speed digital electronic computer. Meanwhile, the mathematician John von Neumann elaborated on how computers should be organized and built. He formulated what is now called the von Neumann architecture, which includes conditional branching and subroutines; instructions could now be modified, stored and coded in binary just like data (a tiny sketch after the notes and image below illustrates this stored-program idea). Von Neumann also assisted in the design of the EDVAC (Electronic Discrete Variable Automatic Computer), which was completed in 1950. It was one of the first stored-program computers and could execute about 1,000 instructions per second.
*You can watch the film 'Electronics at Work' at https://www.youtube.com/watch?v=hwutHPYGgfU to learn how vacuum tubes work.
*Not to be confused with Turing's 'Bombe', which broke Germany's 'Enigma' cipher.
Image Credits: NPTEL
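To illustrate the stored-program idea mentioned above, here is a toy Python sketch (my own, not a model of the EDVAC) in which the program and its data sit in the same memory and instructions are fetched and executed one after another; the four-instruction set (LOAD/ADD/STORE/HALT) is invented purely for illustration.

```python
# A toy stored-program machine: both the program and its data live in the
# same memory, and instructions are fetched and executed one by one.

def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Addresses 0-3 hold the program, addresses 4-6 hold the data and the result.
memory = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),  # program
    2, 3, 0,                                             # data
]
print(run(memory)[6])  # prints 5
```

Because the program lives in memory just like the data, it can in principle be inspected or even modified by another program, which is the essence of the stored-program design.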
We can now see that computing has evolved into a field of its own: from mechanical gears, to electro-mechanical relays that switched in milliseconds, to vacuum tubes that switched in microseconds; from binary as a way to encode information on punched cards, to binary combined with Boolean logic and realized in physical technologies like relays and vacuum tubes, to binary finally being used to store instructions and programs. From the abacus as a way to count, through Pascal's mechanical calculator, the theories of Leibniz, Alan Turing and John von Neumann, the vision of Babbage and the intellect of Lovelace, George Boole's contribution of Boolean logic, and the progression from programmable calculators to stored-program digital computers, countless inventions, individuals and groups each paved the way for further accumulation of knowledge. It was a joint effort spanning more than 5,000 years, and especially the years between 1800 and 1950.
The Transistor Era
Vacuum tubes were a great improvement over relays but were still problematic. For example, of the ENIAC's 18,000 vacuum tubes, around 50 would burn out and need replacing, and tubes were the reason computers were still very large and consumed huge amounts of power. In 1947, the first *transistor was invented at Bell Labs, and by 1954 the first digital computer built from transistors, TRADIC (TRAnsistorized DIgital Computer), had been constructed for the U.S. Air Force. It contained around 800 transistors, occupied just 0.85 cubic meters, drew only 100 watts of power and performed one million operations per second. In this era, many new inventions appeared on both the hardware and the software side. On the hardware side, the first random-access memory device, the magnetic core store, was introduced in 1951 by Jay Forrester; this was the beginning of what is now known as RAM. The first hard drive was introduced by IBM in 1957: it weighed about a ton, stored around 5 megabytes and cost around 27,000 dollars per month in today's money. On the software side, assembly language was introduced in 1949 as a way to communicate with the machine without writing raw machine code (binary). But the first truly widespread programming language was FORTRAN, invented by John Backus in 1954. Assembly is a *low-level language and FORTRAN is a high-level language (a short sketch after the footnotes below illustrates the difference). In the early 1950s, translating programs into machine code was still slow and expensive. This all changed when Grace Hopper developed the first computer *compiler, work that later led to COBOL (Common Business-Oriented Language). From then on, programming was far less time-consuming.
*In low-level languages, although you are not writing raw machine code, you still need a deep understanding of the computer's architecture to write a working program.
*Compiler - a program that converts instructions into machine code or another lower-level form so that they can be read and executed by a computer.
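To make the low-level/high-level distinction concrete, here is a small Python sketch (my own, not from the article) that uses the standard-library dis module to show the lower-level bytecode that Python's own compiler produces for a one-line function. Real CPU assembly looks different, but the idea is the same: one readable statement becomes several primitive instructions.

```python
import dis

# A single high-level statement...
def fahrenheit(celsius):
    return celsius * 9 / 5 + 32

# ...is translated into a sequence of primitive, machine-oriented
# instructions. The exact output varies between Python versions.
dis.dis(fahrenheit)
```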
The invention of the first transistor, a point-contact device built at Bell Labs. Source: Google Images
The Era of Integrated Circuits
This era marks the beginning of the exponential rise in computing performance. Although transistors were up to the mark, computers were still built from discrete components, i.e. every part had to be physically soldered into place. As a result, the more complex computers became, the more connections had to be made between components, which in turn increased the number of wiring faults. Jack Kilby of Texas Instruments changed the picture entirely by introducing the integrated circuit (IC), a way to pack many transistors onto a single chip. This also reduced power consumption, since far less external wiring was required. Integrated circuits were the start of a hardware revolution and enabled many other developments, such as the mouse, invented by Douglas Engelbart in 1964; he also demonstrated the first graphical user interface, or GUI. Computer speed, performance, memory and storage kept increasing as ICs packed more and more transistors into smaller areas. The floppy disk was introduced by IBM in 1971, and DRAM (dynamic RAM) was introduced by Intel around the same time. On the software side, many programming languages appeared, such as BASIC in 1964 and C in the early 1970s at AT&T Bell Labs. As you can see, from 1900 onward computing evolved at a very fast rate. In 1965, Gordon Moore, later one of the founders of Intel, made one of the great predictions in technology: computing power would double roughly every two years at lower cost, and computers would become so small that they could be embedded into homes, cars and what he referred to as personal portable communication equipment (today's mobile phones). A rough back-of-the-envelope sketch below shows what this doubling implies.
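Here is that back-of-the-envelope sketch (my own, not from the article): starting from an assumed 2,300 transistors, roughly the figure for Intel's first microprocessor in 1971, it simply doubles the count every two years. Both the starting point and the end year are illustrative assumptions, not measured data.

```python
# Doubling every two years from an assumed 1971 starting point (2,300
# transistors, roughly Intel's first microprocessor); purely illustrative.
start_year, start_count = 1971, 2_300

for year in range(start_year, 2022, 10):
    doublings = (year - start_year) // 2
    count = start_count * 2 ** doublings
    print(f"{year}: ~{count:,} transistors")
```

The point is not the exact numbers but the shape of the curve: a steady doubling turns a few thousand transistors into tens of billions within a working lifetime.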
IC complexity over the years. Source: NPTEL
It doesn't look like much, but this 7/16 by 1/16 inch slab of germanium would change the world forever. Courtesy of Texas Instruments.
That's it for this article. By now you should have a fair picture of the technological advances that gave modern computers their power. You can find links to further resources throughout the post. Stay tuned for more articles on electronics and computer networking. Have fun!!