How humanity got into programming through 20 quick checkpoints [it took a darn century]
AI-generated image w/ sci-fi theme | Source: Bing Image Creator (powered by DALL·E; prompted by Doron Brayer)

Welcome to the first sprint in the history of computer science. It took humanity a whole century to reach the point where we can sit back and watch twenty-somethings play games and make a fortune out of it (not that I'm complaining, I'm a long-time gamer myself... just not a twenty-something). Back then, we didn't have the luxury of scrum and agile methodologies, so progress was not so rapid. However, humanity's "waterfall model" proved to be quite alright, as we eventually got world-changing innovations.

This is my first ever article, but don't let that deter you! It's totally worth your time! You'll meet the trailblazers of CS (not Counter-Strike, but Computer Science, silly you), each with their own innovative devices and genius minds. They may not wear spandex or swing around between skyscrapers, but they're still the superheroes of computer science—with a math notebook in their pocket. Each of these devices is like Iron Man's suit, but unwearable, uncolorful, unfriendly—yet unreal (in a good way).

HEADS UP! The analytical engine of Charles Babbage and the mathematical notes of Ada Lovelace have been excluded from the main content (the big 20). Although Babbage's analytical engine was groundbreaking, it was never completed, and Lovelace sadly never got to run her pioneering program on a finished machine.

Boole's algebra of logic, 1854

George Boole, an English mathematician and philosopher, introduces his algebra of logic via his paradigm-shifting book An Investigation of the Laws of Thought in November 1854 at Queen's College in Cork, Ireland, where he is a professor of mathematics. His system uses mathematical symbols and equations to represent logical statements, and allows for the manipulation and simplification of complex logical expressions. Boole's work lays the foundation for modern digital electronics and computer science by providing a rigorous mathematical framework for reasoning about logic circuits and algorithms. Boole's logic continues to be used today in the design of digital circuits and computer algorithms, and he is recognized as one of the fathers of computer science.
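
To make Boole's idea concrete, here is a tiny Python sketch of my own (not Boole's notation) that checks two laws of his algebra, idempotence and De Morgan's law, by brute force over all truth values:

# Checking two laws of Boolean algebra by exhaustive truth-table search
from itertools import product

def equivalent(f, g, variables=2):
    # Two Boolean expressions are equal iff they agree on every assignment of truth values
    return all(f(*values) == g(*values) for values in product([False, True], repeat=variables))

# Idempotence: x AND x = x
print(equivalent(lambda x, y: x and x, lambda x, y: x))                         # True

# De Morgan's law: NOT (x AND y) = (NOT x) OR (NOT y)
print(equivalent(lambda x, y: not (x and y), lambda x, y: (not x) or (not y)))  # True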

Cover of the book An Investigation of the Laws of Thought, by George Boole
The original cover of the paradigm-shifting book "An Investigation of the Laws of Thought" by George Boole, published in 1854 | Source: Wikimedia Commons

FUN FACTS

FF01: In Boole's opinion, his own algebra of logic belonged to the field of philosophy, as it was inspired by his deep interest in that field. He had a desire to find a systematic method for expressing and manipulating logical concepts. It was only later that his work became foundational to the field of computer science.

FF02: Boole was a self-taught linguist who mastered Latin, Greek, French, German, and Italian by the age of 14. He also had a passion for education and founded two schools in Lincoln when he was in his twenties. He was not only a brilliant mathematician, but also a polyglot and a teacher.

Jevons' logic piano, 1866

William S. Jevons, an English economist and logician, finishes designing his logic piano in 1866 while teaching at Owens College in Manchester, England. The piano is a mechanical device that uses keys and switches to perform logical operations, and it can mechanically work out the conclusions that follow from a set of logical premises. Although it's not practical for general-purpose computing, it demonstrates the potential of using machines to automate logical operations and paves the way for future developments in computing.

Replica of Jevons' logic piano that was made for an exhibition at the Powerhouse Museum in Sydney in 2004. Photo taken in January 2013 | Source: Flickr

FUN FACTS

FF01: Jevons was not only a logician and an economist, but also a meteorologist and a photographer. He studied the weather patterns in Australia and England, and he invented a device called the "logic of variation" to measure the variations of the barometer. He also took many photographs of his family, friends, and colleagues, some of which are preserved in the archives of the University of Manchester.

FF02: Jevons' logic piano was inspired by Boole's algebra of logic: it implemented Boolean logic mechanically, letting the machine work out certain logical problems faster than a human could.

Hollerith's tabulating machine, 1888

Herman Hollerith, a German-American inventor and statistician, completes the development of his tabulating machine in 1888 while working for the U.S. Census Bureau in Washington DC. The machine uses punched cards to store and process information, and is later used for the 1890 U.S. census, dramatically reducing the time required to tabulate the data. Hollerith's machine marks the beginning of the era of automated data processing and is a key precursor to modern computers.

Reconstructed model of Hollerith's tabulating machine commissioned by IBM Italia and exhibited at Museo della Scienza e della Tecnica di Milano. While based on the original design, this model is not identical to the one used for the 1890 U.S. Census, as it features only three components | Source: Wikimedia Commons

FUN FACTS

FF01: Hollerith's tabulating machine was inspired by the punched tickets used by train conductors. Hollerith was looking for a way to automate the processing of census data, which took too long by hand. He realized that he could use punched cards to represent different categories of information, such as age, gender, and occupation. He then invented a machine that could read the holes in the cards and count them using electrical circuits and mercury cups.
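
As a rough illustration of that idea, here is a toy Python sketch of my own (not Hollerith's actual card layout or circuitry) in which each card is just the set of hole positions punched for one person, and the "tabulator" advances one counter per sensed hole:

# Toy punched-card tally in the spirit of Hollerith's tabulating machine
from collections import Counter

# Each "card" is the set of hole positions punched for one person;
# each position stands for a category such as gender or occupation.
cards = [
    {"male", "farmer"},
    {"female", "teacher"},
    {"male", "teacher"},
]

tally = Counter()
for card in cards:
    tally.update(card)           # advance one counter for every hole sensed on the card

print(sorted(tally.items()))     # [('farmer', 1), ('female', 1), ('male', 2), ('teacher', 2)]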

FF02: The success of Hollerith's tabulating machine in processing the US census data of 1890 saved the US government $5 million in the cost of conducting the census and reduced the time taken to process the data from 7 years for the 1880 census to just 1 year for the 1890 census. This was a significant milestone in the history of computing and data processing.

Gödel's incompleteness theorems, 1931

Kurt F. Gödel, an Austrian mathematician and philosopher, publishes his incompleteness theorems in 1931 and later presents and refines them through lectures and discussions at the Institute for Advanced Study in Princeton, New Jersey, where he is a visiting scholar. The Institute provides a vibrant academic environment and a community of scholars for Gödel to engage with. The theorems demonstrate that no consistent formal system capable of expressing basic arithmetic can prove every true statement about the natural numbers (nor prove its own consistency), and they have profound implications for the foundations of mathematics and logic. Gödel's work has a significant impact on the development of computer science by showing that there are fundamental limits to what can be proved, and ultimately computed, by formal methods and machines.

A proof of Gödel's second incompleteness theorem in GL logic
A proof of Gödel's second incompleteness theorem in GL logic | Source: Wikimedia Commons
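
For the mathematically inclined, here is a compressed sketch in LaTeX notation (my own rendering, not the figure's full derivation) of how the second incompleteness theorem falls out of the provability logic GL mentioned in the caption, where \Box p reads "p is provable":

% Löb's axiom of GL, instantiated at p = \bot (falsity):
\Box(\Box\bot \to \bot) \to \Box\bot
% Abbreviating consistency as Con := \neg\Box\bot and taking the contrapositive:
\mathrm{Con} \to \neg\Box\,\mathrm{Con}
% i.e., if the system is consistent, then it cannot prove its own consistency.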

FUN FACTS

FF01: Gödel was a close friend of Albert Einstein, and they often walked together at the Institute for Advanced Study in Princeton. Einstein once said that he came to the institute "just to have the privilege of walking home with Gödel". They also shared an interest in philosophy and physics, and Gödel discovered some surprising solutions to Einstein's equations of general relativity.

FF02: At the time of publication, Gödel was only 25 years old. He had recently completed his doctorate at the University of Vienna, and he first announced his results at a conference in Königsberg in 1930. His theorems shocked the global mathematical community.

Bush's differential analyzer, 1931

Vannevar Bush and his team complete their differential analyzer in May 1931 at the Massachusetts Institute of Technology (MIT). The differential analyzer is an analog computing machine that uses gears and shafts to solve differential equations, and is the first machine capable of solving a wide range of scientific and engineering problems. The differential analyzer is a key tool for scientists and engineers in the pre-digital era and lays the groundwork for modern scientific computing.
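
To get a feel for what the machine did with its wheel-and-disc integrators, here is a tiny Python sketch of my own (an illustration, not Bush's actual setup) that integrates a damped oscillator, x'' + 0.5x' + x = 0, one small time step at a time:

# Step-by-step integration of x'' + 0.5*x' + x = 0, the kind of job a differential analyzer did mechanically
dt = 0.01
x, v = 1.0, 0.0                  # initial position and velocity
for step in range(1000):
    a = -0.5 * v - 1.0 * x       # acceleration given by the differential equation
    v += a * dt                  # one "integrator" accumulates velocity
    x += v * dt                  # a second "integrator" accumulates position
print(round(x, 4), round(v, 4))  # state of the system after 10 simulated seconds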

The original Cambridge differential analyser which was heavily inspired by Bush's differential analyzer
Cambridge differential analyser, one of the most important analog computers ever created, built by Hartree and Porter based on the design of Bush at MIT, exhibited at the University of Cambridge in the 1930s/1940s (unspecified year) | Source: Wikimedia Commons

FUN FACTS

FF01: The differential analyzer was used to calculate the trajectory of the first atomic bomb test in 1945, as part of the Manhattan Project. Scientists needed to know how high the bomb should explode to maximize its destructive power. They used the differential analyzer at the University of Pennsylvania (Penn) to solve the complex equations that described the motion of the bomb and the shock wave it would produce. The machine took several hours to produce a single answer, but it was more accurate and reliable than any other method available at the time.

FF02: Bush's differential analyzer was the inspiration for the fictional "thinking engine" in the novel The Difference Engine by William Gibson and Bruce Sterling. The novel is set in an alternate history where Charles Babbage succeeded in building his analytical engine and sparked the Industrial Revolution in the 19th century.

Church's lambda calculus, 1936

Alonzo Church, an American mathematician and logician, introduces his lambda calculus in April 1936 at Princeton University, New Jersey, where he is a professor of mathematics. The lambda calculus is a formal system of mathematical logic that allows the definition and manipulation of functions, and has applications in programming language theory and computer science. Church's work provides a theoretical foundation for functional programming languages and helps establish computer science as a distinct field of study.

# The rule of β-reduction in lambda calculus:
# applying a function (λx.M) to an argument N replaces every free occurrence of the parameter x in M with N.

(λx.M) N → M[x := N]
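
Lambda calculus lives on directly in modern languages. Here is a small Python sketch of my own showing Church numerals, where a number is nothing but a function and addition is built purely out of function application:

# Church numerals: the number n is "apply a function f, n times"
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))               # successor: apply f one extra time
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # addition: apply f m times, then n times

def to_int(n):
    # Convert a Church numeral to an ordinary integer by counting applications of f
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))   # 5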

FUN FACTS

FF01: Church's lambda calculus was originally developed as a mathematical tool for studying functions and their computability. It ended up being an inspiration for functional programming languages like Lisp, Scheme, and Haskell. In fact, many programmers credit lambda calculus as one of the foundations of modern computer science and programming language theory. So, next time you use functional programming, which is pretty common these days, you can thank Mr. Church!

FF02: Alonzo Church was a close friend and mentor of Alan Turing, the father of modern computer science. Church supervised Turing's doctoral studies at Princeton University in the 1930s, and the two exchanged ideas on logic and computability. Working independently, they also arrived at equivalent notions of computability, a correspondence now known as the Church-Turing thesis, which states that any function that can be computed by an algorithm can also be computed by a Turing machine or expressed in the lambda calculus.

Turing's universal machine, 1936

Alan M. Turing, an English mathematician and computer scientist, introduces his universal machine in November 1936 at the University of Cambridge, England. The universal machine is a purely theoretical device that can simulate any other computing machine, and it demonstrates that any computable function can be computed by a machine. Later, it serves as the conceptual basis for all subsequent digital computers, which share the machine's basic scheme of an input/output device, memory, and CPU.
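
To make the idea tangible, here is a minimal Turing machine simulator in Python (a toy of my own, not Turing's original formulation), running a one-state machine that flips every bit on its input tape:

# A minimal Turing machine: rules map (state, symbol) -> (new symbol, move, new state)
def run(tape, rules, state="start", pos=0, blank="_"):
    cells = dict(enumerate(tape))                    # a sparse tape indexed by position
    while state != "halt":
        symbol = cells.get(pos, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[pos] = new_symbol
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Bit-flipping machine: invert 0s and 1s, halt at the first blank cell
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run("10110", rules))   # 01001_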

Replica of Manchester Mark I (which was an early implementation of a Turing Machine) at the Museum of Science and Industry in Manchester. Photo by Chris Burton (1998), digitally processed to resemble a 1940s photo | Source: The University of Manchester

FUN FACTS

FF01: In 2012, two researchers in the Netherlands built a Lego Turing machine using a single Lego Mindstorms NXT set and displayed it at Centrum Wiskunde & Informatica (CWI) in Amsterdam as part of a Turing centenary exhibition.

FF02: Turing's universal machine was never actually built by Turing himself, as it was no more than a theoretical concept. He only described it in mathematical terms and showed how it could perform any computation that is possible by any other Turing machine. However, later researchers have built physical models or software simulations of Turing's universal machine to demonstrate its functionality and significance.

Shannon's master's thesis, 1937

Claude E. Shannon, an American mathematician and electrical engineer, completes his master's thesis A Symbolic Analysis of Relay and Switching Circuits in August 1937 at the Massachusetts Institute of Technology (MIT). It demonstrates that Boolean algebra can be used to simplify and analyze complex electrical circuits, and lays the groundwork for digital circuit design. Shannon's work has a profound impact on the development of computer hardware and provides a formal framework for designing and analyzing digital circuits.

“Any circuit is represented by a set of equations, the terms of the equations corresponding to the various relays and switches in the circuit.”—Claude E. Shannon, A Symbolic Analysis of Relay and Switching Circuits, Master's Thesis, Massachusetts Institute of Technology, 1937
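
Shannon's mapping is easy to demonstrate: switches wired in series behave like AND, switches wired in parallel like OR. Here is a tiny Python sketch of my own (not taken from the thesis) of a circuit that conducts when switch a is closed and either b or c is closed:

# Series = AND, parallel = OR: Shannon's correspondence between switching circuits and Boolean algebra
def circuit(a, b, c):
    # switch a in series with the parallel pair (b, c)
    return a and (b or c)

# Tabulate the circuit's behaviour, i.e. print its truth table
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            print(a, b, c, "->", int(circuit(a, b, c)))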

FUN FACTS

FF01: Claude Shannon was a distant cousin of Thomas Edison. Shannon admired Edison from childhood and loved to tinker with mechanical devices, and he later learned that they were both descendants of John Ogden, an early settler in New Jersey. Shannon followed in his cousin's footsteps and became an inventor himself, creating gadgets such as a flame-throwing trumpet and a rocket-powered Frisbee. He also invented information theory, which is the basis of modern communication.

FF02: Shannon's master's thesis is considered to be a landmark achievement in the field of electrical engineering and is widely recognized as one of the most important works of the 20th century.

Zuse's Z3, 1941

Konrad E. O. Zuse, a German civil engineer and computer scientist, completes his Z3 computer in May 1941 at his workshop in Berlin, Germany. The Z3 is the first programmable, general-purpose digital computer, and uses binary arithmetic and floating-point numbers to perform calculations. Although the Z3 is not widely known at this time, it is a significant achievement in the development of computing and lays the groundwork for the modern digital computer.
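
The Z3's two big ideas, binary representation and floating point, are easy to sketch. Here is a generic decomposition of a number into sign, exponent, and mantissa in Python (purely illustrative; it is not the Z3's actual word layout):

# Decompose a number so that value = sign * mantissa * 2**exponent
import math

def to_float_parts(value):
    sign = -1 if value < 0 else 1
    mantissa, exponent = math.frexp(abs(value))   # mantissa lies in [0.5, 1)
    return sign, exponent, mantissa

print(to_float_parts(6.5))     # (1, 3, 0.8125)  because 6.5  = 0.8125 * 2**3
print(to_float_parts(-0.25))   # (-1, -1, 0.5)   because 0.25 = 0.5 * 2**-1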

Replica of Zuse's Z3, arguably the world's first true computer, which was destroyed during WWII. The replica is on display at the Deutsches Museum in Munich, 1960 | Source: Konrad Zuse Internet Archive / Deutsches Museum / DFG (via ingenieur.de)

FUN FACTS

FF01: Despite being the world's first programmable digital computer, the Z3 got no love from the Nazis. They ignored it, underfunded it, and let it collect dust in a Berlin basement. It was used by only one engineer who had a thing for math and aviation. The Allies put an end to its misery by bombing it to smithereens in 1943.

FF02: Konrad Zuse also created a programming language called Plankalkül, which he used, among other things, to sketch out chess-playing routines. He developed it during WWII, but it remained largely unpublished until 1972.

Atanasoff–Berry Computer (ABC), 1942

John V. Atanasoff and Clifford E. Berry, both American electrical engineers and physicists, complete their Atanasoff-Berry Computer (ABC) in June 1942 at Iowa State University (ISU) in Ames, Iowa. It uses electronic switches (vacuum tubes) and binary arithmetic to perform high-speed calculations, and is the first computer to calculate with electronic components, paving the way for modern digital computing. The machine's use of binary digits, electronic logic, and regenerative capacitor memory makes it a significant milestone in computer history.

Atanasoff-Berry Computer, world’s 1st electronic digital computer, built by Iowa State physics professor John Atanasoff and electrical engineering graduate student Clifford Berry | Source: Iowa State University of Science and Technology

FUN FACTS

FF01: The ABC was almost lost to history twice. The first time was when it was dismantled and discarded by Iowa State College in the late 1940s, after Atanasoff had left for wartime research. The second time was when it was all but forgotten until a patent dispute over the ENIAC computer brought it back to light; in 1973 a U.S. District Court ruled that Atanasoff was indeed the inventor of "the first electronic digital computer". As a tribute, it was rebuilt in 1997 by a team of researchers at Iowa State University (ISU), about 55 years after its original completion.

FF02: The ABC was never patented or commercialized, and Atanasoff and Berry never made any money from their invention.

Flowers' Colossus Mark I, 1943

Tommy H. Flowers and his team complete the Colossus Mark I at the Post Office Research Station in late 1943; it is installed at the Government Code and Cypher School at Bletchley Park, England, and put to work in early 1944. The Colossus Mark I is the world's first programmable electronic digital computer, and it is used to decipher messages encrypted by the German Lorenz cipher during World War II, helping to significantly shorten the war. It is built from roughly 1,600 vacuum tubes (the later Mark II used about 2,400) and reads enciphered teleprinter tape at about 5,000 characters per second, with its programs set up via switches and plug panels rather than stored instructions. The completion of Colossus marks the beginning of the electronic computing era, even though decades of secrecy limited its direct influence on later machines.

A rebuilt model of the original Colossus machine (which was destroyed in the 1960s) at Bletchley Park | Source: Flickr

FUN FACTS

FF01: Tommy Flowers was a brilliant, ambitious engineer who designed the Colossus Mark I at the Post Office Research Station in London, using his expertise in electronics and telephone exchanges. He paid for many of the parts and components himself, spending about £1,000 of his own savings. He built the machine with his team, delivered it to Bletchley Park in pieces, and persuaded his colleagues to help him transport and assemble it there. Unfortunately, because of wartime secrecy, his great contribution went largely unrecognized for decades.

FF02: Colossus Mark I was so top secret during WWII that the British government went to great lengths to keep its existence hidden from both the Germans and the British public. To ensure that the Germans wouldn't discover the Colossus, the British referred to it as "the telephone system" in official communications. Additionally, even after the war ended and the Colossus was no longer needed for military purposes, its existence remained a secret for many years. It wasn't until the 1970s that the true story of the Colossus and its role in the war was declassified and made public.

Post's systems, 1943

Emil L. Post, an American mathematician and logician, publishes his work on canonical systems in 1943 while teaching at the City College of New York. These systems formalize the concept of an algorithm as simple rules for rewriting strings of symbols, in a way that is still used today. Because complex procedures can be built up from very simple steps, Post's work is significant in the development of computer science: it provides part of the foundation for formal grammars, programming languages, and software development.

# A simple example from the era's recursive function theory: composing the successor function S
# with the projection P_1^2 (which selects the first of two arguments) yields a function that adds one.

f(x, y) = S(P_1^2(x, y)) = x + 1
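
Here is the same construction as a small Python sketch of my own (ordinary functions standing in for the formal notation), confirming that composing the successor with a projection gives f(x, y) = x + 1:

# Building blocks of recursive function theory, written as ordinary Python functions
def successor(n):
    return n + 1

def projection(i):
    # P_i^k: return the i-th argument (1-indexed) and ignore the rest
    return lambda *args: args[i - 1]

def compose(g, *hs):
    # Composition: feed the outputs of h1..hm into g
    return lambda *args: g(*(h(*args) for h in hs))

f = compose(successor, projection(1))    # f(x, y) = S(P_1^2(x, y))
print(f(5, 99))                          # 6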

FUN FACTS

FF01: Post's models of computation are equivalent in power to Turing machines, which are widely considered the most general model of computation possible. That means anything that can be computed by a Turing machine can also be computed by one of Post's systems, and vice versa. Post's 1936 "worker" formulation is strikingly simple: it uses only binary symbols, an infinite sequence of boxes, and five basic operations (mark, erase, move left, move right, and check).

FF02: Emil Post influenced a generation of logicians. Martin Davis, whose work helped pave the way to the eventual solution of Hilbert's tenth problem, studied under Post at the City College of New York, and Post's results on degrees of unsolvability were developed further in celebrated joint work with Stephen Cole Kleene.

Aiken's Harvard Mark I aka IBM ASCC, 1944

Howard H. Aiken and his team of engineers from IBM complete the Harvard Mark I aka IBM ASCC in August 1944 at Harvard University, Massachusetts. The Harvard Mark I is a large electromechanical computer that can perform complex calculations, and is over 50 feet long and 8 feet high, with 78 adding machines, 3000 switches, and a 50-foot-long paper tape for output. It can perform three additions or subtractions in a second and one multiplication or division every six seconds. Its primary purpose is to solve mathematical equations for the United States Navy, and its successful completion solidifies IBM as a leader in the field of computer technology.

ASCC aka Harvard Mark I, April 1948, Harvard's Cruft Laboratory, shows the computer about 4 years after its installation. At left are the machine's 60 dial switches, used for setting up known values in various computations | Source: IBM Archives

FUN FACTS

FF01: The term "computer bug" was inspired by an actual insect. In 1947, a moth got trapped in a relay of Harvard Mark II, the successor of Mark I, and caused a malfunction. The moth was removed and taped to the logbook with the note “First actual case of bug being found.”

FF02: The Harvard Mark I's programmers included Grace Hopper, then a U.S. Navy lieutenant, who went on to become one of the most influential figures in the history of programming. Much of the mathematical work the machine took over had previously been done by hand by "human computers", many of them women employed by universities, companies, and government agencies.

von Neumann's stored-program concept, 1945

John von Neumann, a Hungarian-American mathematician and physicist, distributes his report on the EDVAC computer design in June 1945 at the Moore School of Electrical Engineering at the University of Pennsylvania (Penn), where he had been consulting on the project. The report outlines the basic architecture of the stored-program computer, which uses the same memory to store both data and instructions, enabling computers to perform complex tasks by running a series of instructions held in memory. This innovation makes programming much more efficient and paves the way for the development of modern computers. The report also divides the machine into a central arithmetic unit, a central control unit, memory, and input/output, components that remain fundamental to computers today.

# A summary of the von Neumann architecture using the original terminology and
# descriptions from John von Neumann's paper "First Draft of a Report on the EDVAC" (1945):

| Main functional unit | Function                                                       |
|----------------------|----------------------------------------------------------------|
| Central Control      | Provides overall control of the operation of the EDVAC system  |
| Arithmetic Unit      | Performs arithmetic and logical operations on data             |
| Memory Unit          | Stores instructions and data for processing                    |
| Input-Output Unit    | Transfers information between the EDVAC system and the user    |
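
The practical payoff of the stored-program idea is the fetch-decode-execute cycle. Here is a deliberately tiny Python sketch of my own (an illustration, not EDVAC's actual instruction set) of a machine whose program and data share a single memory:

# A toy stored-program machine: instructions and data live in the same memory
memory = [
    ("LOAD", 7),     # 0: copy the value at address 7 into the accumulator
    ("ADD", 8),      # 1: add the value at address 8 to the accumulator
    ("STORE", 9),    # 2: write the accumulator back to address 9
    ("HALT", None),  # 3: stop
    None, None, None,
    20,              # 7: data
    22,              # 8: data
    0,               # 9: the result will be stored here
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]        # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[9])   # 42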

FUN FACTS

FF01: John von Neumann was a child prodigy who could divide eight-digit numbers in his head at the age of six, had mastered calculus by the age of eight, and spoke multiple languages fluently. He also had a near-photographic memory and could recite entire books. He was known for his quick wit and sense of humor, as well as his ability to solve complex problems in his head.

FF02: Von Neumann had many big ideas about computers and machines. He designed and built a self-replicating machine prototype and proposed a theoretical model of a machine that could build any other machine. His ideas influenced later researchers in fields like AI and nanotechnology, and he speculated about the possibility of machine learning and evolution.

Turing's code-breaking work, 1939–1945

Alan M. Turing plays a crucial role in breaking the German Enigma code during World War II while working at Bletchley Park, a British codebreaking center in Buckinghamshire, England. The intelligence this work produces contributes to major Allied victories and is credited with shortening the war. As part of his code-breaking work, Turing develops the Bombe, an electromechanical machine that finds the daily settings of the German Enigma machine so that intercepted messages can be deciphered. Many of the techniques and principles he develops during the war still echo in today's cryptography and cybersecurity, and together with his other contributions they cement his reputation as the father of computer science.

Replica of the Bombe machine used to decrypt German Enigma messages during WWII, displayed at Bletchley Park | Source: Geograph Britain and Ireland

FUN FACTS

FF01: Alan Turing had some unusual habits. He once cycled to work wearing a gas mask to avoid hay fever, and he liked to time himself with an alarm clock tied around his waist.

FF02: Alan Turing was a world-class distance runner who almost qualified for the 1948 Olympics. He ran a marathon in 2 hours and 46 minutes in 1947, only about 11 minutes slower than the time of the Olympic gold medalist in 1948. He also regularly beat his fellow runners in races and training sessions, despite his unconventional style and lack of formal coaching.

U.S. Army's ENIAC, 1946

John W. Mauchly and J. Presper Eckert, both American electrical engineers, complete the construction of the Electronic Numerical Integrator and Computer (ENIAC) in February 1946 at the University of Pennsylvania (Penn) in Philadelphia. ENIAC is the world's first general-purpose electronic computer, designed to solve mathematical problems and used by the U.S. military to calculate artillery firing tables after WWII. Weighing 30 tons and using over 17,000 vacuum tubes, ENIAC can perform about 5,000 additions or subtractions per second. ENIAC's completion marks a significant step in the development of digital computing, paving the way for further innovation in the field.

ENIAC in BRL building 328 w/ Glen Beck (L) and Betty Holberton (R) | Source: U.S. Army / Public domain, from K. Kempf's "Historical Monograph: Electronic Computers Within the Ordnance Corps" (via Nursing Clio)

FUN FACTS

FF01: ENIAC was programmed by a team of six female mathematicians who were originally hired as "human computers" to calculate artillery firing tables. They had to learn how to operate the machine by studying its wiring diagrams and logic diagrams, as there was no formal training or documentation. They also developed techniques for debugging and optimizing the programs, such as using subroutines and nested loops.

FF02: ENIAC was so large and power-hungry that, according to a popular (though probably exaggerated) story, the lights of Philadelphia dimmed whenever it was switched on. It occupied about 1,800 square feet (167 square meters) of floor space and weighed about 30 tons (27 metric tons). It used nearly 18,000 vacuum tubes and consumed about 150 kilowatts of electricity. It also generated a lot of heat and noise, requiring a special cooling system and ear protection for the operators.

Bell Labs transistor, 1947

William B. Shockley, John Bardeen, and Walter H. Brattain, all American physicists at Bell Labs, demonstrate the first working transistor in December 1947 in Murray Hill, New Jersey. The transistor is a more reliable and efficient switch for electronic circuits compared to vacuum tubes, which were bulky, fragile, and consumed a lot of power. The invention of the transistor is a major breakthrough in electronics and computing, enabling the miniaturization of electronic devices and paving the way for the development of modern computers.

The original unit of the first working transistor, developed by Bardeen, Brattain and Shockley at Bell Labs in 1947, exhibited at Bell Labs | Source: Wikimedia Commons

FUN FACTS

FF01: The transistor was invented almost by accident, during an investigation of the surface properties of germanium; John Bardeen and Walter Brattain ended up creating a completely new technology that revolutionized electronics.

FF02: The transistor was kept secret for six months after its invention, until Bell Labs filed a patent application on June 17, 1948. The patent was granted on October 3, 1950, and assigned to Bell Telephone Laboratories.

Shannon's information theory, 1948

Claude E. Shannon, an American electrical engineer and mathematician, publishes A Mathematical Theory of Communication in the Bell System Technical Journal in two parts, in July and October 1948. The paper introduces the concept of entropy in information theory and establishes the importance of the binary digit, the bit, in the digital age, laying the foundation for modern digital communication and making possible technologies such as the internet and mobile phones.

“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”—Claude E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, 1948
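
Shannon's central quantity, entropy, fits in a one-liner. Here is a small Python sketch of my own, using his formula H = -Σ p·log2(p), comparing a fair coin with a heavily biased one:

# Shannon entropy H = -sum(p * log2(p)): the average number of bits needed per symbol
import math

def entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally unpredictable
print(entropy([0.9, 0.1]))   # ~0.469 bits: a biased coin carries less information per toss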

FUN FACTS

FF01: Shannon's information theory was inspired by his work on cryptography and code-breaking during WWII. He realized that the concepts of information and uncertainty could be mathematically defined and measured, and that they were related to the probability of different outcomes of a random variable.

FF02: Shannon's information theory was influenced by his hobby of building mechanical devices, such as a juggling machine and a chess-playing machine. He also built an electromechanical mouse named Theseus that could learn to navigate a maze.

University of Manchester's Mark I, 1949

Frederic C. Williams and Tom Kilburn, both English electrical engineers, run the first program on the Manchester Mark I in April 1949 at the University of Manchester in England, building on their prototype, the Manchester Baby, which had run the world's first stored program in June 1948. In a stored-program computer, both data and instructions are held in the same memory and manipulated by the same processor, which allows far more flexible and versatile computing than earlier machines that required manual rewiring for each new program. The Manchester machines are a landmark achievement in the history of computing, paving the way for the development of modern computers.

Working replica of the Baby, the world's first stored-program electronic computer, on display at the Museum of Science and Industry in Manchester. It was built to celebrate the 50th anniversary of the running of its first program. Demos of the machine in operation are held regularly at the museum | Source: Wikimedia Commons

FUN FACTS

FF01: A Manchester machine was among the first computers to play music. In 1951, a program written by Christopher Strachey played "God Save the King" on the Ferranti Mark 1, the commercial successor of the Manchester Mark I, and the BBC's recording of it is the earliest known recording of computer-generated music.

FF02: The first chess program was written by Dr. Dietrich Prinz for the Ferranti Mark 1, which was a commercialized version of the Manchester Mark I. The program could only solve mate-in-two problems and was demonstrated in November 1951 at Manchester University.

Turing's imitation game, 1950

Alan M. Turing, an English mathematician and computer scientist, publishes his Imitation Game via the 30-page article Computing Machinery and Intelligence in October 1950 in the journal Mind. In this substantial article, Turing proposes what is now known as the Turing Test, a test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. The paper also discusses the philosophical implications of artificial intelligence and explores the idea of machines being able to think and learn like humans. Turing's work has significant implications for artificial intelligence, as it provides a benchmark for evaluating machine intelligence and challenges the idea that human-like intelligence is unique to biological organisms.

Evan R. Wood as Dolores, the protagonist/antagonist as well as a host (artificial being) in HBO's sci-fi series 'Westworld' | Source: John P. Johnson, HBO

FUN FACTS

FF01: The Imitation Game is a critically acclaimed biographical film about the life of Alan Turing, released in late 2014, and starring Benedict Cumberbatch as Alan Turing, and Keira Knightley as Joan Clarke. The film explores Turing's personal and professional struggles, including his tragic fate as a gay man in a time when homosexuality was illegal. It has received a 90% rating on Rotten Tomatoes (based on 287 professional reviews) and a score of 71 on Metacritic (based on 49 professional reviews).

FF02: The name "Turing Test" was not coined by Turing himself, but by a computer scientist named Marvin Minsky in the 1960s. Turing's original paper referred to it as the "imitation game," and he did not use the term "Turing Test" in the paper. However, the name has stuck and is now widely used to refer to the test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human.

Almost made the podium

IBM 285 Tabulator, IBM's engineers, Jul 1933, USA:

  • It performed high-speed addition and counting on decimal numbers read from punched cards.
  • It is a precursor to the IBM 1401, one of the first commercially successful electronic computers.
  • It is an early electromechanical machine for arithmetic on punched cards.

Complex Number Calculator, George R. Stibitz, 1939–1940, Bell Labs in New York City (famously demonstrated remotely from Dartmouth College in Hanover, NH, in 1940):

  • It is the first electromechanical computer for arithmetic on complex numbers.
  • It represented decimal digits in binary-coded form, showing the feasibility of binary techniques.
  • It performed the first demonstration of remote computing, via teleprinter over telephone lines.

Radar tech, a team of scientists and engineers from various countries, 1935–1945, unspecified location:

  • It stimulated the development of digital technology and signal processing for radar applications.
  • It spawned the cavity magnetron, enabling microwave radar and communication.
  • It inspired the use of binary arithmetic and logic circuits for radar data analysis and computation.

The dawn of the programming languages, 1950s

The Manchester Mark I, developed in the late 1940s, is a pioneering computer that paves the way for the development of assembly languages. Meanwhile, the ENIAC, which emerges in the mid-1940s, is one of the first general-purpose electronic computers and lays the groundwork for high-level programming languages. These two machines, each with their unique strengths and limitations, set the stage for the future of programming.

The ENIAC coding system is developed by John von Neumann and Herman H. Goldstine in the mid-1940s at the Moore School of Electrical Engineering. It is tailored specifically to ENIAC, hence its name, and the machine is first put to work in late 1945 to calculate artillery firing tables and to test the feasibility of the thermonuclear weapon. The coding system makes programming ENIAC far more systematic, and it influences the development of Short Code, the first high-level programming language for electronic computers, as well as inspiring Fortran, the godmother of high-level programming.

! Fortran: Hello, World! program

program hello
  print *, "Hello, World!"
end program hello        

The Mark I Autocode language is developed by Tony Brooker and his team in the early 1950s at the University of Manchester. It is tailored specifically to the Manchester Mark I, hence its name, and is first used in 1952 to write a program for calculating the divisor of two numbers, one of the first significant uses of a programming language. Autocode is a simple symbolic notation, close in spirit to the assembly languages that followed, and it makes programming the Manchester Mark I much easier and faster, a breakthrough that allows more complex programs to be developed on the machine. Assembly languages were then widely adopted in early computers, and were eventually used to create other languages as well as compilers. Assembly languages are still used today, which makes them the longest-running language family to date; the fragment below gives a flavor of the assembly style (it is illustrative, not actual Autocode syntax).

LOOP, TXI OUT,1,
      TXI LOOP,0,1
OUT,  B    0,0,0,41,24,32
      B    0,0,0,4,12,01        

Of course, it's worth noting that the Manchester Mark I and the ENIAC would not have been possible without the many innovations/inventions that came before them.

Fun bonus

(MORE) FUN FACTS

FF01: While people today associate the term "computer" with smart machines, it originally referred to a person who performed calculations (rather than a machine). In fact, the term derives from the Latin verb "computare", meaning "to count". Plainly, computing has been an essential human activity for centuries, and the machines we use today are simply the latest tools in a long tradition of numerical problem-solving.

FF02: In the first half of the 20th century, "Human computers" were individuals, often women, with strong backgrounds in mathematics and science, who performed repetitive and time-consuming calculations by hand. This was an official job title used in government agencies, universities, and research institutions, and the same individuals could actually say "I am a human computer" to describe their role. They worked for essential organizations including the U.S. Army, NASA, Princeton University, and Bell Labs; some of them even contributed to important missions, such as launching the first American into space and landing humans on the moon.

FF03: The first computer programmer is generally accepted to be Ada Lovelace, a gifted mathematician who lived in London in the 19th century. She is known for her brilliant collaborative work on Babbage's Analytical Engine, a device which was never actually completed. In her notes, Lovelace devised a step-by-step method for the machine to compute Bernoulli numbers, which is considered the world's first algorithm intended to be carried out by a machine. That achievement earned her the above title, as well as recognition as a notable pioneer of computer science. Unfortunately, the Analytical Engine was never built, and her notes remained theoretical.

FF04: Before computer science was established as an academic discipline, a large share of programmers were women. However, as the field became more male-dominated, women's representation fell sharply, from roughly 45–55% in the 1960s to 30–40% by the mid-1980s. Since the early 1990s the percentage has remained below 30%, and today (in the 2020s) many companies are making an effort to achieve a more balanced gender representation, closer to the pre-1980s era.

FUN QUOTES

“The real problem is not whether machines think, but whether men do.”—Burrhus F. Skinner, an American psychologist and behaviorist, from his book Contingencies of Reinforcement: A Theoretical Analysis, published in 1969.

“The most important thing in the programming language is the name. A language will not succeed without a good name. I have recently invented a very good name and now I am looking for a suitable language.”—Donald E. Knuth, an American computer scientist and mathematician, from his article Computer Programming as an Art, published in Communications of the ACM (monthly journal) in 1974.

“Computers are incredibly fast, accurate, and stupid. Human beings are incredibly slow, inaccurate, and brilliant. Together they are powerful beyond imagination.”—Leo M. Cherne, an American economist and public servant, from his speech at the International Conference on Computer Communication in 1972.

“We shall now consider how we can design a very simple machine that will think.”—Edmund C. Berkeley, an American computer scientist and co-founder of ACM, from his book Giant Brains, or Machines That Think, published in 1949.

“The best way to predict the future is to create it.”—Peter F. Drucker, an Austrian-American management consultant and educator, from his book Management: Tasks, Responsibilities, Practices, published in 1973.

“The ultimate goal of computing is to solve problems using machines which have been programmed by other machines.”—John von Neumann, a Hungarian-American mathematician and a pioneer of computer science, from his article The General and Logical Theory of Automata, published in Cerebral Mechanisms in Behavior: The Hixon Symposium in 1951.

“Computer science is no more about computers than astronomy is about telescopes.”—Edsger W. Dijkstra, a Dutch software engineer and programming pioneer, from his book A Discipline of Programming, published in 1976.

“A computer would deserve to be called intelligent if it could deceive a human into believing that it was human.”—Alan M. Turing, an English mathematician and a pioneer of computer science, from his paper Computing Machinery and Intelligence, published in Mind in 1950.

Outro

  • Fascinatingly, all of these developments are interconnected, with each breakthrough leading to another and another, creating a domino effect that has transformed our world.
  • Admiringly, the innovators of this era were truly extraordinary, pushing the boundaries of what was thought possible and paving the way for future advancements that we continue to benefit from today.
  • Charmingly, women played a prominent role in the field of computer science during this time, but their representation has decreased since then, making it all the more important to recognize their contributions and support gender diversity in tech.
  • Curiously, computer scientists back then were essentially mathematicians working with monstrous machines to solve complex problems, with the focus on the math rather than the technology itself.
  • Amusingly, these early computers took up entire rooms and required large teams to operate, highlighting the significant resources needed to make technology work at that time.
  • Oddly, most of these computers were programmed using physical controls, as there were no high-level programming languages back then (in the first half of the 20th century), and programming was far more costly and exhausting than it is today.
  • Seriously, without the code-breaking work during WWII, we may have been living in a very different world today, with a dark alternate history that the TV series "The Man in the High Castle" offers a glimpse of.

Langley's human "computers" at work in 1947 | Source: NASA (via PICRYL)

Miraculously, we've gone from printed math to punch cards to mainframes to smartphones. And yet, as much as things change, some things remain the same. We still face coding challenges, we still spend hours debugging, and we still get that rush of satisfaction when our code finally works.

And as for me, well, I'll be over here developing (software) and sharing my thought-provoking observations. Thank you for reading, my coding comrade.

#ComputerScience #Programming #History #Innovation #Technology
