What is information?

Part 1

As I dive more into the quantum realm, a question that never ceases to occupy my mind is the one that serves as the title for this article: What is information?

First, let's assume (of course, there is always an assumption) that language is the first implicit agreement that we make.

Generally, I like to search for answers in mythological landscapes. It's not scientific at all, but it amuses me (again, not scientific).

The Bhagavad Gita, a 700-verse Hindu scripture that is part of the Indian epic Mahabharata, primarily focuses on spiritual and philosophical teachings, the nature of the self (atman), and the path to spiritual realization. It addresses fundamental questions about life, duty (dharma), morality, and the nature of reality.

Krishna instructs Arjuna. Credits: krishnatemple.com

Beyond its spiritual guidance, the Bhagavad Gita contains profound philosophical and metaphysical teachings that touch upon the nature of existence and consciousness. It explores the eternal soul (atman), the transient nature of the physical body, and the interconnectedness of all beings, ideas aligned with ancient Indian philosophies, particularly Vedanta and Samkhya. While the Bhagavad Gita does not directly address the contemporary concept of information as understood in modern science and technology, it offers valuable insights into the nature of reality, consciousness, and the self that carry philosophical and metaphysical implications.

The concept that "information is physical" has deep implications, bridging the realms of information theory, physics, and computation. I would like to explore this profound idea with you by drawing on the contributions and insights of several key figures in mathematics, computer science, and physics, including both male and female pioneers who have left a lasting impact: Ada Lovelace, George Boole, Grace Hopper, Claude Shannon, John von Neumann, Alan Turing, Kurt Gödel, Barbara Liskov, Donna Strickland, Shafi Goldwasser, Fei-Fei Li, Cynthia Breazeal, Jennifer Tour Chayes, Yuri Manin, Charles Bennett, Richard Feynman, Edward Fredkin and Rolf Landauer.


If you think someone important is missing from this list, drop that name in the comments. I would like to know more about these key people. Note that these individuals made contributions over a span of several decades, and their work continues to influence various fields of science and technology.


Let's dive into some of these personalities and their key contributions:

George Boole: The Algebraic Nature of Information

George Boole's work in symbolic logic and Boolean algebra introduced the idea that logical operations and information processing could be expressed through mathematical equations. By using binary digits (0's and 1's), Boole established a connection between logical statements and algebraic expressions. This bridged the gap between abstract concepts and tangible mathematical structures, demonstrating that information manipulation had a concrete, algebraic foundation.
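To make this concrete, here is a minimal sketch (in Python, and my own illustration rather than anything from Boole's text) of how the basic logical operations can be written as ordinary arithmetic over 0s and 1s, and how an algebraic identity such as De Morgan's law can then be verified by exhaustive checking:

```python
# Boolean algebra over {0, 1}: logical operations as arithmetic expressions.
def AND(a, b): return a * b            # conjunction as multiplication
def OR(a, b):  return a + b - a * b    # disjunction (inclusive or)
def NOT(a):    return 1 - a            # negation as complement

# De Morgan's law: NOT(a AND b) == NOT(a) OR NOT(b), for every pair of bits.
for a in (0, 1):
    for b in (0, 1):
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
```

Because the whole domain is just {0, 1}, a "proof" of an identity reduces to checking four cases, which is exactly the spirit of treating logic as algebra.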


Boole: the creator of the algebra of logic. RBA

Boole's algebraic approach became instrumental in the design of digital circuits and computer architecture. It demonstrated that even the most abstract concepts of information and logic could be represented and processed using physical components. His masterpiece The Laws of Thought founded the discipline of algebraic logic.


George Boole. Credits: The Independent

“That language is an instrument of human reason, and not merely a medium for the expression of thought, is a truth generally admitted.” George Boole - The Laws of Thought


Claude Shannon: Quantifying Information

Claude Shannon's contributions to information theory went beyond the mere representation of information. He quantified information in a precise and measurable way. Shannon's groundbreaking work introduced the concept of entropy, borrowed from thermodynamics, to describe the uncertainty or randomness in information. He demonstrated that information had a physical aspect, as it could be quantified in terms of bits and related to probabilistic events.

Shannon's entropy concept not only revolutionized communication systems but also highlighted the connection between information and physical phenomena like noise and data transmission. It became evident that information processing involved real-world considerations such as signal degradation and energy consumption. Wait! Did I write energy consumption? What is happening here?
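Shannon's entropy has a simple closed form, H = −Σ p·log₂(p), measured in bits. A minimal sketch in Python (the function name is my own choice, not standard library API):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 -- a fair coin carries exactly one bit
print(shannon_entropy([0.9, 0.1]))   # ~0.469 -- a biased coin is less uncertain
print(shannon_entropy([1.0]))        # 0.0 -- a certain outcome carries no information
```

The biased coin illustrates Shannon's point: the less surprising a source is, the less information each outcome conveys, and the fewer bits are needed on average to transmit it.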


Claude Shannon. Credits: Wikipedia


"Information is the resolution of uncertainty." Claude Shannon


John von Neumann: Computational Processes as Physical Processes

John von Neumann's contributions to computer science extended the understanding of computation as a physical process. He introduced the idea of a stored-program computer, where instructions and data were stored in the same memory, enabling the computer to manipulate its program. This concept emphasized that computations were not merely abstract operations but were carried out by the physical components of the computer.

John von Neumann. Credits: Wikipedia

Von Neumann's architecture showcased the intimate connection between the physical hardware and the execution of algorithms. It made clear that even the most complex calculations were ultimately rooted in the behavior of physical systems, from transistors to memory cells. Furthermore, von Neumann's name is attached to an entropy concept within quantum information, the von Neumann entropy, which extends the Gibbs entropy to quantum systems. Entanglement measures are based upon quantities directly related to the von Neumann entropy [3].
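The von Neumann entropy of a density matrix ρ is S(ρ) = −Tr(ρ log₂ ρ), which reduces to the Shannon entropy of ρ's eigenvalues. A small numerical sketch (using NumPy; the two example states are standard textbook cases, and the function name is mine):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)         # rho is Hermitian
    eigvals = eigvals[eigvals > 1e-12]        # discard numerical zeros
    return float(-np.sum(eigvals * np.log2(eigvals)))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: entropy is zero
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit: one full bit
print(von_neumann_entropy(pure))
print(von_neumann_entropy(mixed))
```

A pure state is perfectly known, so its entropy vanishes; the maximally mixed qubit is the quantum analogue of a fair coin and carries exactly one bit of uncertainty.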


Credits: Sam Walters Twitter/X

“It is only proper to realize that language is largely a historical accident.” - John von Neumann, The Computer and the Brain


Alan Turing: Algorithmic Universality

Alan Turing's work on the Turing machine demonstrated that computation could be reduced to a simple, abstract model. His theoretical machine, capable of performing any conceivable computation, emphasized that all computational processes could be broken down into a series of discrete, mechanical steps. This universality of computation reinforced the idea that information processing had a concrete, algorithmic basis.
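As an illustration, here is a toy sketch of the idea: a hypothetical single-state machine that inverts a binary tape and halts. The rule format and all names are my own, and this is a deliberately minimal model, not a faithful reproduction of Turing's 1936 formalism:

```python
# A minimal Turing machine sketch: rules map (state, symbol) to
# (new_symbol, head_move, new_state); reaching state "halt" stops the run.
def run_turing_machine(tape, rules, state="start", head=0):
    tape = list(tape)
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"   # "_" marks blank cells
        new_symbol, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape)

# One state suffices to invert a binary string: flip each cell, move right, halt on blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("10110", invert))  # 01001
```

The point of the model is not this particular machine but the shape of the loop: every computation, however complex, decomposes into the same discrete read-write-move steps.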


Alan Turing. Credits: Britannica

Turing's work highlighted that information manipulation was not just a mathematical abstraction but a process that could be mechanized and realized physically. This insight paved the way for the development of modern computers and underscored the connection between computation and the physical execution of algorithms.

“The original question, 'Can machines think?' I believe to be too meaningless to deserve discussion.” ― Alan Turing, Mechanical Intelligence: Collected Works of A.M. Turing


Kurt Gödel: Limits of Formal Systems

Kurt Gödel's incompleteness theorems had a profound impact on the understanding of formal mathematical systems. His theorems showed that there were limits to what could be proven within such systems, revealing the inherent limitations of information processing in purely formal contexts. In other words, Gödel's work suggested that there were mathematical truths that could not be derived through computational means alone.

Gödel's theorems highlighted that information processing, even in the context of rigorous formal systems, had fundamental boundaries. This concept challenged the notion that information processing could be entirely divorced from the physical limitations of the systems in which it occurred.


Credits: Princeton Advanced Studies Institute

"Either mathematics is too big for the human mind or the human mind is more than a machine." Kurt Gödel


Richard Feynman: Quantum Computing and Information

Richard Feynman's exploration of quantum computing opened up a new frontier in understanding the physical nature of information. He recognized that quantum mechanics offered a radically different way to process information. In the quantum realm, particles could exist in superposition of states, enabling the potential for quantum computers to perform certain calculations exponentially faster than classical computers.

Feynman's insights suggested that information processing wasn't limited to classical binary systems but extended into the quantum realm, where the physical properties of particles played a critical role in computation. This further emphasized the deep connection between information and the physical world, even at the quantum scale.
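A minimal sketch of what superposition means for information, using plain linear algebra (NumPy) rather than a quantum computing library: the Hadamard gate takes a qubit prepared in |0⟩ to an equal superposition of |0⟩ and |1⟩, and the Born rule turns the amplitudes into measurement probabilities:

```python
import numpy as np

# The Hadamard gate sends |0> to (|0> + |1>) / sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])      # the classical bit 0 as a state vector

psi = H @ ket0                   # superposition of |0> and |1>
probs = np.abs(psi) ** 2         # Born rule: probabilities of measuring 0 or 1
print(probs)                     # approximately [0.5, 0.5]
```

Unlike a classical bit, the qubit holds both amplitudes at once until measurement, which is precisely the physical resource Feynman proposed to exploit for simulating nature.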


Feynman at FermiLab - Credits: Wikipedia

“Nature isn’t classical, dammit, and if you want to make a simulation of Nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem because it doesn’t look so easy.” - Richard Feynman (1981)


Yuri Manin: Arithmetic Information Theory

Yuri Manin's work in arithmetic information theory delves into the fundamental relationship between arithmetic, algebraic geometry, and information processing. Manin's research explores how mathematical structures encode and manipulate information. This connection between abstract mathematical concepts and the physical encoding of information highlights that even in pure mathematics, information is an integral part of the underlying structures.

Manin's work serves as a reminder that the physical nature of information extends beyond the realm of classical computing and quantum mechanics. It underscores the presence of information-related concepts in the very fabric of mathematical structures and how they interact with the physical world.


Yuri Manin at the Conference on Differential Geometry and Global Analysis in Garwitz, Germany, in 1981. (Photo courtesy of the Archives of the Mathematisches Forschungsinstitut Oberwolfach)

"Most likely, logic is capable of justifying mathematics to no greater extent than biology is capable of justifying life." Yuri Manin


Charles Bennett: Quantum Cryptography and Information

Charles Bennett, a physicist and computer scientist, made significant contributions to quantum information theory. His work on quantum cryptography and quantum computing emphasized the physical nature of information in quantum systems. Bennett's introduction of quantum key distribution demonstrated that information security could be tied to the fundamental principles of quantum mechanics, further deepening the understanding of how information interacts with the physical world.
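As a toy illustration, here is a classical simulation of the sifting step of BB84, the quantum key distribution protocol Bennett co-invented with Gilles Brassard. The function name and parameters are my own, and no actual quantum mechanics is simulated, only the basis-matching bookkeeping that distills a shared key:

```python
import random

def bb84_sift(n, seed=42):
    """Toy BB84 sifting: keep only the bits where Alice's and Bob's bases agree."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]   # Alice's random raw bits
    alice_bases = [rng.choice("ZX") for _ in range(n)]    # her random encoding bases
    bob_bases   = [rng.choice("ZX") for _ in range(n)]    # Bob's random measurement bases
    # When the bases match, quantum mechanics guarantees Bob's measurement
    # reproduces Alice's bit; when they differ, the position is discarded.
    return [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]

key = bb84_sift(16)
print(len(key), key)   # roughly half the positions survive sifting
```

The security argument, which this sketch omits, is what ties the scheme to physics: an eavesdropper measuring in the wrong basis unavoidably disturbs the qubits, and that disturbance shows up as detectable errors in the sifted key.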


Charles Bennett - Credits: Wikipedia

"A true scientist appreciates being proven wrong." Charles Bennett


The insights of these key figures collectively underscore the deep and intricate relationship between information and the physical world. From algebraic foundations to quantum phenomena, the limits of formal systems, and the potential digital nature of the universe itself, these thinkers have illuminated the notion that information is not an abstract concept but a tangible, physical entity deeply rooted in the fabric of the universe. The exploration of this concept continues to shape our understanding of the world and our place within it.

To be continued...

If you think someone important is missing from this list, drop that name in the comments. Let's debate.


