Entropy vs. Knowledge: You Do the Math
Entropy is a measure of how disordered a system is. In physics, the term is usually applied to thermodynamic systems. Such a system is made up of many particles that are individually governed by their own rules but collectively determine the behavior of the system as a whole. Physicists tie entropy to temperature, which is a proxy for the average kinetic energy, and hence the typical speed, of the particles in a system. The faster the particles move, the less ordered the system is.
Higher temperature → Faster particles → Less order
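To put rough numbers behind that arrow: for an ideal gas, the average kinetic energy per particle is (3/2)kT, so the typical speed grows with the square root of temperature. Below is a minimal sketch of that relation; the choice of helium atoms is purely illustrative.

```python
# Sketch of "higher temperature -> faster particles" for an ideal monatomic gas,
# using (3/2) k T = (1/2) m <v^2>. Helium is an assumed, illustrative particle.
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
M_HELIUM = 6.646e-27    # mass of a helium atom, kg (illustrative choice)

def rms_speed(temperature_k: float, particle_mass_kg: float = M_HELIUM) -> float:
    """Root-mean-square speed of an ideal-gas particle at a given temperature."""
    return math.sqrt(3 * K_B * temperature_k / particle_mass_kg)

for t in (100, 300, 1000):  # kelvin
    print(f"T = {t:>4} K  ->  v_rms = {rms_speed(t):6.0f} m/s")
# Speeds grow with temperature: hotter gas, faster (and less ordered) particles.
```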
Fundamentally, entropy in a closed system can never decrease. This is captured in the Second Law of Thermodynamics (SLT). The Law implies that anything doing useful work necessarily incurs an energy overhead to overcome friction and resistance, and that overhead is not redeemable. If one pushes a table several feet across the room, the floor consumes some of the effort through friction: the paths along which the legs dragged become slightly warmer. The person pushing the table spent more energy than the goal required, since the goal was to move the table, not to heat the floor. Yet there is no way to get that energy back. One cannot cool the floor to make the table move, even though doing so would not contradict any physical law except the SLT.
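A back-of-the-envelope sketch of the table example follows; the mass, friction coefficient, and distance are all assumptions picked only to show where the overhead goes.

```python
# Rough numbers for the table example; all inputs are illustrative assumptions.
MASS_KG = 40.0        # assumed table mass
MU = 0.3              # assumed coefficient of kinetic friction
G = 9.81              # gravitational acceleration, m/s^2
DISTANCE_M = 2.0      # roughly "several feet"

# Work done against friction; every joule of it ends up as heat in the floor
# and the table legs, and none of it can be recovered to move the table back.
heat_j = MU * MASS_KG * G * DISTANCE_M
print(f"Energy irreversibly turned into heat: {heat_j:.0f} J")
```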
What’s been cooked cannot be un-cooked.
Surprisingly, the ubiquitous tendency of systems to increase entropy did not get in the way of evolution. From atoms to molecules, from simpler compounds to more complicated ones, from bacteria to organisms, from tribes to societies, there has been undeniable evidence of growing order. Of course, this picture carries some survivorship bias: we only observe the current “run”, while there may have been a myriad of other, less successful “runs”, which we cannot observe because we do not exist in them. Still, of the vast ocean of all available options, only a tiny drop of possibilities can unlock the quest for the next drop in the next ocean. It is impossible to jump from atoms to bacteria while skipping molecules. Then, once there are bacteria, it is impossible to skip straight to societies. Evolution embeds order in the dependence of subsequent states on only a narrow range of preceding states.
99% stop; 1% continue → 99% stop; 1% continue
To be here, you had to beat 1-in-100 odds. Twice.
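The compounding behind that callout is easy to make explicit; the 1% pass rate per stage is the callout's illustrative number, not data.

```python
# Sequential filters compound: beating 1-in-100 odds twice is 1 in 10,000.
survival_per_stage = 0.01   # illustrative 1% continuation rate per stage
stages = 2

overall = survival_per_stage ** stages
print(f"Chance of passing both filters: {overall:.4%}")   # 0.0100%
print(f"That is 1 in {1 / overall:,.0f}")                 # 1 in 10,000
```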
Evolution is then fueled by the force of uncertainty reduction. There is not much interesting about a mere multitude of atoms. Such a system inevitably reaches high entropy, a thoroughly boring state in which atoms randomly scatter around the space to which they are confined. Every atom in this system is deprived of a chance to form something that might someday acknowledge its input. Complete freedom, maximum uncertainty, pure chaos. At the same time, if this boring system is excited into a new state with at least one molecule, an infinite number of pure-chaos states is eliminated: the atoms in a molecule now have to move together in an orderly way rather than each moving completely at random. This new state reduces uncertainty about the next state because it imposes additional structure on the system. The system, of course, is free to oscillate between states with similar levels of uncertainty and even to relax some of its structure. However, in order to evolve, a system must reach ever more complicated states, reducing uncertainty through more demanding structure.
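One way to see this uncertainty reduction in numbers is a toy counting argument. The sketch below assumes a discretized box and treats the molecule as perfectly rigid, which is of course an idealization, but it shows the direction of the effect.

```python
# Toy model of "structure reduces uncertainty": place particles on a grid of
# discrete cells and compare configuration entropy (in bits). The grid size,
# atom count, and "rigid molecule" idealization are illustrative assumptions.
import math

CELLS = 1_000_000   # number of available positions in the confined space
ATOMS = 10          # number of atoms

# Free atoms: each independently occupies any cell.
entropy_free = ATOMS * math.log2(CELLS)

# The same atoms bound into one rigid molecule: one position for the whole unit
# (internal vibrations and rotations are ignored in this toy picture).
entropy_bound = math.log2(CELLS)

print(f"Entropy of {ATOMS} free atoms:    {entropy_free:6.1f} bits")
print(f"Entropy of one bound molecule: {entropy_bound:6.1f} bits")
print(f"Uncertainty eliminated:        {entropy_free - entropy_bound:6.1f} bits")
```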
Evolved systems are aware but fragile.
In essence, knowledge is uncertainty reduction. In its broadest sense, it is not limited to humans. A robot may learn how to avoid hitting walls in a room. A cat may learn when its hosts return from work so it can be waiting in the kitchen at the right time. A bacterium may learn how to combat antibiotics. A system of disparate atoms may learn its more stable molecular configuration. Contrary to the everyday use of the word, knowledge is not a privilege of conscious (or conscious-like) beings. With that, learning is a prerequisite to evolution. Only very specific prior knowledge unlocks access to subsequent knowledge, which must also be very specific for a system to carry on learning, and carry on evolving. Learning is an inherent feature of systems simply because the absence of knowledge makes its random acquisition unavoidable, given enough time. Further, learning is generally more likely than un-learning, because the latter requires the former to have occurred in the first place.
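If knowledge is uncertainty reduction, it can be measured in the same units as entropy. Below is a minimal sketch in Shannon's terms; the robot-and-walls setup and its probabilities are invented purely for illustration.

```python
# Knowledge as uncertainty reduction, measured in bits of Shannon entropy.
# The robot's headings and the before/after probabilities are assumed values.
import math

def entropy_bits(probabilities):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Before learning: the robot considers all 8 headings equally likely to be safe.
before = [1 / 8] * 8
# After learning (say, a few bumps into walls): only 2 headings remain plausible.
after = [1 / 2, 1 / 2, 0, 0, 0, 0, 0, 0]

gain = entropy_bits(before) - entropy_bits(after)
print(f"Uncertainty before: {entropy_bits(before):.1f} bits")  # 3.0
print(f"Uncertainty after:  {entropy_bits(after):.1f} bits")   # 1.0
print(f"Knowledge gained:   {gain:.1f} bits")                  # 2.0
```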
Repulsion is destructive and conditional. Attraction is constructive and universal.
Systems progressing to more demanding structures are not a contradiction of the SLT! In obtaining knowledge, they necessarily consume more than they internalize. However, entropy, which dooms systems to chaos, has a worthy antipode in knowledge, which directs systems to evolution. Ultimately, the proof (or bust) of this claim is no further than the answer to one question:
Does a system at its current temperature hold the same amount of entropy as the same system at the same temperature, but with someone in it asking this question?