Black Swans, Thermalization, Chaos, Uncertainty
Image credit: https://sci.umanitoba.ca/physics-astronomy/can-a-physical-system-avoid-thermalization/


Terminology

Physical systems form ensembles when bonding occurs. Small ensembles combine into bigger parts, those bigger parts engage in the next level of interactions and form larger structures, and so on. In this article, by the word "scale" I mean a "structural position" across these levels. By "level" I mean almost exactly the same thing as scale, but in a more "pointer" sense: a level points at one particular rung, while a scale encompasses all the elements of a given level.

By "element" I mean a bonded group of elements from the smaller scale. All elements belonging to a given scale have a finite range of sizes (and distances between them).

Thermalization is the statistical equipartition of energy among the elements of a given scale. When we study a thermalization process, the scales above and below can safely be ignored for the purpose of modeling.
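
To make "equipartition among the elements of one scale" concrete, here is a minimal sketch in Python (my own toy model, nothing from the article): N elements repeatedly exchange energy in random pairs, the total is conserved, and every element's time-averaged energy drifts toward the same share.

```python
# Toy thermalization: N elements of one scale exchange energy in random pairs.
# Total energy is conserved; every element's time-averaged energy converges to
# the same share of the total.
import numpy as np

rng = np.random.default_rng(0)
N = 200
energy = rng.uniform(0.0, 2.0, N)        # arbitrary initial distribution
total0 = energy.sum()
time_avg = np.zeros(N)

steps = 200_000
for _ in range(steps):
    i, j = rng.choice(N, size=2, replace=False)     # pick two distinct elements
    pooled = energy[i] + energy[j]
    split = rng.uniform()                           # re-split their pooled energy at random
    energy[i], energy[j] = split * pooled, (1.0 - split) * pooled
    time_avg += energy
time_avg /= steps

print("energy conservation drift:", abs(energy.sum() - total0))
print("relative spread of time-averaged energies:", time_avg.std() / time_avg.mean())
# The spread keeps shrinking as the run gets longer: every element ends up with the
# same average energy, even though the instantaneous distribution stays broad.
```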

A statistical distribution assumes thermalization. Probabilities cannot be applied to systems that have no ability to reach thermal equilibrium. For any statistical model to work, the ergodic hypothesis must hold (otherwise your measure theory breaks the very concepts of probability and outcome), and to a programmer it sounds like nonsense: the ergodic hypothesis assumes every state is equally reachable over a long enough period of the system's evolution! Statisticians think that machine learning works because the ergodic hypothesis holds. I state the opposite: ML works because of the periodic behavior of all systems. Statistics gets applied to problems where computational models would behave much, much better, with more precision, and run faster than any recurrent neural network can afford.
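
As a hedged illustration of what the ergodic hypothesis buys you (a toy of my own, with arbitrary parameters): for an ergodic process the time average along one long trajectory matches the ensemble average over many trajectories, while for a non-ergodic process (a trajectory frozen at a random level) the two generally disagree.

```python
# Ergodic vs non-ergodic: time average along one trajectory vs ensemble average.
import numpy as np

rng = np.random.default_rng(1)

def ar1_trajectory(T):                    # ergodic example: x_{t+1} = 0.9 x_t + noise
    x, out = 0.0, np.empty(T)
    for t in range(T):
        x = 0.9 * x + rng.normal()
        out[t] = x
    return out

T, M = 50_000, 2_000
time_avg = np.mean(ar1_trajectory(T) ** 2)                              # one long run
ensemble = np.mean([ar1_trajectory(200)[-1] ** 2 for _ in range(M)])    # many short runs
print("AR(1): time average of x^2 =", round(time_avg, 2),
      "  ensemble average =", round(ensemble, 2))   # both near 1 / (1 - 0.9**2) ≈ 5.26

# Non-ergodic example: each trajectory is frozen forever at a random level.
levels = rng.normal(size=M)
print("frozen process: time average of one trajectory =", round(levels[0], 2),
      "  ensemble average =", round(np.mean(levels), 2))   # these generally disagree
```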

Recurrence sequences are the sequences of states any given scale goes through in its dynamic evolution in the absence of any special "resonant disturbances". The famous Fermi–Pasta–Ulam–Tsingou problem showed that systems, when only one level of scale is considered, exhibit periodic behavior instead of thermalizing. Technically I'm lying here, because recurrence sequences only occur in closed systems. But in order to understand the world around us, this model is required: all computer programs can be set to run on finite domains (input arguments), without interactivity, so a closed system in the sense of non-interacting levels of scale is, in my opinion, a reasonable abstraction.
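
Here is a minimal sketch of the Fermi–Pasta–Ulam–Tsingou experiment (the α-model; the parameters are my own reasonable choices, not taken from the original report). All the energy starts in the lowest normal mode of a weakly nonlinear chain; instead of spreading evenly over all 32 modes, most of it keeps flowing back into the first mode, which is the recurrence I'm talking about.

```python
# FPUT alpha-chain: u_i'' = (u_{i+1} - 2 u_i + u_{i-1})
#                          + alpha * ((u_{i+1} - u_i)**2 - (u_i - u_{i-1})**2),
# fixed ends, integrated with velocity Verlet; energy starts in mode 1 only.
import numpy as np

N, alpha, dt, steps = 32, 0.25, 0.1, 300_000
idx = np.arange(1, N + 1)
modes = np.sqrt(2.0 / (N + 1)) * np.sin(np.outer(idx, idx) * np.pi / (N + 1))  # sine modes
omega = 2.0 * np.sin(idx * np.pi / (2 * (N + 1)))                              # mode frequencies

u = np.sin(idx * np.pi / (N + 1))    # classic initial condition: lowest mode only
v = np.zeros(N)

def accel(u):
    d = np.diff(np.concatenate(([0.0], u, [0.0])))        # spring extensions, fixed ends
    return (d[1:] - d[:-1]) + alpha * (d[1:]**2 - d[:-1]**2)

def mode1_fraction(u, v):
    Q, P = modes.T @ u, modes.T @ v                        # project onto normal modes
    E = 0.5 * (P**2 + (omega * Q)**2)
    return E[0] / E.sum()

a = accel(u)
for step in range(steps + 1):
    if step % 20_000 == 0:
        print(f"t = {step * dt:7.0f}   energy fraction in mode 1 = {mode1_fraction(u, v):.3f}")
    u += v * dt + 0.5 * a * dt * dt
    a_new = accel(u)
    v += 0.5 * (a + a_new) * dt
    a = a_new
# The fraction dips as energy leaks into a few low modes, then climbs back close
# to 1 (the recurrence) instead of settling near 1/32, which equipartition would predict.
```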

A resonant disturbance (a bifurcation point, in chaos theory) is a special condition that allows an interaction between six causality waves to happen in the same region, where any non-zero coupling between the six waves (even a very weak one) suffices. In physics, quantum fields are the medium that propagates those waves. Because we have multiple coupled fields (and we seem not to know all six yet: EM, weak, Higgs, gluons, gravity; I'm definitely missing one mysterious "dark energy" here to "make the system work"), this is the core effect that makes thermalization happen at the smallest, "quantum", scale, giving rise to all the Alice-and-Bob teleportation fun and to quantum fluctuations. You can read this great paper to understand "why six": https://www.pnas.org/content/pnas/112/14/4208.full.pdf
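
Setting the field-theory specifics aside, the core mechanical claim, that at resonance even an arbitrarily weak coupling lets two oscillating degrees of freedom exchange energy completely, can be checked with a toy model (my own sketch, not from the cited paper): two harmonic oscillators joined by a tiny spring swap all of their energy when their frequencies match, and almost none when they are detuned.

```python
# Two oscillators x1, x2 coupled by a weak spring k:
#   x1'' = -w1^2 x1 + k (x2 - x1),   x2'' = -w2^2 x2 + k (x1 - x2)
import numpy as np

def max_energy_transfer(w1, w2, k=0.01, dt=0.02, t_max=1000.0):
    x = np.array([1.0, 0.0])                 # all initial energy in oscillator 1
    v = np.array([0.0, 0.0])
    wsq = np.array([w1**2, w2**2])
    accel = lambda x: -wsq * x + k * (x[::-1] - x)
    a, best = accel(x), 0.0
    for _ in range(int(t_max / dt)):         # velocity Verlet
        x += v * dt + 0.5 * a * dt * dt
        a_new = accel(x)
        v += 0.5 * (a + a_new) * dt
        a = a_new
        e = 0.5 * v**2 + 0.5 * wsq * x**2    # per-oscillator energies (tiny coupling term ignored)
        best = max(best, e[1] / e.sum())
    return best

print("resonant, w1 = w2 = 1.0 :", round(max_energy_transfer(1.0, 1.0), 3))  # ~1.0, full transfer
print("detuned,  w2 = 1.3      :", round(max_energy_transfer(1.0, 1.3), 3))  # stays near 0
```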

Chaos is a significant divergence of system states from the original recurrence sequence to another recurrence sequence (initially unknown to observers), without thermalization. Many systems depend so heavily on initial conditions that they evolve into very different recurrence sequences. People misapply the "butterfly effect" to chaos and then wonder why it doesn't work, because the main point to pay attention to is which variables those initial conditions describe: they belong to elements of only one level of scale. The butterfly is on a different level than the clouds and the Sun. The Black Swan of Nassim Nicholas Taleb is this non-thermalizing impact from another level of scale.
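
A standard way to see this divergence is to run one deterministic rule from two nearly identical initial conditions (a sketch using the logistic map, my choice of example): the trajectories stay together for a few dozen steps and then end up on completely unrelated sequences.

```python
# Sensitive dependence on initial conditions: logistic map x -> r x (1 - x), r = 4.
r = 4.0
a, b = 0.4, 0.4 + 1e-12          # two initial conditions differing by 1e-12
for step in range(61):
    if step % 10 == 0:
        print(f"step {step:2d}:  a = {a:.6f}   b = {b:.6f}   |a - b| = {abs(a - b):.2e}")
    a = r * a * (1.0 - a)
    b = r * b * (1.0 - b)
# The gap roughly doubles each iteration (Lyapunov exponent ln 2 for r = 4), so after
# about 40 steps the two trajectories are no longer related in any useful way.
```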

The Two Important "Paradoxes" I See

To occur, thermalization requires a locally compact region, because the typical elements of one scale have finite sizes and finite distances between them, and finite sizes and distances define a finite volume, a compact region. Thermalization never happens across levels; complex multi-scale systems always follow recurrence sequences. On the other hand, thermalization does require the law of large numbers, the existence of very many elements all of very similar size. As a side note, this constrains the smallest volume in which thermalization can occur. It is a surprising second, emergent cause of scale (beyond the well-known structurally emergent one, like bolts and nuts, and as opposed to the third, non-emergent cause of scale, like gravity vs. EM).
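
The law-of-large-numbers requirement can be made quantitative with a small numeric check (my own illustration, assuming Boltzmann-like, exponentially distributed element energies): the relative fluctuation of a region's total energy shrinks like 1/sqrt(N), so a sharply defined energy per element, a temperature, needs many elements and therefore a minimum volume.

```python
# Relative energy fluctuation of a region holding N elements scales like 1 / sqrt(N).
import numpy as np

rng = np.random.default_rng(42)
trials = 1_000
for N in (10, 100, 1_000, 10_000):
    totals = rng.exponential(scale=1.0, size=(trials, N)).sum(axis=1)   # region energies
    rel = totals.std() / totals.mean()
    print(f"N = {N:6d}   relative fluctuation = {rel:.4f}   1/sqrt(N) = {1/np.sqrt(N):.4f}")
# Only with many elements does the energy per element become sharply defined,
# i.e. only then does "temperature" make sense for the region.
```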

Another surprise is that thermalization and chaos have the same origin: a resonant impact from another level of scale, typically from the level above, when there is enough ensemble kinetic energy (like a projectile), or from a compact level below, where a significant amount of stored energy (chemical, nuclear, a spring, a spinning mass) unfolds in a resonant way, at the right time and place, to interact with the other level.
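
As a toy picture of an impact from another level arriving "at the right time and place" (a sketch with made-up parameters): the same weak periodic push pumps energy into an oscillator without bound when its frequency matches the oscillator's own, and does almost nothing when it does not.

```python
# One "level" (an oscillator with natural frequency w0) pushed by a weak periodic
# drive from "another level"; compare resonant vs off-resonant driving.
import numpy as np

def final_energy(drive_freq, w0=1.0, F=0.01, dt=0.01, t_max=500.0):
    x, v, t = 0.0, 0.0, 0.0
    a = F                                          # acceleration at t = 0 (x = 0)
    for _ in range(int(t_max / dt)):               # velocity Verlet
        x += v * dt + 0.5 * a * dt * dt
        t += dt
        a_new = -w0**2 * x + F * np.cos(drive_freq * t)
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return 0.5 * v**2 + 0.5 * w0**2 * x**2

print("resonant drive,  freq = 1.0 :", round(final_energy(1.0), 4))   # grows with t_max
print("off-resonant,    freq = 1.7 :", round(final_energy(1.7), 6))   # stays tiny
```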

What do you think about complexity, emergence, statistics, certainty, probabilities, chaos, computational models?

Brian Greenforest


Mark Jackson thought you might find this interesting :-)

Brian Greenforest


Thad Roberts and Wyatt Flanders, you two are the best people I know at understanding inter-level complexity interactions, and you study and research (in depth!) fluid dynamics and emergence. I don't mean to "spam" you, but to open a discussion on this very interesting topic of predictability, complexity, and computability, especially in the age of big data, where neural networks run very inefficient recurrence-sequence detection instead of running algorithmic simulations at a much larger scale, given the same amount of compute resources. In the attempt to mimic the human brain, not everything should be duplicated, in my opinion. Algorithms and statistical models complement each other, like Yin and Yang.
