Infinite growth in infinite complexity
Ivan Voras
PhD comp.eng. / passionate for R&D / large-scale local AI (LLMs) for data pipelines / Host, Surove Strasti podcast / CTO for hire
As a young engineer, I had a comfortable bubble of friends and colleagues in which we collectively scoffed at the economists telling the story of growth without ever addressing the endgame: "when does it stop?" I assume many of you have encountered the idea of "infinite growth" from one point of view or another.
What I'm writing here is not rocket science, but a way of getting a thing off my mind and maybe as a small apology to the budding economists I may or may not have insulted at that time :) In retrospect, the article meanders a bit, but if you stick with it, it actually gets to a point.
So, sure, obviously the planet we live on is finite, and if we as a species live to see it, the universe probably is too (although so big it barely matters). Just extracting stuff from the environment, using it, and discarding the waste products makes us not that much different from amoebas, and will soon-ish get us into a dead end. We have started to step beyond what amoebas do by recycling, but ultimately it doesn't matter - the physical world is limited.
But you know what is significantly less limited? The number of ways (combinations) in which we can arrange a piece of that physical world.
A Brief Intro to the Combinatorial Explosion
The question is: in how many ways can we arrange (or "combine") N distinct pieces of something? There are usually some constraints involved so that such arrangements "make sense" on some level, but taken at face value, the answer is the factorial of N, a function which very quickly grows beyond human comprehension. The factorial of 100 (written as "100!" in mathematical notation) is approximately 9.33 times 10 to the power of 157, "a number so large that it cannot be displayed on most calculators, and vastly larger than the estimated number of fundamental particles in the observable universe".
Luckily, there are usually "it must make sense" constraints on the actual arrangements, so the number is usually lower. You've probably seen that, for example, in communication and organisational theory, we can ask how many lines of communication can be established between N entities (people, groups, organisations, etc.), and the answer is N × (N − 1) / 2.
That function grows much more slowly than the factorial, and for N=100 the result is 4950. It's also one of the ways projects die: if everyone talks to everyone, the number of interpretations of ideas and goals multiplies beyond our practical ability to handle it. It's the reason why we set up hub-and-spoke arrangements with leaders/managers filtering and routing information, and why back-channel communication can harm projects.
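As a quick illustration, both growth rates can be checked directly. A small Python sketch (the numbers match the ones above):

```python
from math import factorial

# Orderings of 100 distinct items: 100! is approximately 9.33 * 10**157,
# a number with 158 decimal digits.
orderings = factorial(100)
print(len(str(orderings)))  # 158

# Lines of communication between N entities: N * (N - 1) / 2.
def communication_lines(n: int) -> int:
    return n * (n - 1) // 2

print(communication_lines(100))  # 4950
```

The gap between the two is the whole story: the factorial explodes, while the pairwise-connection count grows only quadratically, and even that is enough to overwhelm a project.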
On the other hand, combinatorial explosion is what makes data encryption work: the space of possible keys is far too large to ever search exhaustively. But that's a more complicated topic for another article.
Complexity and Creativity
Imagine a digital image of 100x100 pixels. It's obviously a very small image, but that's still 10,000 pixels, each of which can (usually) hold one of 16,777,216 colors. This means the number of unique images that can exist with those dimensions is 16,777,216 to the power of 10,000, which is roughly 10 to the power of 72,247, a number with over 72,000 digits. Only a vanishingly small fraction of them "make sense."

It's hard to say how many 100x100 images that "make sense" exist, especially since a lot of them will just be small variations of others. Given a Mona Lisa image in 100x100 pixels, no one will notice if a single pixel in one of her eyes is slightly darker than in the original, or even if the whole image becomes 0.1% more red.

To store all of those unique images in their raw form, at 30,000 bytes each, we would need a number of bytes that itself has over 72,000 digits. For comparison, the estimated number of fundamental particles in the observable universe has about 80 digits, and the whole planet creates roughly 3,500 petabytes of data per day. And that's for a tiny 100x100 image: for 1000x1000 images, the count of digits grows a hundredfold, to over 7 million. No one is going to store that data, ever.
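The arithmetic above is easy to verify. A Python sketch (digit counts are computed from logarithms, since the numbers themselves are far too large to materialise):

```python
import math

# A 100x100 image: 10,000 pixels, each holding one of 2**24 colors.
pixels = 100 * 100
bits = 24 * pixels  # 240,000 bits fully describe one image

# Number of distinct images = 2**240,000; count its decimal digits.
image_digits = int(bits * math.log10(2)) + 1
print(image_digits)  # 72248

# The same count for 1000x1000 images: 24,000,000 bits per image.
big_digits = int(24 * 1000 * 1000 * math.log10(2)) + 1
print(big_digits)  # 7224720
```

Multiplying by 30,000 bytes per raw image only adds five more digits, which is why the byte count and the image count are, for all practical purposes, the same kind of impossible.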
Getting back to the 100x100 image, I want to draw attention to these facts:
There is absolutely no need to store all of those images, since we can request e.g. "image #772,160,000" and have it generated on demand. But generating it takes time, so we're trading time for space: spending computation to save storage. I'm sure Einstein would have been fascinated by the concept.
What's preventing us from using this method of referring to images is that this single number doesn't tell us anything about what's in the image. We'd need a database which says something like: "Image #76545: a blue frog sitting on a yellow maple leaf which has fallen onto a patch of grass, and to the left of it is an orange and to the right of it is the photo of Marilyn Manson from 1999."
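The "image as a single number" idea is concrete enough to sketch: treat the index as a number written in base 16,777,216, where each "digit" is one pixel's color. A minimal Python sketch (the 4x4 size and the example index are arbitrary illustrations):

```python
def image_from_index(index: int, width: int, height: int) -> list[list[int]]:
    """Decode an index into a grid of 24-bit pixel values.

    The index is read as a number in base 2**24: each 'digit' is one
    pixel's color. Index 0 is the all-black image.
    """
    base = 2 ** 24
    flat = []
    for _ in range(width * height):
        flat.append(index % base)  # extract the next pixel's color
        index //= base
    return [flat[row * width:(row + 1) * width] for row in range(height)]

def index_from_image(grid: list[list[int]]) -> int:
    """Inverse: recover the unique index of a pixel grid."""
    base = 2 ** 24
    flat = [pixel for row in grid for pixel in row]
    return sum(color * base ** i for i, color in enumerate(flat))

# Round trip on an arbitrary index for a tiny 4x4 image:
img = image_from_index(123_456_789_012, 4, 4)
assert index_from_image(img) == 123_456_789_012
```

Every possible image has exactly one index and vice versa; what the number cannot do, as noted above, is tell you anything about the image's content.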
Where does that leave us?
The Point
The final frontier might not be space, and it probably isn't time. Complexity is about what we can do with finite resources, and that is the domain of growth in the digital age. It's not about how many computers we can build, or the size of the industrial base (as in the "hard", material industries created by the previous industrial revolution), but about the content, the ideas, the software, and the services we can create with them.
The downside is that this kind of productivity doesn't correlate as well with our completely physical and organic needs, like the need to eat and pay bills. No matter how many cute cat images you draw (or use AI to draw), it probably won't feed your family.
Resolving that is what will actually usher in the next industrial revolution, not bean-counting compute power and network bandwidth.
I'm fascinated with the topic of complexity, of the permutations of finite resources. Topics like Kolmogorov complexity, the relationship between AI and compression, and Metcalfe's law are just the sort of thing I tend to think about for fun.