Dancing on the Edge of Infinity
Mathematically, math is a lie. In 1931 the mathematician Kurt Gödel showed that math depends on itself to prove itself: no consistent formal system can prove its own consistency. Almost 100 years later, almost everyone agrees with him. Math has a foundation of circular reasoning. It’s like going to a dictionary to find the definition of a dictionary. It is what it says it is, because it says what it is.
All languages are fundamentally abstractions of reality. What makes them useful is how much they help us understand reality. Scientists use models to describe the world and test what works; we all use words as mini-models of the world around us.
I grew up absorbed in the idea of absolute truth. Time and time again I was told that truth is not relative, because reality is absolute (probably because the people telling me this wanted to control me). What it took me decades to learn was that while the world is not relative, the language we use to describe the world is. To borrow an analogy from Alan Watts: the sound a bell makes when you ring it is very different from the words “a bell that is ringing.”
A dictionary isn’t what a dictionary says it is. The reality behind a dictionary is all the people who use it to share experiences together. The patterns of word use around the dictionary make the definitions in the dictionary meaningful. Computers are powerful because they allow us to see those patterns. Those patterns have always been there, but until we created books with enough memory (i.e. computers), we couldn’t really explore them (sometime ask me about agent-based modeling).
Part of the reason why I love learning about computers and artificial intelligence is how they teach me about my own thoughts. Historically, I suppose humans have always done this through children. Watching children learn helps us know what learning is. Developing artificial intelligence teaches us something similar, although not quite as complex. Parents are still the superior intelligence engineers.
In artificial intelligence, one of the mistakes a model can make is “overfitting”: memorizing its training data so perfectly that it fails on anything new. I became so enamored with the idea that I put it on my laptop as a constant reminder. I talk about it any chance I get, including randomly at the bar while working on last week’s article (see cover photo).
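For the curious, here is a minimal sketch of what overfitting looks like in practice. The scenario is my own invented example, not anything from a real project: an overly flexible model (a degree-7 polynomial) memorizes eight noisy observations of a simple straight line, then stumbles on fresh data from that same line.

```python
# A minimal overfitting sketch (hypothetical example), using only numpy.
import numpy as np

rng = np.random.default_rng(0)

# "Reality": a simple straight line, observed with a little noise.
x_train = np.linspace(0, 1, 8)
y_train = 2 * x_train + rng.normal(0, 0.1, size=x_train.size)

# A degree-7 polynomial has enough freedom to pass through all 8 points,
# noise and all -- it memorizes the data rather than learning the line.
coeffs = np.polyfit(x_train, y_train, deg=7)
train_error = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)

# On unseen points from the same underlying line, the memorized
# wiggles no longer help; the error jumps.
x_new = np.linspace(0, 1, 100)
y_new = 2 * x_new
new_error = np.mean((np.polyval(coeffs, x_new) - y_new) ** 2)

print(f"error on memorized points: {train_error:.6f}")
print(f"error on unseen points:    {new_error:.6f}")
```

The training error is essentially zero while the error on new points is not, which is the whole trap: perfect agreement with what you have already seen, at the cost of describing what comes next.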
Moving from an “absolute” understanding of truth to a “relative” understanding keeps us from overfitting our words onto reality. It keeps us from getting trapped into thinking words mean something magical, rather than the meaning we give them as a society, relative to our shared experiences.
It’s not a loss of meaning, or a crisis of purpose. It’s an expansion of value as we describe our world. The internet provides a great example: how different is it to read a webpage with hyperlinks from reading a book? It’s easy to click a hyperlink and understand how some other concept is related to what you’re reading about. If you’re not hyped about relative truth, you should be.
I recently encountered a beautiful vignette as another example while driving. “I’m Good (Blue)” by David Guetta played on the radio. Over a groovy beat we all recognize from the mildly depressing ’90s hit, the next generation celebrates the adaptability that underpins mental health. We celebrate relative truth when we celebrate music; that’s what changing the words is all about.
No matter where I go, it's a good time, yeah
And I, I don't need to sit in VIP
Middle of the floor, that's where I'll be
Don't got a lot, but that's enough for me, yeah
- I'm Good (Blue) by David Guetta and Bebe Rexha
Relative truth keeps us on the edge of what is, and what could be. It keeps us moving into that space of deep discovery, where our current words might not be able to describe the new reality. It’s a posture of learning, embodied deep in our language. It opens up a new wealth of discovery with every conversation. There is an equation for how that looks mathematically, but I prefer poetry: embracing relative truth is like dancing on the edge of infinity.
**** The content and frivolity expressed in this article are my personal perspective; they do not reflect a position of any of the incredible organizations I work with.