Reality Redefined: The Pivotal Role of Smartphones in AR and AI Innovations
Leonardo Nunes Ricucci
Supporting Startups in the Alt-Protein Sector | Economist | Bio-entrepreneur | Writer
So Apple announced another iPhone to be released soon, and to be honest, it would have gone completely under my radar: not only have I never owned an iPhone, but most of the time these releases are just incremental improvements on the last one, and I think we are still a couple of years away from the next leapfrog in truly innovative consumer technology. But then a very close friend of mine sent me this small piece of data about the newest iPhone: “The latest iPhone will have a 3.2-teraflop processor. ASCI Red was the first system ever to break through the 1-teraflop barrier on the MP-Linpack benchmark, back in 1996, eventually reaching 2 teraflops. The new iPhone is more powerful than the most powerful supercomputer of the 20th century.” (And this is why I love my nerdy friends <3. Also, if your friends don’t come to you with the technical specs of 90s supercomputers, do you even have real friends?)
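Just for fun, here is the back-of-the-envelope version of that comparison, using the figures my friend quoted (his numbers, not official Apple specs):

```python
# Rough comparison, using the figures quoted above (not official Apple specs).
iphone_tflops = 3.2     # claimed throughput of the new iPhone's chip
asci_red_tflops = 2.0   # ASCI Red after its late-90s upgrades (first past 1 TFLOP in 1996)

ratio = iphone_tflops / asci_red_tflops
print(f"The phone in your pocket ~ {ratio:.1f}x the fastest supercomputer of the 20th century")
```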
And it’s funny how smartphones are gearing up for AI cores and AR applications with these processors. It shows a bit of the broader trend in technology: since the transformer breakthrough we’ve seen the rise of AI and LLMs, and models that once required tons of compute have now been optimized to run on a normal smartphone. So imagine the possibilities once phones have powerful enough processors. AR+AI becoming more prevalent after the hype curve of the past few years could bring about a new way of interacting with the physical world, imbuing it with extra functionality and capabilities.
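To make the “LLMs on a normal smartphone” point a bit more concrete, here’s a minimal sketch of running a small quantized model locally using the llama-cpp-python bindings (the same llama.cpp engine has builds that run on phone-class ARM hardware). The model filename is hypothetical; any 4-bit GGUF model that fits in a few GB of memory would do:

```python
# Minimal sketch: running a small, quantized LLM locally with llama-cpp-python.
# The model filename below is hypothetical -- swap in any 4-bit GGUF model
# small enough to fit a phone-class memory budget.
from llama_cpp import Llama

llm = Llama(
    model_path="tiny-chat-model.Q4_K_M.gguf",  # hypothetical quantized model file
    n_ctx=2048,    # modest context window to keep memory use low
    n_threads=4,   # a handful of CPU cores, roughly a phone's "big" cores
)

out = llm("Q: What does AR stand for? A:", max_tokens=32)
print(out["choices"][0]["text"].strip())
```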
Smartphones are the Powerhouse of AR
Augmented Reality (AR), for those out of the loop, is when we add a layer of the digital world on top of the physical one, using a combination of computer vision and some cool graphical interfaces. It can also work by scanning a specific pattern or image in the real world, which is then processed in real time into something extra on the screen of the device in front of us. It’s a very interesting concept used in many things, from Google Maps, which can complement your view in real time, telling you which stores are in front of you or where you are based not only on your GPS location but on what your phone’s camera is seeing, to well-known mobile games such as Pokemon Go, where the Pokemon appear in your phone’s view as if they were standing in front of you in the real world.
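That “scan a pattern, overlay something on it” loop is simpler than it sounds. Here’s a rough sketch of the core idea using OpenCV’s ArUco markers (OpenCV 4.7+ API); the “store” label is just a stand-in, and a real AR app would render 3D content with something like ARKit or ARCore instead of drawing text:

```python
# Rough sketch of the AR core loop: detect a known marker in the camera feed
# and draw something "extra" anchored to it. Uses OpenCV's ArUco module (>= 4.7).
import cv2

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # default webcam standing in for a phone camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            # Anchor a placeholder "digital layer" label to the marker's first corner.
            x, y = marker_corners[0][0].astype(int)
            cv2.putText(frame, f"store #{marker_id}: open now", (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("AR sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```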
And I think this is where the HYPE bit off more than it could chew. Meta boldly diverted all of its horses toward virtual reality and the Metaverse (I mean, they changed the name of the damned company), and it seems like they kinda screwed up on that bet. But that’s OK; life is filled with bold bets that sometimes pay off and other times are just huge blunders. Also, do y’all remember Google Glass, another device way ahead of its time that kinda stumbled its way into existence and then got killed before it could reach the broad market? Personally, I think that had more to do with a transitory period in how we perceived our privacy than with technical limitations.
Both of these failures could be called early pioneers, or taken as the first signs that AR is dead on arrival. But I don’t think so: it’s not that VR and AR aren’t great technologies that could provide a lot of value to humanity; they were just steps in the right direction taken at the wrong time. (Like so many steps the rest of us have taken in our own lives.)
Do you remember the HYPE cycle we discussed a couple of articles ago, about how tech rides a rollercoaster of expectations and investor money until it comes crashing down when reality catches up with those expectations? I think we’re there with VR, but with AR I think we’re already slowly climbing out of the trough of disillusionment and onto the slope of enlightenment (god, now that I re-read them, these terms sound so much like corpo consulting lingo...). This new iPhone and the next generation of smartphones, complemented with different AI tools, might be what carries AR into the mainstream.
The implications for everyday life are amazing: stuff like virtual shopping, or enhanced navigation connected to the cameras, lidars, and sensors of semi- or fully automated vehicles, offers a lot more value and functionality. But I think the deeper implications are more interesting. We are already so screen-centric, mostly living in fully digital worlds, dealing with screens and user interfaces designed for a single purpose. An AR-imbued future offers a way out of that screen-centric reality: more immersive experiences through our devices instead of living INSIDE of them.
Blending the Digital with the Physical
I think we should sometimes look at technology through the eyes of a medieval peasant, in a certain way (not fully, because then we would just burn everything and shout “HERESY!”, since every piece of technology might seem like an affront to God). Seen that way, AR is like discovering a secret world: a hidden world of magic living on an altered plane of reality that can only be accessed with a wizarding language and special stones you can see through, revealing the information that lies hidden underneath the real world.
There are many uses we could get out of having this secret world at our fingertips. A lot of specialized manufacturing or repair work could be enhanced with an AR overlay, so workers can check parts and compare specs against factory standards. Education could be enhanced with AR so students can interact with what they are learning and visualize it far more clearly than with printed sheets of paper, a YouTube video at best, or the sub-par drawing skills of an underpaid teacher at worst. It also lets this scale at low cost, so millions of students could benefit from a single app or tool that can be used and adapted by the professor who is actually there on the front lines.
It could also be used for medical procedures. Just as with repair or mechanical work, where an overlay shows how things work or should be working, you could combine a computer-vision camera with augmented reality and AI to give a doctor or surgeon a real-time assistant providing guidance, tips, and information about the patient’s condition, with insight drawn from hundreds of thousands or even millions of cases, helping physicians provide better care. Or, for example, rescue teams could have an augmented reality display that, alongside WiFi vision, helps identify people trapped under rubble and check their vital signs in real time, so they can first focus on victims with noticeably higher chances of survival, or those not buried under complex debris, and then go and save those in more dire situations.
These are very basic examples; no doubt smarter people with far more creative brains could come up with hundreds more applications and solutions with better market fit, so maybe we will be seeing very interesting things in this field soon enough.
Virtual Reality: The Next Frontier?
But no discussion of augmented reality is complete without virtual reality, because even though each has very different uses and applications, let’s be honest, one will probably follow the other. If you are able to imbue reality with an extra layer of information and meaning, why not simply create a NEW reality, with its own information, meaning, rules of physics, and any other parameter you can think of, and simulate that world? Computing power is the only limiting factor on human creativity in this field (and in many others, I’d say).
And even though Meta’s Metaverse (that’s a dumb name) failed, alongside others that failed even more miserably, companies like Activision Blizzard have also gone VERY quiet about the development of their own metaverse. For those out of the loop, that one consisted more of a seamless integration of different types of games and styles within the same game engine or game world, which, if you think about it, Roblox kinda pulled off on a small scale, with a lot more abusive contracts and child labor!
Now, that doesn’t mean there isn’t still a lot of potential in virtual reality; I just don’t see it being adopted until we get REALLY good glasses that weigh next to nothing and are easy to use (no, Apple’s VR headset isn’t even close to mass adoption yet). The other alternative is connecting human brains directly to a virtual world in a safe way that doesn’t mean drilling holes in people’s skulls. I’m a transhumanist at heart and all in favor of transcending our biological limitations (from the moment I understood the weakness of my flesh, and all that), but I understand the resistance many people (myself included) have toward these technologies: safety is one of the key concerns whenever we talk about biological-synthetic integration, especially ANYTHING related to the brain. Sadly, there aren’t yet any alternatives or options that would let us overcome these hurdles anytime soon.
Beyond The Metaverse
Now, let’s be honest: the “metaverse” is a sci-fi term (coined in Neal Stephenson’s Snow Crash) that was repurposed by marketing agencies and the tech industry. But the reality is that virtual reality is always something of a slippery slope as a technology, because for me it all boils down to simulation theory. Once we get one good virtual reality display/interface/device, it’s only a matter of time until we can create a virtual reality that is indistinguishable from the real world, and that’s when things can get dangerous. Of course, it depends on how we interact with that reality: you might see and hear as if you’re there while your body stays in meatspace. But once we start connecting our brains to a virtual world, we get ourselves a “Matrix-like” scenario.
That does not mean we will use humans as batteries (which, honestly, is a very dumb and inefficient way to generate power), but it is existentially dangerous, because how do we know WE are not living in a simulation? How can you know for sure that this is baseline reality and not a simulation run one layer above us, and so on and so on?
But beyond the existential dread, I do believe the technology holds a lot of potential once we get there, because at that point every single one of us could become a literal god-like being in a virtual world, able to manipulate the parameters behind the laws of physics and “play” in an alternate reality with lower gravity, or without entropy, or with so many other tweaks to how the world works. It will be very interesting to see where we go from there. Will humanity explore the cosmos, or even leave the house, if we can live in and explore infinite virtual realities from the comfort of our homes/simulation pods? It’s a difficult call to make.
Wait, so are we in baseline reality?
Honestly, people smarter than me claim we are most likely NOT in baseline reality, so there’s not a lot we can do to change that. And even if we could, would we? What would that entail? Would our simulation be turned off, or would we “wake up” in a sci-fi world once the conditions for the simulation are met? Who knows. So, yeah: iPhones are becoming more and more powerful thanks to the demands of AI and Moore’s law, which will bring a possible wave of augmented reality applications in the short term. That’s great, and I find it fascinating; hopefully you do too, and you can enjoy the benefits of this technology without falling into existential questioning whenever someone mentions VR and the Metaverse.