What do Apple's new AirPods tell us about its augmented reality glasses future?
Robert Scoble
Follow me on my new AI podcast, Unaligned. Tech industry color commentator since 1993. Author/Blogger. Former strategist @Microsoft.
Careful listeners to Apple's new AirPods Pro, which shipped yesterday, can hear some interesting things about the future. What have I heard?
- It improves the real world. The ability to move between not hearing the real world, hearing it, and "mixing" it with the media you are listening to or with phone/video calls is a lot better than on my other headphones. More on how it improves the real world in a bit.
- The audio reflects Apple's philosophy of human-factor design. You can see Apple's cameras warming people up, making some images a bit too yellow at times. Same with the audio. My other headphones, including the over-the-ear Sony WH-1000XM3 that costs almost twice as much, have better technical sound. Or so it seems at first. I find that in most cases, most of the time, Apple's sound is actually less stressful to listen to. For instance, my Pioneer Rayz headphones have more bass on first listen, but after a while the Pioneers become more tiring on your ears. Apple's audio choices are lighter and easier on the ears.
- The audio is better integrated into the experience. One example: I was working in a Google Doc, listening to music on Spotify, then clicked the microphone to do some speech-to-text transcription. On Apple's headphones the music level only dropped by about 60%; it kept playing even as the system listened along and turned my speech into text. When I tried the same thing on my other headphones (I don't remember ever being able to listen to music and transcribe by voice before), it didn't work. Worse, my other headphones weren't nearly as fast at transcribing. I had already noticed that Siri is a lot better on these headphones than on my others, and I think there's some secret sauce going on here.
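I can't see inside Apple's stack, but that "duck the music, keep it playing" behavior matches what iOS exposes to apps through AVAudioSession. Here's a minimal Swift sketch of how a dictation feature could ask for it; the category, mode, and option names are the real API, while the surrounding function is my own invention:

```swift
import AVFoundation

// Hypothetical dictation setup (my sketch, not Apple's code):
// capture the mic while other apps' audio, e.g. Spotify, keeps
// playing at a reduced level instead of pausing.
func configureAudioForDictation() throws {
    let session = AVAudioSession.sharedInstance()

    // .playAndRecord: capture the mic and play audio at the same time.
    // .spokenAudio: tune the session for speech content.
    // .duckOthers: lower, but don't stop, other apps' audio.
    // .allowBluetooth: let the route reach headphones like AirPods.
    try session.setCategory(.playAndRecord,
                            mode: .spokenAudio,
                            options: [.duckOthers, .allowBluetooth])
    try session.setActive(true)
}
```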
These earbuds also have more affordances for future features. Apple is already teaching you to touch them to switch the real world on and off. What if it did that automatically?
That is what Apple's glasses will excel at: turning the real world on and off, or up and down.
This morning I was at a Halloween event at my son's school. I was squeezing the stalks on my new AirPods, playing with the new mode that lets you hear the real world, which Apple calls "transparency" mode. The other mode removes most of the real world's noise, AKA "noise cancelling." Both worked very well. Even in that noisy crowd, when I set them to noise cancelling I heard near silence, and when I switched to transparency mode I could hear everything clearly, including my wife in front of me telling me "there's Ryan."
What's weird is that after about 10 minutes of wearing the headphones I took them off to hear the real world unaided. It wasn't as nice. The earphones were actually "augmenting" the real world, amplifying it.
That got me thinking back to my days at Microsoft, when I saw one of the first array microphones. About 15 years ago a researcher put four microphones into a bar on top of his computer and used software to "focus" them on someone speaking in front of it, removing noise from the room. My Pioneer Rayz headphones do just that with six microphones. Apple does the same with only two, and it works very well. People can hear me even in noisy situations, and I will do more testing on this over the next few weeks as my wife and I travel to Barcelona for VMware's big conference there (she's one of the people who works with all the speakers).
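For the curious, that "focusing" trick is classic delay-and-sum beamforming: shift each microphone's signal so that sound arriving from one chosen direction lines up, then add. Here's a short, illustrative Swift sketch of the idea, not Apple's or Pioneer's actual DSP; the names and the simple integer-sample shifting are mine:

```swift
import Foundation

// Illustrative delay-and-sum beamformer. Each mic's signal is
// time-shifted so sound from the chosen direction adds coherently,
// while off-axis noise averages toward zero.
let speedOfSound = 343.0    // meters per second
let sampleRate = 48_000.0   // samples per second

/// - signals: one equal-length sample buffer per microphone.
/// - micPositions: each mic's (x, y) position in meters.
/// - direction: unit vector from the array toward the talker.
func delayAndSum(signals: [[Double]],
                 micPositions: [(x: Double, y: Double)],
                 direction: (x: Double, y: Double)) -> [Double] {
    let count = signals[0].count
    var output = [Double](repeating: 0.0, count: count)

    for (mic, signal) in signals.enumerated() {
        // A mic closer to the talker hears the wavefront earlier; its
        // lead, in samples, is the projection of its position onto the
        // look direction divided by the speed of sound.
        let p = micPositions[mic]
        let leadSeconds = (p.x * direction.x + p.y * direction.y) / speedOfSound
        let leadSamples = Int((leadSeconds * sampleRate).rounded())

        // Undo the lead so all mics line up, then sum.
        for i in 0..<count {
            let j = i - leadSamples
            if j >= 0 && j < count {
                output[i] += signal[j]
            }
        }
    }
    // Normalize so the aligned talker keeps roughly original loudness.
    return output.map { $0 / Double(signals.count) }
}
```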
This "augmenting" and "mixing" technology, though, while subtle, really makes using these much more enjoyable and, even, increases my trust and love of the Apple brand and ecosystem. Today I was on a Zoom call with an entrepreneur, and a family member is in the hospital with cancer. I walked through the lobby, up the elevator, and around the floor to his room all while listening to my Irena Cronin, my partner, interview the entrepreneur. Then I switched into transparency mode to say hi to my wife and the family member, but he was busy with the doctor. I could switch easily between listening to the Zoom call and talking with people in the room by riding the mute button in Zoom and touching the stalks to go between noise cancelling mode, where I couldn't hear anyone in the room, or transparency mode, where I could hear everyone talking quite clearly, even as the Zoom call was going on.
That got me to realize just how Apple's glasses could change literally everything about how we work with other people. Let's say I'm in that room in the future, wearing the glasses. They could watch my eyes, watch whether people near me are looking or gesturing at me, and turn the volume up and down accordingly.
This might seem weird and antisocial as I describe it: "why are you wearing headphones while talking with a dude who has cancer?" But if it weren't for these headphones I couldn't have been in the room at all, so I wouldn't even have gone for the visit (I couldn't get out of the call). And to take it out of such an emotionally charged place, let's say we all go to a nightclub. I might wear these AirPods for hearing protection; they dramatically reduce the sound levels that reach my eardrums. But if you're trying to talk with me, the glasses will see that you are turned toward me and talking. They can turn on transparency mode automatically and focus the two microphones on your mouth. If they know where in space your mouth is, they can do that very effectively. Why? Spatial computing will see where your mouth is, via a 3D sensor on the front of the glasses; tell that you're my friend, through some form of face recognition (not the kind that raises the biggest privacy concerns); tell that you're actually talking to me, because your eyes are on me and your mouth is opening and closing while you do it; and turn on the microphone.
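To make that concrete, here's a purely hypothetical Swift sketch of the decision logic such glasses might run each frame. No such API exists today, and the per-person inputs below (face recognition, gaze, mouth movement) are assumptions:

```swift
// Hypothetical attention logic for future glasses; every type and
// signal here is invented for illustration.
struct NearbyPerson {
    let isKnownFriend: Bool      // from on-device face recognition
    let isLookingAtWearer: Bool  // from gaze estimation
    let isMouthMoving: Bool      // from the front-facing 3D sensor
}

enum AudioMode {
    case noiseCancelling  // real world off
    case transparency     // real world on
}

/// Open up the real world only when a friend is visibly addressing
/// the wearer; otherwise keep the noise cancelled.
func chooseMode(for people: [NearbyPerson]) -> AudioMode {
    let someoneAddressingMe = people.contains {
        $0.isKnownFriend && $0.isLookingAtWearer && $0.isMouthMoving
    }
    return someoneAddressingMe ? .transparency : .noiseCancelling
}
```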
If all three of us have the new headphones and new iPhones, we'll probably all have the new U1 chip inside, too. Why is that important? Brian Roemmele covers what it is and why it matters here.
Hint: Apple could build a new social network with this chip. It would know who is holding or wearing devices near you, and where they are, and it would have a high-speed network for sending data back and forth between your friends' devices. So a future headset could use the microphones your friends are wearing to further augment your hearing, adding more microphones to a focusable array. This could get quite nuts, especially in a place like a very noisy concert or nightclub.

Stadiums, like the one we visited in Las Vegas, already have 5G, which enables all sorts of wild video and audio sharing. Imagine going to a concert with friends and playing blackjack with them while the show goes on, your glasses/headphones turning the music up so you can enjoy the concert, then turning your friends up and the music slightly down so you can hear them crystal clear.

The use cases for this kind of thing are all over the place: better conversations in gyms, better Zoom calls while you walk around the real world, even more enjoyment of your front yard. As I type this I'm sitting out front waiting for trick-or-treaters. My headphones are removing the noise of the cars and traffic in front of my house, while letting me switch into talking mode as people come near.
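The ranging piece, at least, isn't speculative: Apple's NearbyInteraction framework exposes exactly this U1 capability, distance and direction to a peer device. A minimal Swift sketch; the session, configuration, and delegate calls are the real API, while exchanging discovery tokens with your friend's device (normally done over your own networking channel) is elided:

```swift
import NearbyInteraction

// Minimal sketch of U1 ranging to a friend's device via Apple's
// NearbyInteraction framework. How the peer's discovery token gets
// here (Bluetooth, your server, etc.) is left out.
final class FriendRanger: NSObject, NISessionDelegate {
    let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        // Range against one nearby peer device.
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called repeatedly as the U1 chip updates distance and direction.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let distance = object.distance {
                print("Friend is \(distance) meters away")
            }
            if let direction = object.direction {
                print("Direction (unit vector): \(direction)")
            }
        }
    }
}
```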
A big shift in how we think of wearables is coming, and these give us just a tiny taste.
More of my thinking in three Twitter videos:
Video on the new AirPods and why I'm bullish on Apple stock (Part I). Here's Part II. And a comparison to Sony's over-ear headphones.
Experienced Online Reputation Management (ORM) & SEO Strategist, SEO Director and SEO Consultant Professional Musician AaronKronis.music for more. New Album entitled, Generation XLNT out in 2025
5y: Dope article dude - I want the new AirPods to wear while playing sports
Properly designing and managing technology development engagements at global scale. Contact me at [email protected] or text "hello" to +1 215-804-7916
5y: It's too bad I'm one of those people who simply can't keep any Apple earbud/earpod in his head (or any of those similarly designed copycats). Sounds like really cool technology. I see people at the gym using these things during vigorous exercise, and they just stick in their ears as if they are glued there. I try these things and sit perfectly still doing Zen meditation and they leap out of my head as if propelled by compressed air.
CES 2025 · NEXTGEN DEVICES · XR · AI · 5G · Magenta Ambassador
5y: I've been making the case internally that the AirPods are a key component (along with the Apple Watch and iPhone) of the Apple PLAN (Personal Local Area Network) for headworn devices. Robert Scoble, your article cogently states why. Thank you. And thanks also for pointing out that the U1 chip and Ultra Wideband matter. One disappointment: I think Apple needs to get over its self-marketing and deliver more color options for the AirPods. They should talk to people who wear hearing aids.
Founder & CEO | Developing Innovative Therapeutic and Wellness Solutions
5y: The product now sounds really compelling, though I wonder whether in some cases real-world noise gets cancelled when some imminent harm should be heard.
COO - Trickshot #sport #3D #web3 #XR #AI #socialmedia #digital #innovation
5y: Great article, capturing the future of an AR world beyond visuals.