What's new/coming in consumer wireless techs and the role of standards

In a recent article, I spent some time talking about various wired connectivity protocols used in automotive. In this article, I cover what's coming for popular consumer wireless technologies, namely Wifi, cellular (5G, 6G), Bluetooth and a few other emerging protocols. But before that, let me highlight some basic attributes of how these technologies all work.

The basics of wireless technologies

Wireless protocols all rely on emitters and receivers that use resonators to create invisible "waves" that can spread through air or other materials (including walls, human flesh, etc...). Antennas transmit and "listen to" or monitor these waves in a given "band" that is shared amongst all participating devices. Modern wireless technologies typically operate in several bands, and strict rules apply to the use of the radio spectrum within a standard.

Amongst the reasons for wireless standards to exist is the need to ratify how digital information is "mapped" onto these waves (as a change in amplitude or a slight change in frequency, which is called "modulation"), and also to define "good neighbour" policies governing when devices can emit or receive information. Unlike a wired protocol like Ethernet, where traffic is only shared with devices connected to the same wire, wireless traffic is truly shared amongst all devices operating in the same band, and sometimes multiple concurrent standards even compete for access to a given slice of the radio spectrum (like the 2.4GHz band used by Wifi, Bluetooth and other protocols). A microwave oven operates in the 2.4GHz band to agitate the molecules of your favourite dish and generate heat - which is why it isn't rare for Wifi transmissions to degrade when your phone is in the kitchen while the oven is running.
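To make "modulation" concrete, here is a toy sketch (in Python, not tied to any real standard) of mapping bits onto a carrier wave by tweaking either its amplitude or its frequency; the carrier cycles and sample counts are made-up values chosen purely for readability:

```python
import numpy as np

# Toy illustration of "modulation": encode each bit by changing either the
# amplitude of the wave (amplitude keying) or its frequency (frequency keying).
# The numbers are arbitrary demo values, not those of any real standard.

SAMPLES_PER_SYMBOL = 64
CARRIER_CYCLES = 8          # carrier cycles per transmitted bit (toy value)

def modulate(bits, scheme="ask"):
    t = np.linspace(0.0, 1.0, SAMPLES_PER_SYMBOL, endpoint=False)
    chunks = []
    for b in bits:
        if scheme == "ask":     # amplitude keying: the wave gets "louder" for a 1
            amp, cycles = (1.0 if b else 0.2), CARRIER_CYCLES
        else:                   # frequency keying: the wave gets slightly "higher" for a 1
            amp, cycles = 1.0, CARRIER_CYCLES + (1 if b else -1)
        chunks.append(amp * np.sin(2 * np.pi * cycles * t))
    return np.concatenate(chunks)

waveform = modulate([1, 0, 1, 1, 0], scheme="fsk")
print(waveform.shape)          # (320,) -> 5 bits x 64 samples each
```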

Wireless standards, and the successive generations of each standard, make different tradeoffs between transmission range, throughput (the maximum quantity of information you can send), latency (the time taken between sending and receiving digital "bits") and power consumption.

There are also general physical rules: the higher the band you use, the harder it is for radio waves to traverse obstacles. The higher the band you use in the radio spectrum, the more data you can potentially transmit per second. The more power you use, the longer the range you get. And the more devices share a band, the less overall bandwidth is available to each of them.
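To put a rough number on the first rule, here is a back-of-the-envelope free-space path-loss calculation (an idealised model with no walls or interference; the 10-metre distance is an arbitrary example):

```python
import math

# Free-space path loss grows with frequency: a simple way to see why higher
# bands reach less far, all else being equal. Idealised model, 10 m link.

C = 299_792_458  # speed of light, m/s

def free_space_path_loss_db(distance_m, freq_hz):
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

for label, f in [("Wifi 2.4GHz", 2.4e9), ("Wifi 5GHz", 5.0e9),
                 ("Wifi 6GHz", 6.0e9), ("5G mmwave 28GHz", 28e9)]:
    print(f"{label:>16}: {free_space_path_loss_db(10, f):5.1f} dB of loss over 10 m")
```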

Some of these basic rules (oversimplified but relevant) are the reason why radios are highly adaptive to their environment: for example, Wifi can deliberately slow down the modulation pace and feed more power to the antenna to cover more range, at the cost of limiting the amount of traffic (i.e., bandwidth) carried. All radios also have a "deconflicting" or "good neighbour" traffic-scheduling methodology. Imagine a room with 300 people talking in different languages at the same time with no one to orchestrate who is allowed to speak... Well, wireless standards all offer distributed or semi-distributed algorithms to ensure everyone has a chance to "talk" - that is, to transmit or receive information.
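A minimal sketch of that "good neighbour" idea, loosely inspired by (and far simpler than) the listen-before-talk contention real Wifi radios use; the window size and device counts are arbitrary:

```python
import random

# Every device that wants to talk waits a random number of idle "slots";
# the smallest wait wins the air, a tie means a collision and everyone
# retries with a larger window. A toy version of distributed scheduling.

def contend(num_devices, window=16, seed=None):
    rng = random.Random(seed)
    backoffs = [rng.randrange(window) for _ in range(num_devices)]
    winner_slot = min(backoffs)
    winners = [i for i, b in enumerate(backoffs) if b == winner_slot]
    if len(winners) == 1:
        return f"device {winners[0]} transmits after {winner_slot} idle slots"
    return f"collision amongst {len(winners)} devices: all retry with a larger window"

print(contend(num_devices=5, seed=1))
print(contend(num_devices=300, seed=1))   # a crowded room: collisions become likely
```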

Ok, that's it for the basics. Now let's delve into some of what's coming for the most popular consumer-facing standards that keep evolving, generation after generation.

Wifi

Wifi is entering its 7th generation (codename "802.11be" for the techies). Should you update your old router? Maybe. First, it's worth noting that Wifi, like Bluetooth, has done a great job at maintaining backward compatibility. It means that your old receiver or transmitter can still work with the newer stuff. It also means that if you want to benefit from the increased bandwidth (going from 9.6Gbps with Wifi 6 - "Gbps" meaning billions of bits per second - to a staggering 46Gbps, the equivalent of roughly 1,500 high-res pictures transmitted in less than a second), you are going to have to update both your router and your devices (laptop, phone, etc...). This is part of the reason why it takes time for end-users to benefit from wireless tech improvements (I'll go over that in more detail for the cellular case).
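A quick sanity check on that "1,500 pictures" figure, assuming (my assumption, not a spec number) roughly 4 MB per high-res photo, which lands in the same ballpark:

```python
# Back-of-the-envelope: how many ~4 MB photos fit into 46 Gbps for one second?

wifi7_peak_bits_per_s = 46e9
photo_bytes = 4 * 1024 * 1024          # ~4 MB per picture (assumed)
photos_per_second = wifi7_peak_bits_per_s / (photo_bytes * 8)
print(round(photos_per_second))        # ~1371 pictures per second, at peak rate
```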

Latency is going to get another boost with Wifi 7 (boost meaning that it will take less time for data to transit over the air). Why does it matter? Because for use cases like gaming or video calling, there must be no noticeable delay when transmitting pieces of information. Remember the early Skype video calls twenty years ago, where you could hear your own voice echoed back with a half-second delay... It was so annoying that you just wanted to hang up.

Nowadays, delays of anything less than 20ms (a fiftieth of a second, well under the duration of an eye blink) are sufficient for many use cases. It's worth noting that radio waves travel at up to the speed of light, roughly 300,000km/s, which means a signal can cover about 6,000km in 20ms - and even in the ideal case, circling the globe takes at least ~130ms.
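The same arithmetic, spelled out (speed-of-light propagation is the theoretical floor; real networks add processing, queuing and retransmissions on top):

```python
# How far can a signal travel in 20 ms, at best? Propagation at roughly the
# speed of light is the absolute lower bound on latency.

SPEED_OF_LIGHT_KM_S = 300_000
budget_ms = 20

distance_km = SPEED_OF_LIGHT_KM_S * (budget_ms / 1000)
earth_circumference_km = 40_075
around_the_world_ms = earth_circumference_km / SPEED_OF_LIGHT_KM_S * 1000

print(f"{budget_ms} ms covers about {distance_km:,.0f} km in free space")
print(f"circling the globe takes at least {around_the_world_ms:.0f} ms")
```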

Now, this is an ideal case. Remember that the air is shared and that your radio signal decays in "quality" with increased range, obstacles and congestion (the presence of other devices). So what Wifi 7 has done is devise new strategies to reduce delays in challenging environments, including "QoS" (Quality of Service) techniques where, depending on the type of traffic and the type of device, delay is prioritized and kept to a minimum.

Another thing Wifi 7 does to improve overall bandwidth is to spread traffic over more than one band in parallel. Let's take a moment to understand this. The marketing geniuses behind wifi router companies (Netgear, D-Link, Cisco, Asus, etc...) have done a great job explaining that their latest router can deliver x Gbps. The number they put out is very misleading because it assumes no interference, few obstacles, and exactly 3 devices connected to the router, each alone in a separate, independent band (your phone on the 2.4GHz band, your laptop on the 5GHz band and a third device on the recently approved 6GHz band).

The problem is that, before Wifi 7, you more or less had to run 3 different wifi networks in your household and manually decide which device would pair with which network. Even the geek that I am never bothered to do this.

With Wifi 7, traffic is spread across the available bands automatically (this is the "MLO" or Multi-Link Operation feature). Does that mean you get the full promised 46Gbps of aggregated bandwidth? No. But it does make it easier to tap into that potential.
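A toy sketch of the MLO idea, splitting one device's traffic across whatever each band can currently carry; the per-band capacities are invented for illustration and are not Wifi 7 figures:

```python
# Spread a device's traffic across all usable bands, proportionally to what
# each link can carry right now. Capacities below are made-up numbers.

links = {"2.4GHz": 0.6, "5GHz": 2.4, "6GHz": 4.0}   # usable Gbps per band (assumed)

def split_traffic(total_gbps, links):
    capacity = sum(links.values())
    carried = min(total_gbps, capacity)
    return {band: round(carried * share / capacity, 2) for band, share in links.items()}

print(split_traffic(total_gbps=5.0, links=links))
# e.g. {'2.4GHz': 0.43, '5GHz': 1.71, '6GHz': 2.86} -> aggregated automatically
```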

To the question "should you upgrade your router?", my rule of thumb is to wait 1 or 2 device generations, assuming you replace your cell phone every 2 or 3 years.

Cellular (5G, 6G)

Cellular technologies like 4G or the more recent 5G get their name from the fact that devices connect to a "cell": picture the large antenna towers you see everywhere (called base stations), each distributing traffic amongst the temporarily connected devices in a given geographical area. As you move around, your cell phone transparently switches its communication link from one base station to another without you even noticing (as a matter of fact, these standards were designed to accommodate significant device speeds, like >~200km/h for 4G).

Base stations are themselves interconnected using high-capacity "backhaul" links, which can be wired or wireless. Improvements in cellular technologies target bandwidth, latency and robustness to interference. The frequency bands used now go beyond 6GHz - up to 52GHz (people refer to "mmwave" radio because the higher the frequency, the shorter the wavelength, i.e. the distance between two peaks of the radio wave, which also correlates with shorter travel distance).
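For a sense of scale, the wavelength is simply the speed of light divided by the frequency; the frequencies below are common examples, and in the top 5G bands the waves shrink to roughly centimetre/millimetre size:

```python
# Wavelength = speed of light / frequency: higher bands mean shorter waves.

C = 299_792_458  # m/s

for label, f_hz in [("4G 800MHz", 0.8e9), ("Wifi 2.4GHz", 2.4e9),
                    ("5G mid-band 3.5GHz", 3.5e9), ("5G mmwave 28GHz", 28e9)]:
    wavelength_mm = C / f_hz * 1000
    print(f"{label:>20}: wavelength ~{wavelength_mm:6.1f} mm")
```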

Now, as I explained in the intro, the higher the frequency of the radio waves, the more "attenuation" you face when trying to transmit or receive information as the waves traverse obstacles. To mitigate this, "mmwave" base-station infrastructure needs more relay towers and a rethink of the cell density and overlap strategy (a fairly complex topic).

This is part of the reason why 5G is still being deployed: it represents a major infrastructure investment (imagine telco companies having to buy land to set up their antennas, particularly in highly populated areas where there is also a limit to the number of concurrent devices a single base station can handle).

This explains why any major update in cellular technology takes years to reach mass adoption (on average about 7 years), as the graphic below suggests:

[Figure: duration between the publication of a cellular standard and its mass adoption]

Similar to wifi, your high-end consumer device tends to already integrate the latest advances of the wireless standards, knowing that the infrastructure will ultimately catch up.

This represents big investments and bets for the electronics ecosystem, and that's just the tip of the iceberg.

"Telcos" (T-Mobile, Verizon, Att, etc...) have to buy "spectrum" to be allowed/legit to emit or receive over the air.

Yes, you heard me: unlike Wifi and Bluetooth, which use an "unlicensed" set of bands, the "air" isn't necessarily free in the cellular world.

In the US, the powerful FCC (Federal Communications Commission) auctions pieces of the available spectrum to the highest Telco bidder whenever new bands that haven't been used/regulated so far are opened up. In other words, if you want to dominate within a band, you buy the legitimate right to use that slice of the air for your business (and decide whether or not to share that bandwidth with others). Crazy, right? And we are talking billions of dollars! Check out this website: https://www.fcc.gov/auctions/archive

Of course, this raises the questions of what happens if you use a band without permission, who is checking, etc, etc... which I will not cover in this article (I recall a discussion with a Telco exec who once told me "buying spectrum is like buying a house - except you have former owners and immigrants still hiding in the garage" :)).

Ok, so what's next for cellular? After 5 comes 6, and the table below is just a rough guess at what people are talking about:

[Table: the main expected differences between 5G and 6G]

As always: first, more bandwidth, lower delay, and less energy used to receive or transmit (I forgot to mention that there is typically an asymmetry in energy consumption and available bandwidth between transmitting and receiving - simply put, the "data download" use case always fares better on these metrics than "upload").

More specific to cellular technologies, 6G promises crazy supported device speeds (as if 350km/h was not enough...) and some interesting device localization techniques. I think these are mostly of interest for indoor use cases, but there is an interesting outdoor by-product: since there will be even more base stations and cellular antennas, sheer triangulation will give you more position accuracy.
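To illustrate the triangulation idea (more precisely, trilateration), here is a toy least-squares sketch that recovers a device position from noisy distance estimates to a few base stations; all coordinates and noise values are invented:

```python
import numpy as np

# Given known base-station positions and estimated distances to each, solve a
# small least-squares problem for the device position. Toy data, toy noise.

rng = np.random.default_rng(0)
anchors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
true_pos = np.array([180.0, 320.0])
distances = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, 2.0, size=4)

# Linearise by subtracting the first range equation from the others.
A = 2 * (anchors[1:] - anchors[0])
b = (distances[0] ** 2 - distances[1:] ** 2
     + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))

estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", np.round(estimate, 1))   # close to (180, 320)
```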

Bluetooth

The Bluetooth standard is one of the few wireless standards that goes beyond defining how the radio should work (the "physical" layer) and, like the famous USB wired protocol, specifies how categories of devices (webcams, speakers, keyboards, mice, storage devices, etc...) should behave for interoperability reasons, starting with a standard way to advertise their capabilities.
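To make "advertising capabilities" a bit more concrete, a Bluetooth Low Energy device broadcasts a small payload made of [length, type, data] structures. Here is a minimal parser over a hand-crafted example payload (the byte values are illustrative; the AD type codes 0x01/0x03/0x09 come from the Bluetooth assigned numbers):

```python
# Parse a BLE advertising payload: a sequence of [length, AD type, data] fields.

AD_TYPES = {0x01: "Flags", 0x03: "16-bit Service UUIDs", 0x09: "Complete Local Name"}

def parse_advertisement(payload: bytes):
    fields, i = [], 0
    while i < len(payload):
        length = payload[i]
        if length == 0:
            break
        ad_type, data = payload[i + 1], payload[i + 2 : i + 1 + length]
        fields.append((AD_TYPES.get(ad_type, f"0x{ad_type:02X}"), data))
        i += 1 + length
    return fields

sample = bytes([0x02, 0x01, 0x06,            # Flags: LE General Discoverable
                0x03, 0x03, 0x0F, 0x18,      # 16-bit service UUID 0x180F (Battery)
                0x08, 0x09]) + b"Earbuds"    # Complete Local Name: "Earbuds"

for name, data in parse_advertisement(sample):
    print(f"{name}: {data.hex() if name != 'Complete Local Name' else data.decode()}")
```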

Audio is a prime example of what made Bluetooth successful, and the reason why most earbuds and modern wireless headsets integrate Bluetooth connectivity and can work across a vast ecosystem of hosts.

Bluetooth is still in its 5th generation, with incremental releases happening every ~2 years, extending the range from roughly 50 to 200 metres compared with earlier versions and improving battery life. It also continues to standardize audio-related features and features to aid with device localization.

Many Bluetooth devices already offer the ability to be paired with more than one host and to switch between them. This is particularly handy when you move your earbuds between your phone and your PC. Yet it used to be a convoluted operation requiring disconnecting, messing around with settings, etc... With the newer Bluetooth releases, you can easily switch audio streams between hosts. There are also notification provisions for folks with hearing aids.

The complexity of Bluetooth interoperability is often a curse for engineers, but if there is one thing Bluetooth does well, it's audio. It is also very frequently used as a side-channel to enroll a device onto a wifi network.

Will "matter" ever matter?

"matter" is a recent device discovery and pairing protocol initiated in 2019 by Amazon, Apple, Google and the "Zigbee" Alliance. Subsequent joining companies include IKEA, Schneider, Samsung, etc...

"Zigbee" is another wireless protocol used mainly for IoT devices (~mid-range, ~low bandwidth, ~very-low power consumption) that's largely unknown to the general public but popular in home automation equipment (light bulbs, door locks, etc...).

The goal of the current matter standard is to standardize the way devices advertise their presence and to facilitate secure pairing/provisioning across the Android, Apple and PC ecosystems. It builds on basic Internet protocols and is somewhat independent of the radio used (Wifi, Bluetooth, Zigbee, or a newcomer called "Thread") to let devices know about one another.

The primary device-enrollment technique suggested is scanning a QR code with your phone's camera, which I still find cumbersome.

While the goal of this standard is noble, I feel it lacks the magic of what Apple has achieved with their iPhone peripherals, where the minute you open the case of your AirPods, iOS wakes up and lets you confirm the pairing. I can't think of any better/simpler user interface, and matter isn't there yet (to be fair, its ambitions are also wider). Apple could achieve such a great experience because they control the production of both the phone and the AirPods in a very vertically integrated manner.

While having devices know about one another across wireless technologies has merit, I remain sceptical about the future success of matter. Time will tell. One thing the standard offers is "meshing", which is also available with Wifi and Bluetooth. The idea is to expand the range of a network by having devices relay traffic that doesn't necessarily concern them. In a way, this is what range extenders do, but range extenders are dedicated devices, whereas device meshing uses the radios of existing devices to route traffic optimally (in terms of range and/or bandwidth). It's like saying your Alexa device is also a router.
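A toy sketch of that relaying idea: devices that can "hear" each other form a graph, and traffic for an out-of-range destination hops through intermediate devices. The topology and device names below are invented, and real mesh protocols also weigh link quality rather than just hop count:

```python
from collections import deque

# Find the shortest relay path from one device to another over a small,
# made-up home mesh topology, using breadth-first search.

neighbours = {
    "router": ["living_room_speaker", "hallway_bulb"],
    "living_room_speaker": ["router", "kitchen_display"],
    "hallway_bulb": ["router", "bedroom_lock"],
    "kitchen_display": ["living_room_speaker"],
    "bedroom_lock": ["hallway_bulb"],
}

def route(src, dst):
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for hop in neighbours[path[-1]]:
            if hop not in seen:
                seen.add(hop)
                queue.append(path + [hop])
    return None

print(" -> ".join(route("router", "kitchen_display")))
# router -> living_room_speaker -> kitchen_display : the speaker relays the traffic
```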

The concept of a wireless mesh isn't new, but adoption has been slow, mainly because it calls for a lot of interop rules, more complex radio designs and... maybe wireless equipment providers have more commercial incentive to sell you range extenders than to promote mesh equipment...

Conclusion

Ok, this is a very, very short overview with deliberate simplifications.

"Frantz, you forgot to mention short-range UWB that's now available in most cell phones". True, that's a new standard that's emerging enabling high throughput data exchange between 2 phones or a phone and a display for example.

The thing is, if your phone doesn't clearly expose these new capabilities via the user experience or OS features, who cares...

I'd like to conclude with two broader thoughts: interoperability, and why we aren't trying to reduce the number of wireless radio standards.

On the subject of interoperability, the rule of thumb is that the more you try to standardize, the longer it takes to get everyone to agree, and the longer it takes to confirm that everyone actually agreed. While standards are useful, they may or may not be desirable (for example, if you have an approach that gives you a competitive edge, like Apple has with their AirPods experience).

The corollary could be articulated as: "well, given it takes so much effort to build standards, and some are quite similar or complementary, why not merge them to accelerate the interop pace?"

This is happening all the time, but the stakes can be huge. When I was working at Qualcomm, the idea of merging Wifi and Cellular was on the table, as Cellular was deemed a better radio than wifi, even in a household context. Whether you agree or not (I don't have an opinion, as things are changing too fast on that front), put yourself in the shoes of a telco that spent billions updating its infrastructure or buying spectrum and that, in the name of standards, would suddenly have to "give away" the return on those investments to a one-time, low-cost purchase of a wifi router.

The world of tech isn't a long quiet river, especially when business dynamics are concerned.

#wireless #wifi #5G #6G #bluetooth #MATTER #standards


I hope you enjoyed this article. As always, feel free to contact me @ [email protected] if you have comments or questions about this article (I am open to providing consulting services). More at www.lohier.com and also my book. Subscribe here to this newsletter and quickly access former editions.

