5G & Edge Computing: Complementary Technologies Poised to Change the World
Wayne Sadin
CIO of PriceSmart, the only operator of membership warehouse clubs in Central America, the Caribbean, and Colombia
You can’t pick up a business magazine these days without seeing headlines like, ‘5G: magic formula for fixing everything’ (or something like that)…but most of us can’t access 5G today. You don’t get through a day at work or at play without being exposed to Edge Computing…but you may not realize it and you rarely read about it unless, like me, you work in IT.
Let’s take a look at Edge Computing and 5G to see how burgeoning 5G deployments, plus our increasing appetite for Edge Computing, will make a huge difference in our work and personal lives:
The Edge of Computing Has Evolved
Fifty years ago, when mainframes ruled the world of computing, things were simple. Computers were huge, expensive groups of refrigerator-sized cabinets (containing storage, memory, compute, I/O, networking) hooked together in central datacenters. We interacted with mainframes using slow input/output devices like card readers, ‘dumb’ terminals, typewriter-style printers, and maybe—in advanced factories and labs—sensors that measured real-world variables like temperature, pressure, speed, etc. These I/O devices were connected to the mainframe via painfully slow networks (speeds of 2.4Kbps-9.6Kbps were common)—and we rejoiced that we didn’t have to bring ourselves to ‘terminal rooms’ directly wired into the mainframe! Back then, we had no ‘Edge Computing’: every bit of data was sent back to the mainframe for processing, and that was that.
Today most of us own and use a variety of ‘computers,’ each of which is far more powerful than the mainframes of old: smartphones, tablets, laptops, automobiles, thermostats—and at work we likely use sophisticated data acquisition tools that measure hundreds or thousands of variables; data visualization tools from big-screen displays to immersive AR/VR devices; and robotic devices that open valves, move material, even assist with surgery. Every one of these 5 billion+ phone devices[i] plus 20B ‘Internet of Things’ (IoT) devices[ii] is a computer, and every one of them is connected through high-speed circuits to insanely powerful central computing arrays owned by your employer or operated by hyperscale cloud firms like Microsoft, Amazon, Tencent, etc.
To make matters worse, most of the data we generate and the data we need doesn’t come from stationary devices that can be connected via wired networks: we take our phones with us, we drive our cars, we carry our tools to jobsites. Mobility is taken for granted today, and our technology must reflect that.
These smartphones & IoT devices—these powerful computers—are ‘Edge Computers,’ because they operate far from the buildings stuffed full of servers that constitute the ‘Core’ of computing. “So what?” you might ask. What’s the problem with connecting 25B or so powerful computers to the Internet? The problem is data—oceans and oceans of data. So much data that current networks can’t handle it (according to IDC, IoT devices alone will generate 79 zettabytes of data/year by 2025[iii] (that’s 79,000,000,000,000,000,000,000 bytes).
This data can be immensely valuable, so we don’t want to lose it. Whether it’s medical telemetry (Dr. Nick van Terheyden predicts that continuous blood pressure monitoring will lead to medical breakthroughs[iv]), or equipment data that allows your field techs to perform maintenance pre-failure rather than post-failure, or data on product usage that allows your firm to design ever-better products, data collected at the Edge will transform how we interact with the physical world.
In addition to inbound data flowing from the edge to the core, outbound data is transformative: surgeons controlling robotic manipulators from afar[v], vehicle-to-infrastructure (V2I)[vi] networking giving self-driving vehicles the coordination they need to reduce congestion, and many other applications for controlling the physical world via digital connections.
One solution to this data overload is to move some parts of the central computing infrastructure outside the datacenter and put them closer to the edge (where you and I and our things reside). This way, edge data flows only part of the way through the network—to the edge computing node—which then processes the data to some extent, stores some of it locally, and decides what data must be sent along to the Core and what decisions must be referred to the Core. This sounds good in theory, but it creates a number of practical issues: the extra cost of building and maintaining many (many, many) computing nodes far from data centers, the added delay introduced by node-level processing (when core processing is also required), the increased complexity of data synchronization across a network of intermediate nodes, and so on. As long as data generated at the edge of our network grows faster than our ability to move that data from edge to core, our options are limited and our trade-offs awkward.
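A minimal sketch of that filter-store-forward pattern at an edge node (the names, threshold, and uplink here are hypothetical; real edge platforms add buffering, synchronization, and failover):

```python
# Sketch of an edge node's filter-store-forward loop (hypothetical interfaces).
from statistics import mean

LOCAL_STORE = []       # stands in for node-local storage
THRESHOLD = 90.0       # readings above this are escalated to the Core

def send_to_core(payload):
    """Placeholder for a real uplink to the central datacenter."""
    print("-> core:", payload)

def handle_batch(readings):
    """Process one batch of sensor readings at the edge node."""
    LOCAL_STORE.extend(readings)                  # keep the raw data locally
    summary = {"count": len(readings), "avg": round(mean(readings), 1)}
    alerts = [r for r in readings if r > THRESHOLD]
    send_to_core(summary)                         # small summary, not raw data
    if alerts:
        send_to_core({"alerts": alerts})          # escalate only the exceptions
    return summary

handle_batch([72.1, 88.4, 93.7, 70.2])
```

The design point is that the Core sees a small summary plus the exceptional readings, not the raw stream, trading network load for node complexity.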
Which brings us to 5G
Wireless cellular networks have been in operation since around 1980. 2G[vii] networks (the first digital cellular generation) arrived in the early 1990s and ran at 50Kbps; 3G (1998) ran at 200Kbps; 4G (2009) runs at around 50Mbps today (though, as we know, that varies widely). Going from 50Kbps to 50Mbps in 20 years (1000x) sounds impressive, but compare it with network usage growth over a roughly similar period[viii].
When network traffic is growing 100x as fast as network speeds—or more—the problem looks serious, and the complexities of edge/core computing start to seem like necessary trade-offs. Unless some technology comes along that radically improves network capacity—and that technology is 5G!
These days (mid-2019) US cellular carriers are just starting to deploy their 5G networks, so data is hard to come by. But speeds of 1.3Gbps (26x faster than 4G) were demonstrated by AT&T[ix] in Dallas, and NTT[x] demonstrated 27Gbps (540x faster) last year. While these test-bed results are interesting and demonstrate what the technology can deliver, it will be a while before we can say what ‘typical’ 5G performance will be. But it seems safe to say some parts of the US will get 10x better wireless performance in a year or two.
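As a rough illustration of what those bandwidth figures mean in practice (using the speeds cited above; real-world throughput will of course vary), consider how long it takes to move a 2 GB video file:

```python
# Time to transfer a 2 GB file at the speeds cited above.
FILE_BITS = 2 * 8 * 10**9            # 2 GB expressed in bits

for label, mbps in [("4G (~50 Mbps)", 50),
                    ("AT&T 5G test (1.3 Gbps)", 1_300),
                    ("NTT 5G demo (27 Gbps)", 27_000)]:
    seconds = FILE_BITS / (mbps * 10**6)
    print(f"{label:25s} {seconds:7.1f} s")
# 4G: ~320 s; 1.3 Gbps: ~12.3 s; 27 Gbps: ~0.6 s
```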
As impressive as these increased bandwidth figures are, 5G latency improvements could be even more significant in the real world.
For example, mobile devices will increasingly handle demanding, data-rich applications in every imaginable environment. Imagine video conferencing plus real-time data acquisition from a rugged laptop or tablet at the scene of an accident, showing an emergency-room doctor what’s happening on the ground. In warehouses and across supply chains, 5G could provide ubiquitous coverage for huge numbers of sensors and robotic actuators, rather than requiring multiple types of connectivity. In manufacturing, 5G-equipped devices can enable real-time 3D modeling and live feeds from production lines around the globe.
Let’s discuss network latency and why it matters.
Latency vs Bandwidth; Where 5G Meets Edge
Networks are often compared to pipes full of water, where ‘bandwidth’ equals ‘flow rate,’ but that’s not quite accurate. Instead, think of the network as a conveyor belt carrying boxes (packets) of information. If you want to send more packets you can speed up the conveyor belt (reduce latency) or make it wider and put more boxes side by side (increase bandwidth). Sounds like both techniques work, right? Here’s the twist: every so often the conveyor belt pauses while a message flows backward to the belt loaders saying “OK” or “Hold Up.” That message flows backward at the same speed the conveyor moves. With this complication, are you better off making a wider conveyor (more bandwidth) or a faster conveyor (lower latency)? For traffic with lots of acknowledgments (interactive traffic, like a phone call), faster is better, because each return message spends less time on the belt. For some traffic (data backups, for instance) with fewer acknowledgments and lots of data, wider (bandwidth) might be better—but that’s not typical of the traffic sent and received by the billions of phones and IoT devices we’re talking about.
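The conveyor-belt trade-off can be made concrete with the standard window/round-trip relationship: a sender can have only so much unacknowledged data ‘on the belt’ at once, so throughput is capped at window ÷ round-trip time, no matter how wide the pipe is. A simplified model (real protocols such as TCP layer congestion control on top of this):

```python
# Acknowledgment-limited throughput: you can't go faster than window / RTT,
# no matter how wide the pipe is.
def effective_mbps(bandwidth_mbps, rtt_ms, window_kb=64):
    window_bits = window_kb * 8 * 1000               # unacknowledged data in flight
    ack_limit_mbps = window_bits / (rtt_ms / 1000) / 10**6
    return min(bandwidth_mbps, ack_limit_mbps)

print(effective_mbps(bandwidth_mbps=1300, rtt_ms=40))  # 4G-like RTT: ~12.8 Mbps
print(effective_mbps(bandwidth_mbps=1300, rtt_ms=2))   # 5G-like RTT: ~256 Mbps
```

With a 4G-like 40 ms round trip, a 64 KB window caps you at about 13 Mbps even on a 1.3 Gbps link; cut the round trip to 2 ms and the same window sustains about 256 Mbps.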
The other important thing about improved latency is that there is a class of problems—known as ‘hard real-time’ problems—where the time it takes to go from (Edge) sensing, to (Core or Edge Node) deciding, to (Edge) acting is critical. If a robotic machine tool needs guidance from an ‘upstream’ computer (Core or Edge Node) to move a cutting tool as a part moves, delays may result in a ruined part (imagine instead that this was robotic surgery or drone piloting and you can see how important end-to-end round-trip response time can be). The smaller the network latency, the more time the Edge and Core computing devices have for sensing, deciding, and acting without causing an error.
Today’s 4G has latencies of 30 milliseconds to 50 milliseconds, whereas 5G should be just a few milliseconds, potentially 15x to 50x better. And now you see why this 15x-50x improvement is as important as, or even more important than, a 26x improvement in bandwidth for many real-world applications. It allows the conventional ‘central computing’ model to support more mobile devices streaming more data, and it supports hard-real-time physical/digital processes using the same central computing architecture we’ve run for decades.
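In round-trip terms, here is what that difference leaves for actual computation inside a fixed control cycle (a sketch; the 20 ms cycle, i.e. a 50 Hz control loop, is an assumed figure for illustration):

```python
# Time left for deciding and acting inside a fixed control cycle,
# once the network round trip is paid.
def compute_budget_ms(cycle_ms, network_round_trip_ms):
    return cycle_ms - network_round_trip_ms

CYCLE_MS = 20                              # assumed 50 Hz control loop
print(compute_budget_ms(CYCLE_MS, 40))     # 4G-class latency: -20 ms, deadline blown
print(compute_budget_ms(CYCLE_MS, 3))      # 5G-class latency: 17 ms of headroom
```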
To recap: we discussed the value of the ever-increasing amounts of data collected via mobile (Edge) devices; outlined the problem of data growth outpacing network improvement over time; sketched a possible, albeit complex and expensive, architecture of distributed Nodes; and explained how 5G can enable retention of today’s central computing architecture in the face of data growth and increasingly real-time applications.
Mobility has already changed the business world. 5G plus smarter edge (mobile) devices will accelerate that rate of change—in the nick of time!
This post is brought to you by Panasonic and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Panasonic.
[ii] https://www.gartner.com/imagesrv/books/iot/iotEbook_digital.pdf
[iii] https://futureiot.tech/idc-forecasts-connected-iot-devices-to-generate-79-4zb-of-data-in-2025/
[v] https://en.wikipedia.org/wiki/Remote_surgery
[vii] https://en.wikipedia.org/wiki/List_of_mobile_phone_generations
[viii] https://www.sciencedirect.com/science/article/pii/S0308596113001900
[ix] https://www.pcmag.com/news/367757/at-t-5g-tested-it-hit-1-3gbps-speeds
[x] https://www.nttdocomo.co.jp/english/info/media_center/pr/2018/1122_00.html
Building great teams and delivering value from technology investments.
5 years ago · Wayne, quite a good article buddy... explains the concepts and impact very well. Insightful too. Well done. My thoughts as follows. As computing technology migrates to the edge, companies will probably figure out a couple of things:
- functionality needs to be assessed differently... the target environments will be way more diverse and finicky
- considerations of what should be at the edge from a functionality and storage standpoint are complicated. Not like what-goes-where wasn't already complicated, but now they are complicated AND different from a requirements, performance, and data utilization standpoint
- the release management cycle they know (and don't love) will change even further as the number of tech stacks that must be considered grows
- a new revolution of hybrid app/embedded technology might just be brewing
- at a minimum, technologists who have learned many lessons from the mobile app development experience will look to drive toward standards here early
Keep these great articles coming, they are great thought generators. Talk soon and stay well. Paul
Director @ ABS | TOGAF EA, Cloud Transformation, DevOps, ITIL
5 years ago · Wayne Sadin -- Transformational CIO, CTO, CDO, that is a great article! And I love the bar chart comparing voice to data! Pictures being worth a thousand words, if #boards and the #csuite are not careful they will find themselves unable to align their #compute resources with their #bigdata!