Edge Computing - Didn't we do that already?
In a world where compute and storage are making a general exodus into the cloud, the idea of pulling some of it back into local data centers may seem confusing. "I just finished my cloud migration, why would I even dream of pulling it back now?" is a perfectly legitimate response. The answer lies in the fact that IT is continually evolving, and with each evolution the "right fit" for an application or function can gradually shift. This won't apply to every function in your IT estate, but for those that require instant access to large data sets, extremely low-latency connections, or a large number of IoT sensors and devices at a single location, Edge Computing will become a necessary use case.
As the cloud becomes more prevalent, bandwidth costs, data transmission costs, and latency continue to rise. Applications that once relied on a nearby endpoint now have their traffic travel through many more access points to reach you or your users. Edge Computing solves this by placing the key data in devices in a data center near you, or at your own location. These devices may serve simply as a hub that provides data, or they may provide the compute you need for your location. In the hub scenario, a company can use a service like Amazon Redshift to run its analytics in the cloud and then download the results on a timed basis to edge hubs, where local users can access them quickly and easily, avoiding the data transmission costs of users repeatedly requesting duplicate data. With AI, the system can learn which data users at each edge location are most likely to need, automatically stage just that data on the edge hubs ready for consumption, and pull from the cloud only when something is missing.
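To make that pattern concrete, here is a minimal sketch of an edge-hub cache along those lines. It is not a specific vendor product: `fetch_from_cloud` is a hypothetical placeholder for whatever pulls results from your warehouse (for example, an exported Redshift query result), and a simple frequency counter stands in for the "AI" that learns what each site actually uses.

```python
# Minimal sketch of an edge-hub cache that pre-loads the data users at this
# site are most likely to request, falling back to the cloud only on a miss.
# fetch_from_cloud() is a hypothetical placeholder; the frequency counter
# stands in for a more sophisticated prediction model.
from collections import Counter


class EdgeHubCache:
    def __init__(self, fetch_from_cloud, prefetch_top_n=10):
        self.fetch_from_cloud = fetch_from_cloud  # cloud fallback (slower, costs bandwidth)
        self.prefetch_top_n = prefetch_top_n
        self.cache = {}                  # dataset_id -> data held locally on the hub
        self.request_counts = Counter()  # access history for this edge site

    def get(self, dataset_id):
        """Serve from the local hub when possible; record every request."""
        self.request_counts[dataset_id] += 1
        if dataset_id not in self.cache:
            # Cache miss: pay the transmission cost once, then keep it local.
            self.cache[dataset_id] = self.fetch_from_cloud(dataset_id)
        return self.cache[dataset_id]

    def scheduled_prefetch(self):
        """Timed job: refresh only the datasets this site actually uses."""
        for dataset_id, _ in self.request_counts.most_common(self.prefetch_top_n):
            self.cache[dataset_id] = self.fetch_from_cloud(dataset_id)
```

Run `scheduled_prefetch()` on whatever cadence the analytics refresh follows, and local users hit the hub instead of the warehouse for everything they routinely consume.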
But the real winner here is IoT. IoT data collectors are usually simple sensors with no data analysis capabilities of their own; they have to send their data to a cloud analysis tool to learn and execute improvements. That works when you are after long-term, generalized optimization, but what if you want your IoT devices to analyze the current situation and react accordingly? One option is to add a lot of processing power to the device itself, which can be costly, drain power, and add so much bulk that the device no longer works efficiently at its original task. The other is to have it offload the data to a local edge computing platform which, combined with 5G connectivity, lets it act as a fully thinking, networked unit. Adding AI and Big Data analytics to your edge computing means all of your local IoT devices now share a low-latency brain, allowing them to learn, interact, and share their results with other IoT devices on the edge. This will be a critical component of the Intelligent Era, allowing cars to share data immediately on road conditions ahead, factory robots to be aware of delays and avoid other robots on the floor, and services such as remote surgery to operate in real time.
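A minimal sketch of that offload pattern is below, assuming a hypothetical edge analytics service reachable at a local address. The URL, payload shape, and "act on the verdict" step are illustrative assumptions, not any specific vendor API; the point is that the round trip stays inside the site, so the timeout can be tight.

```python
# Minimal sketch of a low-power sensor offloading its readings to a nearby
# edge node for analysis instead of a distant cloud endpoint. The endpoint,
# payload, and response fields are illustrative assumptions.
import json
import urllib.request

EDGE_ANALYTICS_URL = "http://edge-node.local:8080/analyze"  # hypothetical local edge service


def offload_reading(sensor_id: str, reading: dict) -> dict:
    """Send one reading to the local edge node and return its decision."""
    payload = json.dumps({"sensor_id": sensor_id, "reading": reading}).encode()
    request = urllib.request.Request(
        EDGE_ANALYTICS_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # A 500 ms budget is realistic only because the edge node is on-site.
    with urllib.request.urlopen(request, timeout=0.5) as response:
        return json.load(response)


if __name__ == "__main__":
    # Example: a road sensor reports ice and reacts to the edge node's verdict.
    verdict = offload_reading("road-sensor-17", {"surface_temp_c": -2.1, "grip": 0.3})
    if verdict.get("hazard"):
        print("Broadcast hazard warning to nearby vehicles")
```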
Citrix recently demoed a security system that grants access to resources such as a meeting room based on your location. With Edge Computing, your IoT devices can confirm the user is on-site using tools other than GPS, which can be defeated: your own local security devices, such as face recognition, RF tracking, and biometrics, can verify that the user really is in the office they claim to be in, all in real time and without additional spend on bandwidth or latency to the cloud.
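As a rough illustration of that kind of check, the sketch below combines several local signals before granting access rather than trusting any single one. The signal sources and thresholds are assumptions for the example, not the Citrix implementation.

```python
# Minimal sketch of an on-site presence check evaluated locally at the edge:
# require multiple independent confirmations before unlocking the room.
from dataclasses import dataclass


@dataclass
class PresenceSignals:
    face_match_score: float   # 0.0-1.0 from a local face-recognition camera
    rf_badge_seen: bool       # badge detected by an in-building RF reader
    biometric_verified: bool  # e.g. fingerprint at the meeting-room door


def grant_meeting_room_access(signals: PresenceSignals) -> bool:
    """Require at least two independent on-site confirmations."""
    confirmations = sum([
        signals.face_match_score >= 0.9,
        signals.rf_badge_seen,
        signals.biometric_verified,
    ])
    return confirmations >= 2


# Example: a strong face match plus an RF badge sighting is enough.
print(grant_meeting_room_access(PresenceSignals(0.95, True, False)))  # True
```

Because every signal is collected and evaluated on-site, the decision never waits on a cloud round trip.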
While the general idea of Edge Computing may not be new, its applications have found new life with AI and 5G. Providing low-latency, meaningful data and compute to your physical locations will be a game changer for those who use this new tool correctly, and a great complement to your cloud solutions.