5 Tech Predictions for 2022
When something doubles in size, each leap is as big as every previous leap put together. And technology is catapulted along by laws of doubling, Moore's Law being only the most famous.
The result as these forces combine? Massive, sometimes shocking, changes, culminating in machines merging with humans as we hit Kurzweil's Singularity in the far-off future of 2045.
That's two decades away, but we're getting glimpses of what's coming. Here's what I expect to surface in 2022.
1. Real-time cybersecurity
The old model of security is failing. Just look at the frequency of hacks. Back in June 2021, LinkedIn hit the headlines for a data breach that affected 700 million users. More recently, data from 1.2 million GoDaddy WordPress customers was also exposed. And now we've heard that Panasonic has been hit by a data breach.
It’s a mess.
In 2022 we'll see the rise of real-time monitoring, detection and resolution. Traditional batch processing and reactive fraud-detection algorithms don't work in a continuously evolving, digital-native world. We need real-time cybersecurity.
Real-time cybersecurity builds on concepts such as SD-WAN, Zero Trust, and Next-Generation Firewalls with ongoing device inspection, deep packet inspection, and AI analysis of behaviour. Cyber criminals can't always be excluded at the perimeter, so real-time analytics ensures that even if there's a breach via a novel method, the trespassers can still be identified.
Intel has done a brilliant job of detailing this approach to security in a recent white paper.
So how can businesses get real-time ready when it comes to cybersecurity? One approach is to leverage Apache Kafka and its ecosystem, which provides low-latency performance at high throughput, in conjunction with data integration and data processing capabilities.
Every enterprise is different, and flexibility is key for your cybersecurity initiatives. As Kafka provides a flexible and scalable real-time capability for collecting this data, it's a no-brainer for most cybersecurity systems. By taking the situational awareness data and applying a signature, it can alert operators in real time, which in turn could prevent a major incident. Best of all, Kafka is a completely agnostic solution: it decouples the dependencies between architectural components, doesn't rely on any one vendor, and can be deployed across multiple cloud vendors.
Cybersecurity is the responsibility of everyone in the company, and while attacks are becoming more sophisticated, human error is the cause in as many as 88% of UK data breaches. So if any data is being handled manually by your business, it may be worth looking into Kafka-native tools such as Kafka Streams or ksqlDB, which can scan, process, filter and aggregate data in real time, at scale, and then feed it into Splunk or CyberArk for further alerting on any risks.
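To give a flavour of the pattern, here's a minimal Kafka Streams sketch that counts failed logins per source IP over a five-minute window and emits an alert once a threshold is crossed. The topic names, event format and threshold are illustrative assumptions, not a prescribed setup; a Kafka Connect sink would then forward the alerts topic on to Splunk.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.TimeWindows;

import java.time.Duration;
import java.util.Properties;

public class FailedLoginAlerts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "failed-login-alerts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Assumed input: auth events keyed by source IP, value describing the outcome.
        KStream<String, String> authEvents = builder.stream("auth-events");

        authEvents
            .filter((sourceIp, event) -> event.contains("LOGIN_FAILED")) // keep failures only
            .groupByKey()
            .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
            .count()
            .toStream()
            .filter((windowedIp, failures) -> failures >= 10)            // illustrative threshold
            .map((windowedIp, failures) -> KeyValue.pair(
                    windowedIp.key(),
                    "{\"ip\":\"" + windowedIp.key() + "\",\"failures\":" + failures + "}"))
            .to("security-alerts");                                      // Connect sink -> Splunk

        new KafkaStreams(builder.build(), props).start();
    }
}
```

The same windowed aggregation can be expressed in a few lines of ksqlDB if SQL is a better fit for your team.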
2. Computing at the Edge
Driverless cars can't wait for data to take a round trip to a data-centre. Even 5G latency will lead to pile-ups.
The vehicle needs on-board compute power to make decisions – an approach known as Edge Computing.
Whenever a remote device processes data on its own, that's Edge. For example, aircraft run onboard data-centres, since an internet connection can be a little glitchy at 40,000 feet. IoT devices, too, are ideal for Edge technology.
There are many reasons to install Edge computing power. The motivation may be latency, bandwidth, or the difficulty of connecting to the internet. Ericsson has put together a great whitepaper explaining the Edge landscape.
Edge computing will boom in 2022. We've already seen great examples, such as HS2 deploying intelligent devices that collect telemetry to monitor and protect against environmental changes, while alerting staff to potential structural changes for preventive maintenance. It's just the start.
The challenge is how to improve performance at the Edge.
Kafka is the perfect technology to deploy on these devices: it can act as an integration layer between legacy devices and double up as a storage medium for all the data captured.
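As a minimal sketch of what that looks like on the device itself, a producer can be tuned to batch, retry and buffer records in memory so a flaky uplink doesn't lose telemetry. The broker address, topic name and sensor call below are illustrative assumptions:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class EdgeTelemetryProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "edge-broker:9092"); // assumed address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringSerializer");

        // Tolerate a flaky uplink: keep retrying, batch aggressively,
        // and give the client a generous in-memory buffer.
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE);
        props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, 600_000); // 10 minutes
        props.put(ProducerConfig.LINGER_MS_CONFIG, 500);
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 64 * 1024 * 1024L);
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true); // no duplicates on retry

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            while (true) {
                String reading = readSensor(); // hypothetical device-specific call
                producer.send(new ProducerRecord<>("device-telemetry", "sensor-1", reading));
                Thread.sleep(1_000);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    private static String readSensor() {
        return "{\"temperature\": 21.5}"; // placeholder payload
    }
}
```

The idempotence and retry settings trade a little latency for the guarantee that intermittent connectivity doesn't drop or duplicate readings.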
Standardisation is key to successfully deploying a fleet of devices at the Edge. A single device without context and a known data model is useless, as the time it would take to extract meaningful value from the data is too great. Traditionally this was pretty much impossible: the cost of semiconductors and the range of chip architectures on such primitive devices made a scalable rollout economically unviable.
With the introduction of Kubernetes, and especially the lightweight K3s distribution, it's now possible to deploy a standard Platform as a Service on something like a Raspberry Pi. Using a repeatable deployment model like GitOps, we can install, configure and run Confluent for Kubernetes at scale across multiple devices, each with an optional link to the cloud to send meaningful updates if and when it has connectivity.
3. Carbon footprint calculators
According to HSBC's Made for the Future report, almost half of UK companies are planning to increase environment-related spending, with 69% focusing on making manufacturing more sustainable and 66% on improving internal practices.
It’s money well spent. It will soon be law to track emissions.
From 2023 onwards, UK-based companies over a certain size will have to report to HMRC and show how they are meeting the emission standards.
Unfortunately, carbon emissions reporting on public cloud infrastructure is very immature, so it will be up to businesses to innovate on how best to get the data they need.
There are open-source tools that can start businesses off in the right direction; Cloud Carbon Footprint and Carbon Analytics are among the best. At OSO we can work with you to build your own bespoke multi-cloud carbon calculator, just get in touch.
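A bespoke calculator usually starts from a simple model along the lines of the Cloud Carbon Footprint methodology: energy used by the workload, multiplied by the data centre's overhead (PUE) and the grid's carbon intensity. Here's a minimal sketch; every coefficient is an illustrative assumption that a real calculator would derive per instance family, region and grid:

```java
public class CloudCarbonEstimator {

    // Illustrative assumptions -- real calculators derive these per
    // instance family, cloud region, and grid, e.g. following the
    // Cloud Carbon Footprint methodology.
    static final double WATTS_PER_VCPU = 3.5;          // assumed average draw per vCPU
    static final double PUE = 1.2;                     // data-centre overhead factor
    static final double GRID_KG_CO2E_PER_KWH = 0.23;   // assumed grid carbon intensity

    /** Estimate emissions in kg CO2e for a compute workload. */
    static double estimateKgCo2e(int vcpus, double hours) {
        double kwh = (vcpus * WATTS_PER_VCPU * hours) / 1000.0; // energy at the server
        return kwh * PUE * GRID_KG_CO2E_PER_KWH;                // add overhead, convert to CO2e
    }

    public static void main(String[] args) {
        // e.g. a 4-vCPU instance running flat out for a 730-hour month
        System.out.printf("%.2f kg CO2e%n", estimateKgCo2e(4, 730));
    }
}
```

Swap in published coefficients for your cloud region and the output becomes good enough to compare workloads and spot the big emitters.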
With accurate data you can make decisions on how to lower your carbon footprint. For example, there are over 60 data centres in Scandinavia that are sustainable, extremely resilient, and powered by renewable energy. When it comes to lowering your IT infrastructure's carbon footprint, companies may want to explore running their services in these data centres rather than defaulting to AWS, Azure or GCP.
Ovo Energy suggests that if each person in the UK sent one fewer email a day it could cut carbon output by more than 16,000 tonnes a year. Worth it? We'll only know when we have the right data.
4. Multi-cloud deployments
No one wants to get marooned on a single cloud. Not when it's possible to spread the load across multiple clouds and let the vendors haggle for your custom.
This HashiCorp whitepaper offers a detailed look at the benefits of a multi-cloud approach and why it will become much more prevalent in the future.
Essentially, a multi-cloud model lowers risk and reduces cost.
Segregating sensitive workloads across separate public cloud providers offers enhanced security, as decentralised systems are, by design, harder to hack. Having the flexibility to run your workloads in a self-contained way makes infrastructure a commodity, which in turn opens up opportunities to leverage under-utilised resources and optimise operations further. And if regulation changes in one region, you have the ability to continue business as usual in another, de-risking your overall business.
Containerising workloads is mandatory if you want to transition seamlessly across platforms, with Kubernetes as the orchestration layer. Following a purely declarative approach using GitOps and products like Weaveworks Flux paves the way for repeatable deployments at scale in a controlled and audited manner.
A word of warning, however: adopting multi-cloud is not a decision that should be taken lightly. It requires careful planning and, if not done correctly, can have the adverse effect of increasing both costs and attack vectors. That's why decision-makers should come together to align on their cloud strategy before execution.
Highly regulated industries, such as banking and healthcare, would benefit the most from adopting a multi-cloud deployment model, as the structure would lend itself well to the complex needs of these industries.
5. Data Mesh
The term data mesh was coined by Zhamak Dehghani, Director of Emerging Technologies at Thoughtworks. Discover the full details on Data Mesh here.
Zhamak defines the data mesh as “a socio-technical shift - a new approach in how we collect, manage, and share analytical data.” It's not a technology. It's a philosophy. Data is shifted away from a central lake to the teams that consume it. It's also about handling data at scale across the enterprise.
One way to build a Data Mesh is to develop a Data Domain Blueprint. This can help standardise things like schema design, data governance and interoperability. For it to work effectively, businesses must have a unit of architecture that encapsulates domain-oriented data over long retention periods, plus the code in place to look after that data.
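What might that unit of architecture look like in code? As a rough sketch (the field names and rules here are assumptions for illustration, not a standard), each domain team could publish a small descriptor for every data product it owns, which the blueprint validates centrally for governance and interoperability:

```java
import java.time.Duration;
import java.util.List;

/** Illustrative descriptor a domain team publishes for each data product it owns. */
public record DataProductDescriptor(
        String domain,            // owning business domain, e.g. "payments"
        String name,              // product name, e.g. "settled-transactions"
        String ownerTeam,         // team accountable for the data
        int schemaVersion,        // version of the registered schema
        Duration retention,       // how long the data is kept
        List<String> outputTopics // where consumers can read it
) {
    /** Minimal governance checks the blueprint might enforce centrally. */
    public void validate() {
        if (domain.isBlank() || ownerTeam.isBlank()) {
            throw new IllegalArgumentException("Every data product needs an owner");
        }
        if (retention.compareTo(Duration.ofDays(1)) < 0) {
            throw new IllegalArgumentException("Retention too short for analytical use");
        }
    }
}
```

Because the descriptor travels with the data product, consumers in other domains can discover ownership, schema version and retention without going through a central team.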
Teams need to be able to use data in their ordinary jobs. For example, no-code/low-code platforms give front-line teams the ability to change the UX or product mechanics without consulting an IT team. With easy-to-access data, their roles are transformed from passive observers to creators.
Data Mesh is getting serious traction, and it's easy to see why. Retail is built on data. Banks want to use data to hyper-personalise loans and insurance, and do it in real time. All digital ads are based on data. Every sector can benefit. Yet organisations too often lock data in a central silo, unable to offer it to the teams who need and can utilise it most.
Switching to a Data Mesh will change the way data is used in organisations. One to watch in 2022.
Excited for the year ahead? Or need help making these kinds of connected experiences a reality for your business? Get in touch today.