To be greener, the cloud needs to move to the edge
Telcos can enable sustainable cloud through decentralised compute.
Traditional operational engineering tends to concentrate activities into a few large locations. The operating principle is that by physically centralising production activities and the supporting resources, we maximise control, efficiency, resilience and automation, and thereby achieve economies of scale.
Energy generation is an example of this. For years, policy makers and their advisors argued that renewable energy could never be cost-competitive, partly because renewable facilities were deemed sub-scale and too distributed, making them expensive to install, connect, operate and maintain. And yet enlightened leadership, determination and human ingenuity have proved otherwise. Much to the sceptics' surprise, renewables now provide energy at much lower cost than the centralised alternatives, including traditional nuclear power plants.
Streamed media delivery has also become increasingly dispersed. Netflix originally distributed its video streaming from servers in a few centralised locations (and from similarly concentrated cloud services). It now operates a content distribution network comprising over 16,000 servers distributed deep inside ISPs' networks. Other examples of this trend to decentralisation abound: additive manufacturing (3D printing), military operations and, more recently, hybrid working.
True, doing things centrally makes it easier to physically marshal resources and control operating environments. But it also creates vulnerabilities (attack points and single points of failure), transportation overheads and delivery delays. And then there are the inefficiencies that arise from market concentration.
New technologies, particularly information and communications technologies, are turning this thinking on its head. We can now achieve many of the benefits of concentrated operations without the downsides. This trend to decentralisation, enabled by new technology, is delivering key benefits.
Edge computing is both a key enabler of decentralisation and itself an illustration of it: cloud compute is decentralising from vast datacentres to much smaller 'cloudlets'. Decentralisation creates demand for edge compute because distributed activities need distributed data processing. Compute is itself a production activity that was previously physically concentrated. Telecoms operators are ideally placed to meet these needs, and also happen to make excellent anchor tenants with their own, increasingly cloud-native, network workloads.
To make cloud compute more sustainable, we need to re-frame how we think about it. The challenge is that when approaching sustainability, everyone always starts with the datacentre, not the compute.
Today's mammoth datacentres are becoming more energy efficient, but only in a narrow sense of the term: they are getting better at disposing of the waste heat that their servers generate. On top of the compute energy, a datacentre typically needs an additional 30-50% of energy, mostly for cooling, to dump that heat. In other words, for every useful unit of energy consumed, it uses between 1.3 and 1.5 units of total energy. This ratio is the PUE (Power Usage Effectiveness), the metric that datacentre and cloud operators typically use to measure energy performance and demonstrate their sustainability credentials.
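The arithmetic above can be sketched in a few lines of Python (a minimal illustration; the function name and figures are for demonstration only, not from any operator's reported data):

```python
def pue(it_energy_kwh: float, overhead_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.

    A PUE of 1.0 would mean every unit of energy goes to useful compute;
    anything above 1.0 is overhead (cooling, power distribution, etc.).
    """
    return (it_energy_kwh + overhead_energy_kwh) / it_energy_kwh

# One unit of compute energy plus 30-50% overhead:
print(pue(1.0, 0.3))  # → 1.3
print(pue(1.0, 0.5))  # → 1.5
```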
But generating waste heat is… err… wasteful. Re-use is… well… useful.
This distinction can be emphasised by adopting, and setting targets for, a measure such as ERE (Energy Reuse Effectiveness). ERE is calculated like PUE but subtracts the energy that is re-used elsewhere, so it rewards re-use directly: with no re-use the two metrics are equal, and the more waste heat is put to work, the lower the ERE falls. Whereas datacentres struggle to achieve a PUE below 1.3, EREs of 0.3 have already been achieved.
Source: The Green Grid
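To make the relationship between the two metrics concrete, here is a minimal sketch (function name and figures are illustrative, chosen to reproduce the PUE and ERE values cited above):

```python
def ere(it_energy: float, overhead_energy: float, reused_energy: float) -> float:
    """Energy Reuse Effectiveness:
    (total facility energy - energy re-used elsewhere) / IT energy.

    With zero re-use, ERE equals PUE; re-use can push it below 1.0.
    """
    total = it_energy + overhead_energy
    return (total - reused_energy) / it_energy

# No re-use: ERE is just the PUE.
print(round(ere(1.0, 0.3, 0.0), 2))  # → 1.3
# Re-using one unit of waste heat (e.g. for domestic or district heating):
print(round(ere(1.0, 0.3, 1.0), 2))  # → 0.3
```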
Energy security and accelerating national net-zero goals mean that PUE (and perhaps soon ERE) will come under scrutiny and then regulation; this has already started in some countries (Singapore, Norway). When households and businesses cannot afford to pay their energy bills, wasting heat in vast quantities will attract legislators' attention. By re-using waste heat, some (mainly edge) datacentres will be able to achieve EREs well below 1.0. Smaller, edge datacentres are better placed to achieve low EREs because they sit close to potential demand for waste heat; re-using the waste heat from a typical, remote datacentre is inherently challenging.
We can re-frame this challenge as an opportunity: how could compute evolve to support society's need for heat? Some innovators (e.g. Heata.co, Qarnot, Leafcloud) distribute workloads in a highly dispersed way, so that waste heat from cloud compute is generated literally right next to where it can be readily re-used.
Will today's massive datacentres suffer the same fate and become stranded assets: dinosaurs from a bygone era, replaced by smaller, nimbler, more efficient species better able to adapt to the challenges of a new, tougher climate? This may not appear to be an immediate threat, but given the right conditions, things may change quickly for the cloud. Only a few years ago, it was unthinkable that countries would legislate for the demise of the internal combustion engine to meet their net-zero goals. In 2021, the UK announced that no new fossil-fuelled vehicles would be sold from 2030. Other countries are following this lead.
Telecoms operators should see this as a big opportunity, and astute investors should take note. Compare Tesla's PE ratio to VW's (even following Musk's recent antics) to understand the risks of assuming that the current approach to cloud infrastructure is sustainable, financially or environmentally.