Technology - Jevons & Kardashev
Liam Holohan
CTO - Deep Generalist - Experienced IT Leader - Transformation - Strategy - Product - Innovation - Founder. Driving impact through innovative and industry defining tech
A recent comment by an AI company CEO at the World Economic Forum in Davos got me thinking about technological efficiency, waste and sustainability. Declaring that an energy generation breakthrough (nuclear fusion) is required for the future of AI made me chuckle. If ever there was an admission of the waste and inefficiencies within technology, this was it. Blaming science (Physics) for the current limitations of engineering (Technology) seems pointless.
Any power generation breakthrough with nuclear fusion, while monumental for the world, will only solve a symptom of technology inefficiency, not the cause. I would have hoped that at the world's premier economic meetup, there would have been an economist present who could have explained the implications of energy usage in technology and Jevons paradox.
Increased efficiency in a process or service (intended to save resources) leads to cost reductions in supplying that process. This brings increased usage of the service, leading to increased use of the original resource - Jevons paradox
Improved efficiency alone does not decrease resource consumption; it actually increases it. As someone once said about cars and climate change:
It's not the miles per gallon, it's the gallons
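Jevons' mechanism can be sketched as a toy model (my own illustration, with an assumed constant-elasticity demand curve; none of these numbers come from the article): when demand elasticity is above 1, the price fall from an efficiency gain raises usage by more than the gain saves.

```python
def resource_use(efficiency, base_demand=100.0, elasticity=1.5):
    """Toy constant-elasticity model of Jevons paradox.

    efficiency: units of service delivered per unit of resource.
    elasticity: price elasticity of demand (assumed > 1, i.e. 'elastic').
    Returns total resource consumed to serve the resulting demand.
    """
    price = 1.0 / efficiency                       # effective price per unit of service
    demand = base_demand * price ** (-elasticity)  # demand rises as price falls
    return demand / efficiency                     # resource needed to meet that demand

before = resource_use(efficiency=1.0)  # baseline: 100 units of resource
after = resource_use(efficiency=2.0)   # the process becomes twice as efficient
print(f"{before:.0f} -> {after:.0f}")  # prints: 100 -> 141
```

Doubling efficiency here raises total resource use from 100 to about 141 units; with elasticity below 1 the paradox disappears, which is the crux of the empirical debate.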
Extreme efficiency alone will not save us; we have to re-evaluate how we interact with, architect and use systems. I hope to highlight the resource utilization implications of our decisions and consumption. We now have a few planetary-scale platforms, and their use has planet-wide consequences; I will examine several of these technologies and the implications of our usage.
This is not an article lamenting the loss of a simpler time, nor a luddite manifesto. Technology has the capacity to greatly improve all our lives, and long may its benefits continue. I just want to highlight the "hidden" cost of our actions when technology is misused or used without thought for the consequences (both as individuals and as organisations). I hope to frame the delivery and current use of what are, in effect, planet-scale platforms as a sustainability conversation.
Efficiency will eliminate current waste but is silent on the drivers that produce that waste in the first place (thanks, Jevons). This applies both to us as individuals and to the behaviour of corporates. A few quick back-of-an-envelope calculations opened my eyes to just how dumb things are. This has implications for many of the products and services we take for granted.
Waves of waste
The internet turbocharged the network effect in many areas, including waste and our capacity for gratuitous usage. We can put this evolution on a timeline and examine the implications of these waves of waste and how technology has contributed to them.
All of the providers of the following planetary-scale services could do more to reduce resource consumption at source and so avoid planet-level problems. Any platform that obfuscates or abstracts cost and consequence away from its userbase produces prodigious waste and an accumulation of rubbish in the delivery of its service; it destroys self-curation and efficiency, leading to endemic waste. We are now all undisciplined producers with no restraints. To give a flavour of the enormous current resource consumption, we will look at the following technical domains:
Social media (the input)
At the turn of the century, we all started getting hooked on social media for personal and business reasons. Initially it was predominantly text based (USENET etc.) but platforms soon added image and video posting. Social media is a content distribution industry that monetizes eyeballs (or surveillance capitalism, if you want to be cynical). The easiest thing to investigate is the generation and storage of the images and videos fed into these platforms (to engage those eyeballs) and the resource consumption implications of that content. As any marketer will tell you about audience engagement:
A picture is worth a thousand words - various
But from a resource consumption perspective:
A picture costs a million words - Author
Words cost bytes of storage; images consume megabytes (1 MB = 1,048,576 bytes). As phone cameras have improved, we have gone from 1-2MB per photo to about ten times that. That is a lot of space per selfie or cat photo. The raw input content for social media platforms is growing at source in concert with improved devices, and even if an image is downsized on upload, the downsizing itself consumes compute. The resource consumption, however, does not end at upload.
Taking a selfie or a sunrise photo is not a single event. The image is backed up (usually many times, multiplying its storage footprint), distributed geographically for delivery to followers/eyeballs, and siphoned off for processing (to sell more stuff and generate clickbait for those eyeballs). Every picture you "upload" is replicated repeatedly and stored, usually forever. The original creator may delete it, but that has no bearing on copies distributed during its lifetime or on its continual processing.
Videos impose a huge storage demand (and are even more resource hungry). Suffice to say video-sharing platforms are endlessly increasing the length (and so size) of their allowable uploads, making matters worse. While social media platforms are usually "free" at the point of use (your eyeballs are the product they sell, not the content you generate), this does not mean they are without cost for the platform provider, or the planet, in terms of disk, compute, electricity (mainly coal, gas or diesel generated) and cooling.
Service providers will provision resources to meet the demand for this data landfill. As a "free" service, you will never see the cost. Any self-censorship and discipline prior to posting tackles resource demand at source. Don't post for posting's sake (adding to the pile of rubbish) for your 150 followers[1]; wait for something important or of value to share. Remember, with your social media activity, the storage hierarchy climbs from text to images to video (bytes, megabytes and gigabytes).
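The bytes/megabytes/gigabytes hierarchy can be made concrete with some back-of-an-envelope arithmetic (the per-post sizes and the replication factor of six below are illustrative assumptions, not platform figures):

```python
# Illustrative per-post sizes; real platforms and devices vary widely.
POST_BYTES = {
    "text":  280,             # a short text post
    "image": 5 * 1024**2,     # ~5 MB phone photo
    "video": 1 * 1024**3,     # ~1 GB video clip
}
REPLICAS = 6  # assumed copies: backups, geographic distribution, processing

for kind, size in POST_BYTES.items():
    stored = size * REPLICAS
    print(f"{kind:5s}: {size:>13,} bytes uploaded -> {stored:>13,} bytes stored")

# One ~5 MB photo carries as many bytes as ~18,000 short text posts --
# the arithmetic behind "a picture costs a million words".
```

Each step up the hierarchy is three or four orders of magnitude, before replication multiplies it again.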
Big Data (the pit)
Rubbish expands to fill available space - Parkinson's law applied to data
The uncontrolled accumulation of rubbish within social media platforms applies even more so in the business world; this data wasteland is now called big data. The abstraction of cost from action and the disconnect between producer behaviour and resource consumption that occur in the retail world exist in a business setting too, only more so.
Adding to this are the fear of competitive disadvantage (data poverty) and regulatory compliance. Big data for corporates has thus become industrial-scale data landfill. Corporates have systems that produce content programmatically, far faster than a person with a smartphone, so the generation of corporate big data is limited not by people, as a social media userbase is, but by internal systems.
Note that it is called big data, not "smart data", "clever data" or "fast data". In its raw state it is not even considered knowledge or information. The vast majority of corporate data is created, used once, stored and forgotten about (and backed up repeatedly in perpetuity).
In the past, the link between action and cost was enforced via quotas on on-prem storage devices. Cost is now only a macro factor (we wait for a CFO to see a bill after the damage is done and then instil some discipline into OPEX). Under the hard quotas of the past, a capacity-management function mandated regular pruning as usage approached quota. This directly reduced costs, as we lived within our finite means. Nowadays we belatedly rely on FinOps to dig us out of the resource (financial) hole we have created.
Our behaviour is based on the incorrect assumption of resource abundance. Every document now goes through multiple versions (implying multiple backups) stored in the cloud before the final copy. Drafts are blindly backed up regularly and the art of data pruning has been lost. We architect systems to appease a fixation on infinite scalability that is rarely required. For databases and log files, we create daily (redundant) backups stored in other cloud availability zones, to appease regulators or to preserve some as-yet-undetermined competitive advantage.
A fear of regulatory infractions has made us lose sight of the context and reasons for backups in the first place. Any data retention strategy should be informed by the actual regulatory requirements and the real business value of the data, not by fear alone.
In general, the wasteful and frivolous resource consumption within the retail/social media world is magnified and industrialized in a corporate setting. With cloud-based resources we have all become producers, but not yet grasped that we also now have procurement responsibilities. We operate under the illusion of infinite technical resources. Remember that:
Perfection is attained not when there is nothing more to add, but when there is nothing more to remove - Antoine de Saint Exupéry
IoT (the output)
We are rushing headlong into the "smart-ification" of devices (homes, cars, lights, toasters, fridges, watches etc.) and a general explosion of devices that constantly measure. Welcome to the world of IoT (the Internet of Things). While there are valid reasons for doing this (possibly to eliminate waste and increase efficiency), I wonder about the measurement and data-processing overhead required to achieve this extra control.
Before we look at the data created, it is worth remembering that many of these devices in their "pre-smart" days already had a control/feedback system in place. It could be an open or closed feedback loop, typically something as simple as a duty cycle managing the process (think heating or cooling systems with temperature sensors). The system was self-contained, operating within its feedback loop, with no data-export requirements.
When writing a control system, the sampling rate and feedback mechanism are key. Whatever you are controlling has a bandwidth, and sampling is how often you measure it in order to control the process. There is a large body of study in this area, and perhaps the most fundamental result is Shannon's sampling theorem from 1949. Put simply, sampling at twice the bandwidth of the signal you are monitoring is sufficient to completely reconstruct (and so control) it; sampling any faster adds nothing.
A fridge will increase / decrease its temperature on a timescale of minutes (that's just physics and is its bandwidth). If you want hourly control of your fridge temperature you only need to make a measurement every 30 minutes, any more frequent sampling is wasteful by definition.
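Shannon's criterion translates directly into a minimum sampling interval. A sketch of the fridge arithmetic above (the function name is mine):

```python
def min_sampling_interval(control_period_seconds):
    """Shannon/Nyquist rule of thumb: to control a process on a given
    timescale you need at most two samples per period of that timescale;
    anything faster is wasted measurement."""
    return control_period_seconds / 2.0

# Hourly control of a fridge temperature:
interval = min_sampling_interval(60 * 60)
print(f"sample every {interval / 60:.0f} minutes")  # prints: sample every 30 minutes
```

An IoT fridge reporting every second is sampling 1,800 times faster than hourly control requires.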
IoT is wasteful by design in many ways.
I believe that IoT will place more storage load on internet infrastructure than social media has to date. This is particularly true when the "compute" on IoT data does not happen on the edge device itself. If the device is not doing the analysis, the data must be transferred (with all the backups that go along with that), computed on centrally, and finally acted upon to close the feedback loop.
Remember, even if your appliance is only generating a byte of data every second, there are 31,536,000 seconds in a year (and a fridge lasts about 12 years).
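Running those numbers, plus an assumed replication factor of six once the readings land in the cloud (the replicas figure is my illustrative guess):

```python
SECONDS_PER_YEAR = 365 * 24 * 60 * 60   # 31,536,000
BYTES_PER_SECOND = 1                    # one byte per reading, once a second
LIFETIME_YEARS = 12                     # typical fridge lifespan
REPLICAS = 6                            # assumed backups and distributed copies

raw = BYTES_PER_SECOND * SECONDS_PER_YEAR * LIFETIME_YEARS
stored = raw * REPLICAS
print(f"{raw / 1024**2:.0f} MB generated, {stored / 1024**2:.0f} MB stored per fridge")
# prints: 361 MB generated, 2166 MB stored per fridge
```

A third of a gigabyte per appliance looks harmless until it is multiplied by billions of devices, and by sampling rates far above the Nyquist minimum.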
Blockchain (the calculator)
I don't think I should say much about blockchain as its inefficiency as a datastore is well known. Suffice to say if there are multiple copies of a blockchain (distributed ledger) for participants (to implement trustless-ness via consensus) this is data replication on a massive scale. The act of mining is computationally expensive by design (asymmetrical functions).
In fact, with cryptocurrencies, the entire system would be meaningless if the creation of currency did not require the expenditure of a lot of computational effort (think energy), as the currency would otherwise not be a store of value. From a waste point of view, though, we have a massively redundant data store that is computationally expensive to write to. This should be remembered before using a large blockchain as a platform holding low-value assets for a wide audience.
AI (triple inefficiency & waste)
Yes, we are all amazed by AI at present. While it seems to be doing many wondrous things, I looked at how terribly inefficient and dumb it actually is in its current implementation. AI at present is the triumph of data over algorithm (because that is easier), made possible by the apparent abundance of cheap cloud-based compute and storage rather than by any deep AI magic or fundamental changes to mathematics.
To make matters worse, most of this data is processed numerically, not analytically. Anything solved analytically has its intelligence built a priori into the equations (formulae) that describe it in exact form; it is the acme of efficiency and always holds true. A numerical solution is an ever-changing guess based on statistics that must be fed data to improve its accuracy.
A picture may be worth a thousand words, a formula is worth a thousand pictures - Edsger Dijkstra
The triple inefficiency of AI comes about because of:
Training - The training of AI systems (LLMs being the technical plat du jour) is a compute-intensive operation needing a feedstock of prodigious amounts of data (which must be stored and transformed). Training ChatGPT is estimated to have produced about 500 tonnes of CO2, with similar-sized emissions for other AI platforms.
Operation - Once trained, general public AI operates for all of us to interact with via prompts. ChatGPT quotes 175,000,000,000 parameters in its LLM; comparing this with the 8,089,000,000 people in the world (early February 2024) smells of massive inefficiency, brute force and limited "intelligence". What are we really getting here?
Current estimates are that ChatGPT's infrastructure runs on 285,000 processor cores and 10,000 graphics cards. That is gigawatt-scale electricity consumption (even being kind and assuming a PUE of 1 in the datacentres). For comparison, the human brain operates on ~10-20W. On raw power alone, the general public AI systems we are agog about are tens of millions of times more energy inefficient than the human brain. Nuclear fusion is the last thing we need to address this.
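Taking the gigawatt figure and a ~20 W brain at face value (both are rough, contested estimates, and this ignores training energy and per-query accounting entirely), the raw power ratio is:

```python
platform_watts = 1e9   # assumed gigawatt-scale draw for a public AI platform
brain_watts = 20       # upper end of the human brain's estimated power budget

ratio = platform_watts / brain_watts
print(f"~{ratio:,.0f} brains' worth of power")  # prints: ~50,000,000 brains' worth of power
```

Even this crude comparison, before counting the energy sunk into training, shows how far current AI is from biological efficiency.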
The reason Apollo 11 got to the moon wasn't the ~4KB of RAM and ~72KB of read-only core rope memory in the Apollo Guidance Computer (drawing 55 watts, with no hard disk at all), but the three low-power (~10-20W) exascale supercomputers on board called Neil, Buzz and Michael.
Output - We have now seen the huge energy inputs required to build AI models and operate them once trained. Hopefully AI's applications will address some of the inefficiency and waste problems besetting the world. So far, though, most interaction by the general public with generative AI has been to add even more rubbish to our data landfill, or to give sales and marketing more eyeball-catching content.
AI generation is not limited to human keystroke speeds, so the scale of generative AI has serious implications for storage demand. We can now fill available space with rubbish on a massive & rapid scale. AI posted to social media with text, images and videos will of course go on to be endlessly backed up too. Generative AI has turbocharged the wasteful resource consumption practices of Social Media as its first "killer app".
Cloud hosting (the enabler)
The Cloud is not some abstract, nebulous thing. It is simply a large-scale deployment of energy-hungry datacentres across the world. The electricity, cooling and latency (location) implications of datacentre deployment have not changed in any fundamental way by calling it "The Cloud"; nor have its sustainability implications.
While they enable the use of technology at massive scale, datacentres also compete locally for electricity, water (and diesel) to operate; these are not infinite resources. Public cloud has lowered the technical barriers to entry for platforms, including allowing you to make bad technical choices at the click of a button.
Cloud has abstracted all the tin, wires, electricity and water implications away from technical architects, but it has also obfuscated the cost and resource appetite of our decisions. Technologists are familiar with the Infrastructure as a Service (IaaS), Platform as a Service (PaaS) etc. offerings of cloud, but we should remember that our behaviour has industrialized bad habits and sloppy decision making. Our inefficient and wasteful platforms can now hide behind the infinite-resource promise of hyper-scalers.
On the flip side of cloud, we should consider what it allows us to create in its "as a service" promise. Are we delivering Laziness as a Service? Waste as a Service? Pollution as a Service? Energy Consumption as a Service?
Dark hand of Kardashev
If you gave any physicist three wishes, they would only need one: "infinite joules". All of the above discussion of technology waste comes back to our use of energy (joules), be it solar, fossil (old solar), atomic (pre-solar) or even the CEO's wished-for fusion power. If we are happy with the current delivery of technology and see it only as an energy problem, why stop at demanding nuclear fusion? We could leap straight to demanding Dyson spheres and progress along the Kardashev scale.
The Kardashev scale is a method of measuring a civilization's level of technological advancement based on the amount of energy it is capable of using. (we are not even a Type 1 civilization yet).
I think this scale ignores improvements in energy utilization and waste reduction as a measure of civilization advancement. Generating a joule of energy via solar power seems to me a mark of greater advancement than generating the equivalent joule from oil. Perhaps we need to redefine Kardashev's scale in terms of an efficiency rating of energy utilization?
In Summary
The planetary-scale systems described above can be inefficient by design and have allowed wasteful practices to be normalized. Tackling current inefficiency will not reduce future waste on its own; only changing our interaction with these systems can. Planetary-scale systems magnify our behaviours on a planetary scale. Remember, as Uncle Ben said to Peter Parker:
With great power comes great responsibility - Stan Lee, Amazing Fantasy #15 (1962)
Or as technologists & consumers we should heed what Ian Malcolm said to John Hammond when making decisions as we interact with technology:
Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should. - Dr Ian Malcolm, Jurassic Park (1993)
It is estimated 50g of carbon was expended in the creation of this LinkedIn article and I will refrain from posting further until there is something of value to share.
[1] - The average number of followers per user on social media platforms is 150-500. It is a winner-takes-all paradigm in which a small number of users hold an enormous share of followers while the majority have only a few. That does not mean the majority post less than the "influencers".