When Competition is Replaced With Care
Gerd Altmann: https://shorturl.at/aGQbA


I was riding the train home from Boston to New Jersey after a work trip last Friday and had a friendly chat with the guy across from me. I asked what he did for work; he said he worked for Amazon, and we discussed GenAI. When I mentioned concerns about hallucinations, IP issues, water and energy usage, and so on, he responded:

"But we have to keep up with OpenAI to compete with them to stay in the game."

The Comfort of Care

(Author's note)

I wrote the rest of the article below in a spirit of frustration and anger.

(From "The Clamor for Competition onwards).

I'm keeping that content as I hope it serves a purpose. Maybe there are some stats or insights that could be useful.

But I think I buried the lede, which is why I added this foreword. The point is this:

It's easier to be competitive than to give care.

For me, at least. But I think it's fairly common.

Giving care to oneself: mental health, spiritual health. Challenging.

Giving selfless care to others - volunteering, tithing, asking for forgiveness:

Very challenging.

Determining how global metrics of caregiving could replace deep paradigms of power and competition:

Almost impossible to imagine. (And thank goodness we have Riane Eisler, Kate Raworth and many others showing pragmatic visions of care for people and planet.)

So I'm adding this self-aware introduction because, for me, a spirit of anger and frustration isn't innovative. It isn't fair or true or helpful to be angry at others or "society" and insist that someone else should be innovative while I get to say the same things in the same way I've said and written them for years.

So for today - Sunday, August 4th, 2024 - for whoever reads this, I want to point out that I actually dream of a form of care that can seep into the pores of all people to change their hearts and guide new and positive actions.

Where I demonstrate my concern for water, I should note I consider water to have a spiritual element beyond the fact that it makes up a good portion of every human's physiology.

So where water usage is unknown, or where water is being depleted, I feel a spiritual pain I'm not sure how best to express in ways the modern world can understand.

Especially when I or anyone calls water a "resource." When basically, it's "life."

I've been struggling a lot as of late, because I recognize that in my anger and frustration over aspects of GenAI and economics, I am not providing a spirit of joy or anticipation or possibility.

So for any of that mottled mess you see below or in general - I ask forgiveness.

From who you may ask?

If I say, "God" will you keep reading? Your call.

But let me clarify.

For me, "God" is a word representing the portion of my heart, and inner self (which I'll call 'soul') where I experience genuine stillness and peace.

And I believe this peace comes from a source other than myself. "God" is the shorthand I use.

As an example of this stillness and the presence of God (as I interpret it for my life): I ordered a new battery for my iPhone because I left my other one at a friend's house. I ordered it from Amazon, which I recognize as ironic, given the story I've shared below and supply chain and sustainability issues in general.

As I've also mentioned, I struggle with hypocrisy and how to live in and not of the world.

Nonetheless.

As I was walking to a neighbor's house where my package had been delivered by mistake, I rounded the corner of my building and was arrested in place by the movement of wings.

It was a monarch butterfly.

She was fluttering about before alighting on a bush where she was apparently warming her wings in the morning sun.

Letting the water on her wings return as vapor to help the rest of the world.

And I couldn't move. My thoughts cleared.

I was simply in the presence of nature. And I think this may be "God."

And for a moment, from someone who is very aware of his awareness and who thinks most or all of the time about depths of thinking and ethics and thoughts, I was able to bask in the beauty of those wings, and the sun's warmth.

And. It. Was. Good.

Philippians 4:7 in Christian scripture talks about a "peace that passes all understanding."

As of late, having been immersed in the study of Artificial Intelligence ethics, personal data, augmented reality, and the patterns surrounding media, economics, and business for almost twenty years, what gets my attention most of all these days is -

Nature.

Kindness.

Humility.

And care.

In that mix comes the peace. And as a thinking thinker's thinker, I LOVE that it's a "peace that passes all understanding."

Because understanding can be overrated if it keeps you from love stemming from awe.

What's your butterfly? Where do you find peace? And do you call it (He/She/They) "God?"

And when you experience God, what happens?

I'm asking because I'd like to know.

I am quite tired of trying to be right. Proving my point, quoting worldly sources to gain credibility.

I'd much rather rest in the presence of Nature and invite others to do the same.

This seems like a true route to peace. And a natural way to care for nature, because where one identifies true peace, one doesn't use words like "resource" to define something so precious and valuable.

Okay.

I hope the rest of the article is useful and I've shown some level of innovation in adding this preface.

And whoever read this - I hope you find care for yourself on this Sunday (or whatever day it is when you read it). And I hope it inspires you to care for someone else.

Maybe even a butterfly.

The Clamor for Competition

This made me sad on a number of levels, but the main weariness came from wondering what exactly this kind and seemingly very intelligent person hoped to achieve by emulating OpenAI. The easy answer is profit, market share, and bragging rights. But as Gary Marcus has pointed out numerous times in the past few years, there is much more to worry about than to laud regarding Sam Altman and OpenAI as a business model to emulate.

Here's a quote from a Guardian article Gary wrote yesterday about a key aspect of OpenAI I think about a lot as of late:

Furthermore, generative AI, made popular by OpenAI, is having a massive environmental impact, measured in terms of electricity usage, emissions and water usage. As Bloomberg recently put it: “AI is already wreaking havoc on global power systems.” That impact could grow, perhaps considerably, as models themselves get larger (the goal of all the bigger players). To a large extent, governments are going on Altman’s say-so that AI will pay off in the end (it certainly has not so far), justifying the environmental costs.

The last sentence of this quote contains a number of concerning assumptions:

  • Governments are trusting a person whom his own former Board Members don't trust.
  • "AI will pay off in the end" is so vague as to be its own form of existential threat. What type of "AI?" Just GenAI? "Pay off" in monetary amounts for a tiny group of shareholders or is this intended to be a typical techno-utopian notion that "all will benefit?" What is the timing and description of the 'end' in this statement?
  • Who gets to say what environmental costs are justified? As someone like myself, or the amazing Masheika Allgood, JD, LL.M, knows from doing any research regarding water and data centers, the primary conclusion one can make with assurance is that there is no accurate data about how much water GenAI actually uses if one goes to reports from the data centers themselves. A 2022 Uptime Institute survey of over eight hundred data centers noted that only 39% reported their water use. That number increased by a small percentage in 2023, but it's critical to note that "environmental costs" today could therefore potentially include less than half of the actual water used, coming only from those data centers that even report their water usage. This is critical to keep in mind when reading environmental reports from organizations like Google, whose 2024 Environmental Report notes that "Our water stewardship projects replenished approximately 18% of our freshwater consumption from our data centers and offices—tripling our replenishment progress of 6% in 2022." This is the same report The Guardian covered in early July in its article, Google's emissions climb nearly 50% in five years due to energy demand. Here's a quote from the article:

The tech company, which has invested substantially in AI, said its “extremely ambitious” goal of reaching net zero emissions by 2030 “won’t be easy”. It said “significant uncertainty” around reaching the target included “the uncertainty around the future environmental impact of AI, which is complex and difficult to predict”.

So in this example, Google can't say with assurance that "AI will pay off in the end" regarding the environmental impact of AI. And to all the voices and organizations saying AI will help the environment more than hurt it "in the long run" or "in the future": remember water.

Remember that less than half of data centers powering AI report their water usage.

At least to the general public. One can assume with assurance that Google and Microsoft know how much water and energy they're consuming.

But rather than sounding condemnatory toward these companies or data centers, my ultimate goal in this article is to imagine what would happen if the metrics and incentives of competition were replaced with care.

Sinnovation

"Don't hinder innovation." The cultish phrase used in any business or policy meeting to actually, in fact, keep new ideas from happening regarding moving society forward.

Do we actually believe ideas around avoiding "too much regulation" for companies are unique?

This phrase means: "Keep things the way they are in our markets and measurements."

Competition.

There are only two types of competition I enjoy. One is athletic, where sportsmanship and collegial effort on a playing field let people whose joy comes from their sport play it out in public. Where, essentially, without the constraints of the industry around sports, sports are a form of play.

The second is things like a "battle of the bands," where the play is focused on creating music. Maybe winning a free meal.

Beyond that, by definition, competition in business means not only that certain businesses fail because people don't want to buy their products or services, but that an entire industry or society is prioritizing the wrong things as success metrics.

For instance, the justification of GenAI's massive growth while the one thing that is certain about its water usage is that less than half of data centers even report it.

This means the phrase "don't hinder innovation" also means, "we cannot wait to move forward, even knowing we are massively harming earth's ecosystems, biodiversity, and most people today and for the next few years."

Because of competition.

This is "sinnovation." While I recognize there are many positive uses of AI (beyond GenAI in isolation) that are helping the environment, what scientist, academic, policymaker or designer worth their weight in water can justify the following statement:

While it is clear that actual water usage by the world's data centers is largely unknown, with more than half of that usage not even reported, any use of LLMs still counts as "Responsible AI" or "AI for Good" as long as their designers talk about their intentions for a potential future, a future that has also not demonstrated how the world will look in the next few years when this occluded water usage is made further evident through the ramifications of drought, climate migration, and biodiversity and species loss.

The current "sin" in "sinnovation" is not that competition exists or companies need to make money. We all need to make money. The "sin" is in society allowing or justifying any damage to the environment where we aren't calling for the accurate measure of what (and who) we are destroying now in light of what might happen in the future.

There is no "responsible" AI where we don't know how much water we're using.

So here's an invitation.

Winnovation

I've had multiple conversations over the past two years on the issue of water, often with people who are unaware of aquifer depletion, the growing prevalence of droughts, or how water is used in data centers (fresh water versus gray water versus chemical coolants).

Many of those conversations have helped educate me in things I didn't know, and I appreciate the people who educated me. For one thing, where data centers haven't been incentivized to report on water usage, there's the simple opportunity to incentivize them via any number of things like subsidies, tax breaks, or other options. These incentives need to be offered so that the entire community surrounding a data center benefits from the enlightened data sharing around water, however, since privately owned data centers often won't provide data about the water and energy resources they're using because it's considered company IP.

The data regarding water and energy usage affecting entire communities and populations.

Because of concerns around competition.

Which means we also need to address the deeper issue of the incentives themselves.

Which could and need to be focused on care.

That's the real win.

"Winning" in light of "competition" may not make sense as we're used to picturing an individual person or company as a winner.

But where water is concerned only a few people having it means most people and species die.

Is this too morbid to point out?

Is this unpleasant to discuss?

Is it easier to quote inaccurate statistics, with phrases like "only 1% of the world's water is affected by GenAI," as justification? Even when it is well documented that a) any stat about water, as I've noted above, comes from sources that don't account for the fact that less than half of data centers report on water, and b) it's an "ends justify the means" argument, when reports from companies like Google keep saying, essentially: "While we're one of the most sophisticated mapping, analytical, and statistically oriented organizations in the world, it's 'quite hard' to measure the environmental effects of AI. So we'll tell ourselves and everyone else we'll use the very tools we know are harming the planet, while we of course know our water and energy usage down to the most minute levels of data but will say publicly that we don't."

This is innovation?

OpenAI's overarching lack of care for human IP is innovation?

The ongoing acceptance of the paradigm of LLM hallucinations, known in other fields and verticals as "errors," is innovation?

And this kind and obviously intelligent guy on my train says, "We have to keep up with OpenAI?"

What are we "keeping up" with?

The Comfort of Competition

This - this I understand.

It is so much easier to be competitive than to care.

Here's where I fully understand, and suffer from "sinnovation."

If I sound angry, I am. Because it's easier to be angry and point fingers than to ask how to change one's actions or words.

My anger comes from a number of places. Many stem from being bullied as a kid.

But I'm 55. And I have more blessings than many or most people in the world.

Yet self-care is very hard at times.

I may have made it sound like it's easy for data centers to report on water usage. But it's not, apparently. As I mentioned above, many data centers simply haven't been asked about their water usage in years past.

A point I still cannot fathom. (And yes, I use "fathom" as it's a term for measuring water).

But there it is. If nobody has asked about water usage, then the real opportunity for innovation is to - here we go with a crazy idea - learn how much water we're actually using.

Likewise, if nobody has positioned the need for caregiving as the primary metric of measurement in AI - not vague concepts of "good" or "the precautionary principle" or "responsibility" where no specific metrics are recommended -

Let's be truly innovative and prioritize caregiving, where metrics from the SDGs, or measures of biodiversity or mental health, or other metrics normally called "squishy," are what get prioritized.

Because you need water for something to be squishy. So we should measure it.




That's a thought-provoking question. Measuring progress through caregiving could lead to a more empathetic society. It would be interesting to explore how this approach could impact our social and economic structures. How do you think this shift in perspective could influence our understanding of spirituality and faith?

Colleen K.

VUCA leader immersed in the study of Complexity to help us gracefully flow through continuous change and transformation. We are each a work in progress!


This is so relevant. Recently when my elderly mother was in the hospital I saw the direct effects of what happens when competition in healthcare leads to a system driven by profit and how patients lose out.

João Otero

Mentorship for Creating Organic Growth Apps (linktr.ee/joao.otero) * Founder at Cogni (cogniapp.com)


young people compete, the old share... it's the natural order and it has an evolutionary purpose; sex is what screws everything up, but sex is necessary. Even in a utopian world where we were all equal, the most beautiful and attractive would still be better than the others.

Amber Hawkes

Protecting the vulnerable online, using technology for good. Online safety expert. Ex Facebook, NGO leader, criminal prosecutor.


In the trust & safety/child safety space there have been, and increasingly are, examples of sharing signals/data/solutions - acknowledging that competition is not the end game. BUT I still reflect on how much was not shared earlier or is still not really shared, and that there is still some level of competition that is unhelpful. Some observations: (1) So much of the sharing is done or has been mobilised through individuals who have caregiving metrics as their personal/spiritual motivations and KPIs - and not originally as a result of any corporate caregiving KPIs; (2) further developments have of course come in response to regulatory pressure/the threat of regulation which imposes caregiving KPIs (eg recent global regs imposing “duty of care”); (3) data transparency enables the measurement of duty of care… and yes, much data (although not all) is possible to collect, but the “competition” framework impacts what data is collected and shared… it is very costly to track, and if other companies don’t track that metric, why should we? There are still no good incentivization/tax break frameworks in this space; it is all about penalties, which I think is a miss - it just limits the speed of progress toward caregiving.

Sunil Malhotra

Nowhere guy | author of #YOGAi | designing from the emerging present | founder ideafarms.com | white light synthesiser | harnessing exponentials | design-in-tech and #AI advisor


Love the piece, John C. Havens. My tweet this morning, "The material Universe is for all beings to enjoy and for none to exploit." Notice I said material. That is a fundamental tenet. Max Planck argued that the concept of God is important to both religion and science, but in different ways: "Both religion and science require a belief in God. For believers, God is in the beginning, and for physicists He is at the end of all considerations … To the former He is the foundation, to the latter, the crown of the edifice of every generalized world view”. One scientifically appealing idea of God is the bold claim of Advaita Vedānta (non-dualism) that all human beings experience God here and now: in fact the Advaita Vedānta idea of Brahman is founded on the knowing Self, and is explained as pure existence, pure Consciousness and pure bliss. Brahman, or pure Consciousness, underlies the knowing self. God, therefore, is nothing apart from the first-person experience of each and every living being.
