The 4Cs of Computing

I'll be turning 54 next month. I know, I barely look a day over 52. In my brain, however, I'm still waiting for the morning I wake up and feel like a grown-up. But despite my increasing years and abandoned hairline, I can't really remember a time in my life when I wasn't surrounded by computers.

My dad, having retrained from electronic engineering in the first part of his career to organisational psychology academia in the second, was put in charge of computers in his department at Birkbeck College. As a result we had computing devices knocking around at home from when I was about 10. From early dumb terminals and acoustic couplers to the BBC Micro and Atari ST, there was a world of possibilities available to me (even if the memory and storage were minuscule in comparison to today).

I studied a bit of Computing at university alongside my major in Sociology, and after graduating I soon fell into working in IT departments, in and around which I've found myself, on and off, for much of my subsequent career.

As we surf the tsunami of hype that is engulfing the industry at the moment, I've been thinking back and trying to make sense of the different ways in which computing devices are used and, more importantly, whether there is a disconnect between what is being sold, what is being bought, and what might actually be delivered. My conclusion is that we might well have misalignment and, at worst, active misselling.

Where I've got to is a bit of alliteration that is, from the outset, slightly misleading. We are going to talk about the 4Cs of Computing, but Computing itself is the 5th C. The ability to compute, to process and execute large volumes of mathematical stuff, is still at the core of what a computer does. But I sense that we have mostly now abstracted away from just pure maths. The maths of computing goes into delivering the 4Cs (or not) as follows...

Control

Control is the happy place for business computing and contemporary management practice. Once the very first electronic computers had been built to solve maths problems like breaking the Enigma cipher, we soon saw them deployed to bring control and order to business processes.

The Lyons company built the first business computer, LEO, in 1951 in Hammersmith, London. Its first task was to manage the costs of cakes and bread, and it was soon deployed to control the flow of inventory and money (particularly pay) within the organisation. Enterprise Resource Planning (ERP) was born.

In most organisations of any age, the control of stuff, mostly money, is the bedrock of business computing. That's why so many technology teams still report to the CFO.

And for controlling stuff, computing technology has done an OK job. Not great, as the proliferation of spreadsheets surrounding most ERP systems will tell you. But most people in most organisations get paid most months on time and with the right amount. And sometimes invoices get paid on time too.

Communication

With the advent of time-sharing computers in the mid-1960s we started to see computers being used to allow people to send messages to one another. The first email traversed the ARPANET (the precursor to the internet) in 1971, and in the 1980s online bulletin boards started to allow communities to flourish online (teenage me racking up phone bills included). AOL, messaging apps, SMS, online conference calls, Zoom... communication tools in organisations have developed tremendously in the last 30 years.

Organisations, though, have struggled with communication tools. In the 2016 report I researched and wrote for Leading Edge Forum, Who Shares Wins, it was clear that organisations were distinctly struggling to know what to do with the plethora of cloud-based communication platforms that were becoming available to them. And this was, of course, before the pandemic changed the landscape further.

Why are communication tools so hard for organisations to grasp? I think because the DNA for how computers are to be used goes back to Control, and Control and (interpersonal) Communication are very difficult to align. Just witness how much business and political exchange happens in WhatsApp. Communication, like water, will find a way through whatever controls are put in place.

Creativity

The first thing I ever did on a computing device was to write a letter to my gran. I then copied it out by hand from the screen because printers for the home were yet to be invented.

In my teens I wrote computer programs. Pretty unimpressive games, mostly.

And then I got into using computers to produce music. From that point onwards, I've always seen computers as devices of immense creative possibility, mostly shackled by the need to run spreadsheets (although I know quite a few folks who can get pretty creative in Excel).

What you can produce today with a basic computer, whether in PC or phone form factor, is mindblowing. Video recording and editing on a phone blows my mind. The first project I ran at the BBC back in 1996 was the introduction of a Media100 editing suite, for which the fast hard disk array of about 50GB cost us about £25,000. I can get 64GB on an SD card for £10 right now.

In most organisations, though, other than the financial tomfoolery in spreadsheets and choosing fonts in PowerPoint, creativity in computing has been confined to the strange Mac-using community of the Marketing department. Why? Again, I think the need for Control can crowd out the space people need to explore the creative possibilities of the tools we have on our desks.

If you remember when email was first introduced, there were strictures about it not being used for trivial things. And yet it wasn't until people started to use it for creative and trivial things that they started to get their heads around how it could be useful.

The same goes for the World Wide Web. And I reckon a great deal of the social acceptance of video calling today comes from the pub quizzes and family gatherings that happened during lockdown where people could use the tools in ways that didn't have the pressure of immediate deliverables that modern businesses seem to expect at every turn.

Clairvoyance

The final C is the controversial one. Computers have been sold for many decades as tools that predict the future.

They can't. They can only extrapolate the past.

But there is a deep-seated human need to try to find ways to remove uncertainty from our worlds, and what better way to do that than to invent a magic machine that will tell us what happens next?

This was the sales pitch with management information systems in the 1980s and 1990s. The myths that were sold about beer and diapers or pregnant teens. The systems called "Sage" or "Oracle" or "Delphi".

It's a great pitch. It's really sellable. It's also an illusion built on extrapolation.

4Cs and LLMs

Having been thinking about this over the last few weeks, what I see is a disconnect on the horizon. How most organisations are best geared to use technology (to control) doesn't sit well with what LLMs are good at (aiding human creativity) or what organisations are easily sold (clairvoyance).

LLMs aren't great at being deployed to control things because, by their nature, they lack predictability. You might stick some natural language processing over a customer journey flow, but ultimately you'll still need a fairly rigid decision tree underpinning it, because otherwise LLMs have a tendency to go bad.
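
To make that concrete, here is a minimal sketch of that "natural language over a rigid decision tree" pattern, in Python. The classify_intent() helper is a hypothetical stand-in for whatever model you call, not a real API; the point is simply that the model only maps free text onto a small set of pre-approved intents, while the journey itself stays deterministic.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Node:
    """One step in the customer journey: a message plus fixed, pre-approved routing."""
    message: str
    routes: dict[str, str]  # intent -> next node id (deterministic)


# The decision tree is declared up front; the language model never adds branches to it.
JOURNEY = {
    "start": Node("How can we help?", {"refund": "refund_check", "delivery": "delivery_status"}),
    "refund_check": Node("Is the item unused and within 30 days?", {"yes": "refund_approved", "no": "handoff"}),
    "delivery_status": Node("Here is your tracking link.", {}),
    "refund_approved": Node("Refund raised. It should arrive within 5 working days.", {}),
    "handoff": Node("Passing you to a human agent.", {}),
}


def classify_intent(utterance: str, allowed: list[str]) -> Optional[str]:
    # Hypothetical stand-in for an LLM call constrained to the allowed intents.
    # A real implementation would prompt the model and validate that its answer
    # is one of `allowed`; anything it can't map falls through to None.
    text = utterance.lower()
    for intent in allowed:
        if intent in text:
            return intent
    return None


def step(node_id: str, utterance: str) -> str:
    """Advance the journey: the model picks between known branches, nothing more."""
    node = JOURNEY[node_id]
    if not node.routes:  # terminal node
        return node_id
    intent = classify_intent(utterance, list(node.routes))
    # Unrecognised input routes to a human rather than letting the model improvise.
    return node.routes.get(intent, "handoff")


if __name__ == "__main__":
    current = "start"
    print(JOURNEY[current].message)
    current = step(current, "I'd like a refund on my order, please")
    print(JOURNEY[current].message)
```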

They're not that good at being put into communication channels, either. We are already seeing LLMs talking to LLMs where there used to be humans at either end. The only people who win out of that nonsense are the energy companies who are powering the AI data centres (the AI compute providers aren't making money from it, that's for sure).

And as for clairvoyance, well, the anthropomorphism inherent in many of the current waves of generative AI combined with the authoritative style in which much of the content is produced should make for massive warning signs. If I type "What should I do next?" into an LLM the responses it provides, particularly in the context of complex problems, may or may not be right, but they'll be presented in a way that is highly convincing.

That's the territory of Hedgehogs rather than Foxes, the terminology that Philip Tetlock and Dan Gardner used in their book Superforecasting to distinguish the bad predictors of the future from the good ones.

The Hedgehogs, the bad futurologists, are bombastic, confident, single-minded, and roll themselves up in a defensive ball if their predictions are challenged. Foxes are wily, adaptive and far less bold in how they think the future will play out. Foxes are much better predictors, but get far less coverage because their predictions are pretty boring.

There are amazing opportunities for how some of these generative technologies might be used, assuming the industry can find ways to reduce currently ever-increasing energy demands.

But if organisations simply deploy them to do things that computers have traditionally been good at, the returns will be disappointing; and if we deploy them in the ways we have traditionally deployed technology successfully (i.e., to control), we'll also find that we don't get much value in return.

People across organisations need to be given time, space and access to work out creatively where the opportunities lie. If that doesn't happen, the resistance to change so well observed in the deployment of business systems will look no different in the face of generative technologies than it has with anything that came before.
