The fourth abstraction is upon us
Background
There are generally three levels of software abstraction from hardware: machine code, assembly code, and higher-level programming languages. These are generally referred to as “generations”, and since the 1970s we have largely worked in the third generation or abstraction, with various stylistic and formal differences around proceduralism, object orientation, interpretation and compilation, and so on. But in practice these are theological rather than practical debates, as all code is ultimately reduced to bytecode and then machine code.
In parallel to the dominance of third-generation abstractions since the 1970s ran the dream of artificial intelligence, or more accurately machine learning: a method by which computers could shift from being mathematically deterministic to a rather more generalized form of intelligence, or at least an intelligence whose workings are imperceptible to the human operator (cue the worries of films such as 2001’s HAL).
With the arrival of modern computing architectures, and the considerable networked computing resources of the cloud, we are now capable of executing machine learning algorithms of sufficient scale and complexity to simulate and/or perform many tasks that traditionally appeared to require reasoning of a sort. Hence the groundswell of excitement at the transformative possibilities of textual large language models: rather than instructing the computer in its language (or an abstraction thereof), these models allow the human to prompt the computer with all the ambiguity and uncertainty of human language, and by mining the past the model can infer a reasonable response that will satisfy most of the human’s expectations, allowing the human to proceed with further work.
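To make the contrast concrete, here is a minimal sketch (my own illustration, not from the original text) of the same small task expressed at the third abstraction, where every step is spelled out, and at the fourth, where only the intent is stated; `ask_model` is a hypothetical stand-in for whichever LLM client one might actually use.

```python
# Illustrative sketch only: `ask_model` is a hypothetical callable standing in
# for any real LLM client; nothing here is a specific vendor API.

def median_deterministic(values: list[float]) -> float:
    """Third abstraction: the machine is instructed step by step."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

def median_prompted(values: list[float], ask_model) -> str:
    """Fourth abstraction: the intent is stated in plain language and the model
    infers an answer that will usually, but not provably, satisfy the request."""
    prompt = f"Here are some numbers: {values}. What is their median?"
    return ask_model(prompt)  # a plausible answer, not a guaranteed one
```

The first function is verifiable; the second trades that certainty for the ability to accept whatever phrasing the human happens to use.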
These LLMs and their kin are early in their development, with failures of reasoning, hallucinations, sycophancy and other logical issues that will no doubt take a long time to resolve. Those who hold a Penrosian view may argue they will never achieve an inferential intelligence general enough to qualify: that each is simply a mechanical Turk of such complexity and scale as to fool the vast majority of humans, much as CGI and its cheaper cousin, the image filter, have been able to delude human eyes, playfully at first, into accepting what they are shown simply because the facsimile is reasonable enough to presume it is correct.
Where we are
We have a groundswell of contemporary understanding that a paradigm shift is under way in computing, in which the tools and mechanisms of software are undergoing a revolution last seen in the 1970s, which gave birth to the desktop computing paradigm (and to its miniaturized sibling, the mobile phone, once empowered by the internet).
Our means and mechanisms of interacting with computational power are transforming (as they always have) not merely at the end-user interface (voice prompts, touch screens, neural computer interfaces) but, much more importantly, within the computation model itself: shifting from the mechanical, industrial automation of third-generation languages to a far more fluid, flexible and ultimately user-friendly computing model led by human language.
That the modern algorithms of this new computing model are instructed in natural language has serious implications for the economics of software. Those with the technical knowledge to work with computers have enjoyed an economic benefit (albeit a dwindling one) built on obscurity and complexity, and that advantage now tantalizingly stands to be removed: in a world in which all humans can instruct computers, the value of the instruction declines to the point of being beyond a commodity. Historical examples abound: the scribes of the middle ages were reduced by the printing press, and more recently photography has been commoditized to the point where the profession has receded into narrow areas that come with either specificity (weddings) or danger (war photography and photojournalism).
We are at the beginning of this elevated abstraction away from the computing machine, and the third abstraction has been surprisingly durable: the first generation subsided within a decade (the 50s), the second within 15-20 years (the 60s and 70s), and the third is only now subsiding after 35-50 years (the 80s, 90s, 00s, 10s and early 20s). If we extrapolate, the computing utility of this new abstraction may be potent enough to outlive our working lives, and therefore encountering and addressing it is vital for our long-term wellbeing in our chosen profession and, without being grandiose, for our wellbeing as humans in the 21st century.
Where we will be
Deterministic computing will not simply stop. Every mode and paradigm of computing has continued after being leapfrogged, retaining some narrow value, and large swathes of the world will continue to be supported by these past layers and abstractions: today our credit cards, bank accounts, insurance, health care records and other parts of our lives are still administered and managed by the mainframes and technology that preceded the desktop revolution.
Similarly, in a world of the fourth abstraction we will continue to have vast swathes of deterministic computing being performed. Critically, however, unlike the centralized criticality of mainframe computing, deterministic desktop computing will most likely suffer an economic heat death, driven by the preponderance of open-source code and a user-experience shift away from such dry, procedural systems: why use a form when you can simply express your intent? We can already see the market’s thirst for this, as non-computing professionals reach for their copilots and chat AI models rather than deterministic tools (and, honestly, do some of us not do the same with our coding today?).
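As a hedged illustration of that “form versus intent” point (the booking scenario and every name in it are hypothetical, not a real product API), the deterministic path demands exact fields while the prompted path accepts whatever the user says:

```python
# Illustrative only: hypothetical names, no real product or vendor API.
# `ask_model` again stands in for any LLM client.

def book_flight_form(origin: str, destination: str, date: str, passengers: int) -> dict:
    """Deterministic form: the user must supply every field, correctly typed."""
    return {"origin": origin, "destination": destination,
            "date": date, "passengers": passengers, "status": "requested"}

def book_flight_intent(request: str, ask_model) -> str:
    """Expressed intent: a model is asked to recover the same structure from
    free text, e.g. 'Get two of us to Cape Town next Friday'."""
    prompt = ("Extract origin, destination, date and passenger count from this "
              f"request and reply as JSON: {request!r}")
    return ask_model(prompt)  # a plausible structured reply, not a guaranteed one
```

The appeal to the end user is obvious; the cost is that correctness becomes probabilistic rather than enforced by the form.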
Without being cynical, underpinning this prompted future are a few factors outside of our control:
1. End-user desire for magical outcomes from magical machines with minimal human intervention.
2. The end of Moore’s law, and thus of classical imperative programming (i.e. the general-purpose CPU).
3. The rise of business models that depend on digital serfdom, i.e. the SaaS model, to survive and thrive.
But fundamentally there is an economic driver: software development costs too much for the quality of its outcomes for society. It absorbs capital and talent at tremendous rates but delivers a declining return on that investment, as the low-hanging technical fruit have been plucked and the vast majority of software projects perform in ways that are economically unsustainable. Humanity is falling out of love with technologists just as it falls in love with an abstraction completely dependent on the highest-quality, highest-performing technologists, and not on the t-shirt-collar class that has benefited from the multi-decade over-investment in digital infrastructure at the expense of society.
In the fourth abstraction, the world is offered software outcomes without software developers.
What this means for software professionals
So, is all lost? Absolutely not.
Smart people who are fast and effective at what they do remain scarce in the world. There remain jobs and opportunities for those capable of creating the great looms of the digital age, but not for the craft weavers who wish to keep things as they are. Same as it ever was: the march of technology will crush underfoot those who wish for stasis and no change.
Software development will likely stratify into three separate sub-classes of work:
- High-order machine learning construction and operations.
- Mid-order problem solving and production planning, management and execution.
- Low-order commodity prompting and error checking / training.
What is highly likely is that the mid and low classes will be paid significantly less, relative to other labor, than they were during the third generation, unless they can establish unique and salient skills or task-execution capabilities that cannot be performed by non-technical staff prompting fourth-abstraction systems delivered by the higher class of technologists.
This is in keeping with the commodification of all software development skills over time as technology advances; but because every human is capable of instructing a fourth-abstraction system toward a desirable software outcome, the oversupply of instructors drives down the price of the digital good (just as happened with digital photography).
There is little to no intrinsic human value in digital goods, but perhaps a fully digitally native society will over-value (relative to today) the work of the middle class: think of the digital equivalent of artisanal coffee from a barista, or sourdough bread at the farmer’s market, both vastly more expensive than their inputs because of the qualitative appreciation of the craft in the construction of the end product. I highly doubt this will happen: there is no evidence that any digital consumer cares whether their site is made with WordPress, Squarespace or hand-coded HTML, because the digital output is fungible.
In reality, the value of software produced by people will in aggregate decline, and software as a product will therefore be bought for less in the market, as more (if not all) people can produce reasonable software outcomes without the work of a developer.
The consequence is that the economic utility, and therefore the price, of all but the most unique, talented, specialized and capable developers will progressively decline: at first reaching more reasonable parity with other white-collar workers, and then joining the generational decline as work in law, medicine, accounting and other knowledge fields loses value through automation.