I predict that software quality is just about to go down!
DALL-E-generated image: "illustrate non-sense programming methodologies"


Note: this article is best understood by a software programming audience.

As some voices are rising to suggest that the world will no longer need programmers (in my opinion, a total misinterpretation of what is going on in software development), I realize that the title of this article will sound controversial. Let me argue my viewpoint.

In the early days of my career, I was an avid programmer in assembly language, which is the most basic level at which one can program processors. Since then, the world of computer programming has evolved toward ever more abstraction and new programming paradigms. Abstraction means that with fewer lines of code, you are supposed to achieve far more functionality. Programming paradigms mean that by using a specialized language, you can write more concise software within a given class of problems. A good example is HTML, the web markup language, which is good for building basic web pages but had to evolve to support more complex, dynamic websites (JavaScript is an example of a language that emerged to extend HTML's limitations).

A large reason for the need to evolve programming paradigms, tools and languages is the sheer size and complexity of software.

To illustrate this: by some estimates, if one were to print the source code of Windows 11, the listing would stretch across the distance between the Earth and the Moon.

Let's debunk a few things

Ok, I'm passionate about this topic so let me debunk a few things for my programmer friends and anyone interested:

1/ "Agile/scrum methodologies are the best"

No, they are not. The reason I say this is that they are often used to accelerate convergence between ill-defined use cases and ill-defined implementation strategies, and are merely good enough to allow regression tracking of bug closures, which program managers love. (The definition of a bug needs to be properly pinned down: it is often used as a metric of a programmer's performance, which is largely misleading when no good product spec exists.)

Before applying any "agile" methodology, get yourself a strong software architect. No need for a UML guru, just a down-to-earth person who can articulate how they would divide a large piece of software into modules, and what basic API definitions would make sense for these modules to effectively take advantage of one another for the core use cases.
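To make this concrete, here is a minimal Python sketch of what "architecture first" means (the module name and API are hypothetical, purely for illustration): the boundary between modules is written down as a small, explicit contract before any sprint starts.

from abc import ABC, abstractmethod

class StorageModule(ABC):
    """The contract other modules program against, not a concrete class."""

    @abstractmethod
    def put(self, key: str, value: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes | None: ...

class InMemoryStorage(StorageModule):
    """One possible implementation; teams can swap or mock it freely."""

    def __init__(self) -> None:
        self._data: dict[str, bytes] = {}

    def put(self, key: str, value: bytes) -> None:
        self._data[key] = value

    def get(self, key: str) -> bytes | None:
        return self._data.get(key)

Once such contracts exist, sprints can be scoped per module and the integration risk is visible from day one.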

Proper software architecture must come before the agile regimen, and I miss the "punch-card" programming days, when you had one hour to experiment with a paper card (the era's means of storing experimental programs) on one of the few mainframes with enough compute capability to accommodate multiple programmers. Agile/scrum would certainly have been of no help back then.

2/ "Generative AI will soon replace the need for programmers"

Sure :) Yes, but... no. ChatGPT works when you ask it to generate a basic program in Python, C++ or any other language. Ask it to "generate a basic 'hello world' example" and, for sure, the proposed code will be syntactically perfect the first time around. Yet as soon as you ask for something complex, involving the integration of sophisticated algorithms, what you get is a mere skeleton of function calls, with no nuance or understanding of how sophisticated algorithms integrate or get tuned. If you are a roboticist, just try asking "ChatGPT, generate a robot navigation program using SLAM to implement an occupancy grid map" and what you get will get you nowhere.
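To show what that nuance looks like, here is a minimal sketch of the heart of an occupancy grid (the constants and names are illustrative assumptions, not production values): the part a generated skeleton typically glosses over is the Bayesian log-odds update, whose tuning depends entirely on your sensor model.

import numpy as np

GRID = np.zeros((100, 100))   # log-odds per cell; 0 means unknown (p = 0.5)
L_OCC, L_FREE = 0.85, -0.4    # illustrative constants; derive them from your sensor model

def update_cell(ix: int, iy: int, hit: bool) -> None:
    """Bayesian log-odds update for one cell touched by a laser beam."""
    GRID[ix, iy] += L_OCC if hit else L_FREE

def occupancy_probability(ix: int, iy: int) -> float:
    """Convert a cell's log-odds back into an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + np.exp(GRID[ix, iy]))

And this is still the easy part: a real SLAM stack must couple these updates to pose estimation, ray casting and loop closure, which is exactly where generated code falls apart.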

3/ "Abstraction or the use of standard libraries is essential"

Well, yes and no for abstraction, and a definite yes on libraries.

Let me explain.

C++ went quite far with abstraction (for the purists: I do like the concepts of class inheritance, polymorphism, operator overloading, etc.), but look at what happened with Android. Half of your programming time was spent on "boilerplate" redefinition, instantiation, or specialization of legacy classes, to the point that only about 20% of an Android app was useful Java code. Google realized this and came up with the Kotlin language. Similarly, Apple went down the same path and corrected it with its Swift programming language.

I must admit that the C++ design-pattern philosophy was a good thing, and it coincided with the emergence of UML (the Unified Modeling Language). The idea was to come up with a bare-bones set of algorithmic libraries able to solve most of the stuff that programs have to tackle (sorting data; manipulating tables, strings, matrices, trees and all kinds of data objects). The same happened in math with the "Numerical Recipes" book/bible, where a slew of standard mathematical problems were not only explained but also came with pre-programmed solutions in several programming languages.

So yes, software libraries (catalogues of reusable software modules) help, but do not underestimate the time it takes for a programmer to learn a library; more often than not, it is faster to program a small library module and properly document it than to try to leverage an existing one.
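As an illustration of the trade-off, here is the kind of small, documented module I have in mind (a hypothetical Python example): twenty lines you fully own and understand, versus a week spent learning a framework's idioms.

from collections import deque

class MovingAverage:
    """Running mean over the last `size` samples.

    Written in-house because this is the only numeric operation the
    project needs; pulling in a full framework would cost more to learn.
    """

    def __init__(self, size: int) -> None:
        self._window: deque[float] = deque(maxlen=size)

    def add(self, sample: float) -> float:
        """Record a sample and return the current windowed mean."""
        self._window.append(sample)
        return sum(self._window) / len(self._window)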

4/ "Python and other scripting languages aren't designed to deal with compute-intensive scenarios"

Wrong. Python, for example, leverages calls to highly optimized pre-compiled libraries, often written in native C or C++ (ROS, OpenCV, TensorFlow, etc.). Most recent AI programming frameworks (Keras, PyTorch, CUDA, OpenVINO) come with various libraries, and sometimes language extensions, to convert a compute and data-dependency graph into optimized execution on a CPU, GPU or dedicated processor. It means that once you explain how to throw a massive amount of data at a problem using a slow interpreted language, the underlying optimized libraries, with proper guidance, will make sure the hardware is used to its maximum potential.
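A minimal illustration with NumPy (array sizes are arbitrary): the explicit loop runs in the interpreter, while the one-line call delegates the same computation to a precompiled, often SIMD-optimized, native routine.

import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

def dot_slow(x, y):
    """Interpreted: every iteration pays Python's bytecode overhead."""
    total = 0.0
    for i in range(len(x)):
        total += x[i] * y[i]
    return total

# Delegated: a single call into NumPy's optimized native kernel.
print(dot_slow(a, b), np.dot(a, b))  # same result, orders of magnitude apart in speed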

5/ "The concept of 'copy/paste' programming was already there with advanced IDE (Integrated Development Environment)"

Yes, and let me start by saying that when I watch junior programmers building an Android app, for example, there is no doubt in my mind that most of the "value-add" consists of copy-pasting code snippets found on the web to assemble software modules. We are drifting far from the roots of programming, such as the O() complexity notation used to quantify the efficiency of algorithms, and this is part of the reason why code quality is going down. I also do not believe that the massive trend of outsourcing software to low-cost countries has helped in any way to make better software (or even cheaper software).
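For those who never met the O() notation, a small Python experiment makes the point (timings will vary by machine): a membership test on a list is O(n), on a set it is O(1) on average, and that difference dwarfs anything a copy-pasted snippet can gain or lose.

import time

items = list(range(1_000_000))
as_set = set(items)

t0 = time.perf_counter()
_ = 999_999 in items      # O(n): linear scan through the list
t_list = time.perf_counter() - t0

t0 = time.perf_counter()
_ = 999_999 in as_set     # O(1) average: a single hash lookup
t_set = time.perf_counter() - t0

print(f"list: {t_list:.6f}s  set: {t_set:.6f}s")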

On IDEs (the tools you use to edit, compile, and test software), a lot of good things have been done across the ecosystem (Android, Apple, Eclipse, Microsoft). Call it AI or not, but tons of improvements were made across tools to navigate code, flag syntax issues before code is even compiled, or even alter code while a program is still running.

In the embedded space, I want to mention Texas Instruments with its "Code Composer Studio", which had deep roots not only in instrumenting software but also in exposing the underlying behaviour of processors via what became the standardized "JTAG" interface.

6/ "open-source is free and Linux is superior"

Ahahah. To the Linux fanatics who surround me, I am going to say this again: while anything open-source seems to be free, it is not, and it is certainly not free of intellectual-property risks (read my book). Linux has pros and cons. I like the pros and will forever be frustrated with the cons, as I don't see them changing. Try recompiling any kind of complex code and you will end up having to deal with multiple packaging methodologies (conda, wheel, pip, etc.), various incompatible versions of libraries and scripting languages, unsure driver support, and so on.

As a matter of fact, in every project I see, teams are spending at least 20% of their time trying to maintain the stability of their code base (and for the purists: yes, I know about .venv, Docker and other virtualization technologies).
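The cheapest first step, for what it is worth, is to freeze exact versions in a requirements file and rebuild the environment from it (the package names and versions below are illustrative, not a recommendation):

# requirements.txt - regenerate with "pip freeze" after any validated change
numpy==1.26.4
opencv-python==4.9.0.80
torch==2.2.1

It does not solve driver or system-library drift, which is where Docker images come in, but it removes the most common source of breakage.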

For sure, MATLAB, Maplesoft's Maple and other great software packages I like work perfectly on either Windows or Linux, and you know why? Because there is a commercial company behind them, committed to maintaining and supporting the proper functioning of their offering on multiple platforms.

Now, any pain point is a business opportunity, and when NVIDIA says it will pre-integrate and validate all kinds of Linux nonsense in its cloud API, that is music to my ears.

For now, I am also happy to pay for my Windows OS license.

Conclusion

They say "AI" will replace programmers -> well good luck because that chatgpt engine for sure needs a lot more curated inputs, programmers' reviews and, BTW, who is going to pay for the GPU-farm energy and infrastructure?

Software is increasingly recognized as central to many industries, in the same way processor technologies are now recognized as a sovereign asset.

Processors don't do much without a program, and the world of programming methodologies doesn't currently seem to reflect much of business reality, apart from the traditional open-source or outsourcing game.

I see three scenarios:

  • companies go all-in with AI, and we see what happens
  • companies and their consultants continue to play the game of "we'll make sure it works for a given use case"
  • VCs and investors continue to sustain their investments by saying "we've seen it work; it's a matter of integrating and customizing the inferencing engine into people's workflows"


Well, please mark my words: most of these scenarios will involve programmers with a deep sense of the underlying technologies to fix or improve machine-generated code; as a result, this will generate one of the largest wastes of human brainpower we will ever see.


Cheers


#programming #language #abstraction #generativeai #codequality


As always, feel free to contact me @ [email protected] if you have comments or questions about this article (I am open to providing consulting services).

More at www.lohier.com and also in my book. Subscribe here to this newsletter and quickly access former editions.
