Is it time for a new programming language?
Image generated by DALL-E, an AI developed by OpenAI, on 24.07.2024.

I'm old enough to remember when the introduction of a new programming language was one of those few-in-a-lifetime experiences. "Turbo Pascal - WOW!" I couldn't wait to read the next issue of Dr. Dobb's Journal! Today, you'd need a daily blog just to keep up with the new languages and frameworks from last week.

But after my recent pair-programming session with ChatGPT, I'd like to ask: are we ready for a new programming language, one tailored to the needs of AIs and LLMs -- collectively called genAI? I don't know, but here's what I'm thinking.

The approaches of yesteryear

The 0th, 1st, and 2nd generation languages were all about communicating with the machine in terms the machine could understand. The first "programmers" moved wires on the plugboards of tabulating machines. Later, programmers in the mainframe "Age of Iron" wrote raw 0's and 1's (which my Dad actually did for a few years) and eventually assembly language. These languages were tailored to specific hardware.

Then came the 3rd generation languages like COBOL and FORTRAN. These were tailored to a human/hardware mix, with elements like DO… for us and elements like WRITE() for reel-to-reel tape or env for operating system variables. They also introduced a degree of abstraction, so that one program could run on different systems. Java and Python are in this category, and today we are living in the Framework Explosion: Vue.js, Angular, and the hundreds of others are designed to make humans' lives easier. Nobody has asked it, but my guess is the JavaScript engine doesn't care.

Then came the 4th generation languages like SQL. These were tailored to humans. SQL dates back to the 1970s, but it remains the gold standard today. Some argue that the new low-code/no-code platforms are in this category.

All these languages and frameworks have one thing in common: a human is in the driver's seat, and the languages and frameworks are designed to help humans. Humans need things that are easy to remember, easy to use, and easy to read. This is probably why some people prefer Python over Java: indentation is easier to mentally digest than {'s and }'s.

Time for a new language?

Consider the world today. We feed human-engineered code, like Python, into a genAI (an LLM) and ask it to do something with that code. It uses its super-brain (which, despite many claims, probably nobody really understands) and spits back code written for humans. The process looks like this: human-readable code -> super-brain machinations -> human-readable code. Where is the sense in that? It gets even worse when we expect the genAIs to know the ins and outs and bugs of the frameworks that were designed to make our lives easier, not theirs!

If we project this situation forward, even just a bit, its ludicrousness becomes self-evident: humans asking LLMs and genAIs to create human-readable code for humans who no longer need to read the code!

There are still a few aspects that haven't changed: abstraction, so that the code in question runs on all the common hardware platforms and so that all genAIs can work on the code; testing, so that humans can be sure the code does what we expect; and maintenance, because all applications evolve over time. This last constraint may be the most challenging, since maintenance requires capturing and recording all the design considerations. Human understanding is an additional constraint, i.e., can a human understand what the code is trying to do? But this is a constraint that I believe disappears if we trust genAIs to explain the code back to us.

How would a “made-for-genAI” programming language look?

So if the next generation of ChatGPTs, not humans, will write the code, perhaps it's time to start thinking about the ideal languages to make their jobs easier, while still respecting the constraints. And what better way to start than by asking an AI how it feels about this topic, and what it would prefer in such a language?

I asked ChatGPT to give some design considerations for such a language and, based on these, some specific characteristics of the ideal language for an AI to write applications in. It gave a long list of technical things like simplified syntax (do we really need different ways to loop?), high-level abstractions for common patterns (do we really need to implement another login dialog box?), and many others.
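To make the "different ways to loop" point concrete, here is a small illustration of my own (not taken from ChatGPT's list): the same calculation written four equivalent ways in today's Python. A language designed for code-writing genAIs might well offer just one canonical form.

```python
# The same "sum of squares of the even numbers" written four equivalent ways.
# Humans appreciate the variety; a code-writing genAI arguably needs one form.

numbers = [1, 2, 3, 4, 5, 6]

# 1. Classic for loop
total = 0
for n in numbers:
    if n % 2 == 0:
        total += n * n

# 2. While loop with an index
total2, i = 0, 0
while i < len(numbers):
    if numbers[i] % 2 == 0:
        total2 += numbers[i] ** 2
    i += 1

# 3. List comprehension
total3 = sum([n * n for n in numbers if n % 2 == 0])

# 4. Generator expression fed to a built-in
total4 = sum(n * n for n in numbers if n % 2 == 0)

assert total == total2 == total3 == total4 == 56
```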

But one stood out for me: explicit context management.

A chance to incorporate some missing ingredients?

A significant challenge today is that our "context" (in more common terms, our requirements) is separate from our "implementation." There were attempts (e.g., model-driven development, Rational Rose, etc.) to couple business artefacts with generated code, but sadly these never caught on. As anyone confronted with modernising legacy code knows -- and especially anyone re-engineering mainframe code -- converting between languages is easy, but answering the questions "what were these guys thinking?" and "why did they do that?" is hard if not impossible. The world today is still powered by Excel macros written by people long retired, macros that nobody else dares touch.

It seems to me the advent of a made-for-AI programming language would give us the chance to focus more on context, ensuring the requirements are clear and tightly coupled to the code. I can't even dream about exactly how this would look, but I can envisage a scenario. Ken: "Can you delete that save button? Nobody uses it anymore." LLM: "Yes, I could delete it, but someday there may be a situation XYZ and then this button will be necessary. Please confirm, and I will update the context and documentation accordingly."
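Just to make the idea tangible, here is a purely hypothetical sketch in Python. The `requirement` decorator, the `Requirement` record, and the REQ-042 identifier are all invented for illustration; nothing like this is a standard today. The point is simply that the context lives next to the code it justifies, so a genAI (or a human) can answer "why is this here?" before changing anything.

```python
# Hypothetical "explicit context management": requirements and their rationale
# are attached to the code they justify, so they survive alongside it.

from dataclasses import dataclass, field


@dataclass
class Requirement:
    req_id: str
    rationale: str
    still_needed_because: list[str] = field(default_factory=list)


SAVE_BUTTON_REQ = Requirement(
    req_id="REQ-042",  # invented identifier, for illustration only
    rationale="Users must be able to save a draft explicitly.",
    still_needed_because=[
        "Auto-save is disabled on air-gapped installations (situation XYZ)."
    ],
)


def requirement(req: Requirement):
    """Attach a requirement record to a function so tools can query it."""
    def wrap(func):
        func.__requirement__ = req
        return func
    return wrap


@requirement(SAVE_BUTTON_REQ)
def on_save_clicked(document) -> None:
    document.save()


# A genAI asked to "delete the save button" could inspect the attached context
# first and push back exactly as in the scenario above:
print(on_save_clicked.__requirement__.still_needed_because[0])
```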

Does this differ from the "normal" of today?

If these changes are coming, I am sure they will come fast. And I'm also quite sure many will immediately think "human inputs context, AI writes code" is a fantasy. But given the speed at which I've seen AI and LLMs come into their own, I am a believer. I can already envision the future exams I will give my computer science students: "Insert the UML diagram that ChatGPT produced here: _____" Or "List three areas that ChatGPT believes are weak in the design, and give the recommendations it has for improving them: ____"

Is this really any different from relying on calculators instead of knowing how to do long division?


Reena Bagga

Project Manager HR @ Brink's Inc

3 months ago

I acknowledge the perspective on AI tools like ChatGPT transforming programming. While AI's role raises questions about language design, human-friendly frameworks remain crucial for collaboration, maintainability, and learning. Instead of creating new languages, we should enhance existing ones to better integrate AI, fostering a symbiotic relationship between humans and AI.

Matthew Anderson

Innovative, Internationally Experienced Digital Manufacturing / Smart Technology Expert

4 months ago

Reminds me of Steve Jobs' quote on reducing complexity (I'm sure he wasn't the first to say it, either): "The line of code that's the fastest to write, that never breaks, that doesn't need maintenance is the line you never had to write." GenAI is now taking this to the next level, surpassing high-level languages and frameworks by transforming complex programming tasks into natural language interactions.
