What Does Source Code Stand For in the AI Era?
Ilya Lebedev
Head of Vehicle Platform and MBSE, OSS and DE&I advocate at ATOM | Technology Counselor
Foreword
Recently, the software development industry has been flooded with a tide of 'AI is taking our jobs' posts. Does this reflect reality? Personally, I doubt it. Let's rewind history and start at the dawn of computer science. Do you know what early source code looked like?
Prehistory
Let's focus on punched cards, one of the earliest physical artifacts of the programming profession. Did you know that these cards were essentially binary 0/1 arrays, with holes and blanks representing machine-readable data? I'd venture that no one would be willing to program this way today (at least not for a living). Next came the early days of raw binary coding, which was just as labor-intensive as punching cards. Looking back, would modern coders be happy encoding web pages and driving start-ups this way? Does it make sense at all?
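To make this concrete, here is a toy Python sketch (my own illustration with a hypothetical helper, as_punch_row, not a faithful rendering of any historical card format such as IBM's Hollerith code) showing what even two bytes of "program" look like when reduced to punch positions:

```python
# Toy illustration only: render bytes as rows of punch positions,
# where "O" marks a hole and "." marks no hole. Real card encodings
# were far more elaborate; the point is the sheer unreadability.

def as_punch_row(value: int, width: int = 8) -> str:
    """Format an integer as a fixed-width row of punch marks."""
    bits = format(value, f"0{width}b")
    return "".join("O" if bit == "1" else "." for bit in bits)

for byte in (0x2A, 0x01):  # a two-byte "program"
    print(as_punch_row(byte))
# ..O.O.O.
# .......O
```

Multiply that by thousands of cards per program, and the appeal of anything higher-level becomes obvious.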
Evolution
Then came assembly, a CPU-native language that operated on bits, bytes, ports, interrupts, and other foundational computing concepts. (I don't mean WebAssembly, a portable low-level format that runs on virtually any platform.) I mean languages tied to CPU and other hardware features, where a jump to the wrong address could crash your program instantly. I believe most programmers are happy to have feature-rich languages instead.
Modern languages are built on these core concepts. Whether using C, Go, Python, Rust, or Java, programmers are far more productive than with older, less robust languages. I won't cover domain-specific languages here, but they have their own audience, and they are still source code: an expression of the ideas behind the software.
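As a minimal sketch of that productivity gap (assuming nothing beyond the Python standard library), consider a task that in assembly would mean hand-managed registers, a loop counter, and carefully placed jumps:

```python
# Summing a list: in assembly this would involve loading an address,
# managing a counter register, accumulating in another register, and
# a conditional jump back to the loop head, with a crash waiting
# behind any wrong jump target. In a modern language it is one call.
values = [3, 1, 4, 1, 5, 9]
print(sum(values))  # 23
```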
Present Day
Returning to the beginning of this article: we face a massive shift from hand-crafted code to generated code. Developers worry about potential job loss. Businesses welcome the productivity boost. Tool vendors celebrate inflated investments. But wait, didn't we see the same thing with the rise of graphical modeling languages? They were supposed to replace traditional languages with visual programming. Low-code/no-code tools were meant to let businesses build their own solutions with little to no support from the programming community.
Lessons Learned
What has happened since then? History has taught us, time and again, not to be driven by hype but to invest in change. The programming community, driven by curiosity, built a new workforce around each emerging technology. Almost nobody was pushed into retirement by these innovations. Instead, businesses developed new needs for differently skilled workers. And hardly anyone worried about the drift in the definition of source code.
Here, I'd like to question the term "source code" itself. What does it really mean? Must we stick to a formally defined language notation? Or can we identify it as a representation of the ideas behind the typed letters?
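As a thought experiment, here is a hypothetical sketch of that view (SPEC and top_words are my own illustrative names): the natural-language specification and the typed implementation are two renderings of the same "source", and one could plausibly be generated from the other.

```python
from collections import Counter

# The "idea" behind the program, stated in natural language:
SPEC = "Return the n most frequent words in a text, case-insensitively."

# One possible rendering of that idea as conventional source code
# (a generator might produce something equivalent from SPEC alone).
def top_words(text: str, n: int) -> list[str]:
    words = text.lower().split()
    return [word for word, _ in Counter(words).most_common(n)]

print(top_words("the cat and the hat and the bat", 2))  # ['the', 'and']
```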
To Sum Up
I believe it's time to redefine the notion of source code itself. It's time to understand the shift AI brings to the industry and stop expecting developers to do low-level coding. This is the same kind of shift as the introduction of OOP, functional programming, visual programming, and the rest. All we have to do is understand how AI helps us do more by doing less, how we can expand our knowledge and proficiency, and how to add value on top of AI-driven programming.