Software 3.0
Credit: Matt Welsh

We studied Software 1.0 in school - traditional software development, where humans write functions in a high-level programming language.

Andrej Karpathy coined the term Software 2.0 for machine learning - where machines learn, from data, the functions programmers can't write by hand.

Now, we are entering an era of Software 3.0, where a machine running a large language model can write Software 1.0 code!

This weekend, I listened to a fantastic - albeit a bit scary and a bit futuristic - talk by Matt Welsh at Harvard. He portends big changes to computer science and software engineering due to LLMs, and I think we (software folk) should pay attention.

Link: https://www.youtube.com/watch?v=JhCl-GeT4jw


Here is the rough structure of the talk, along with some of its main points.


1. Starts with the goal of Computer Science (CS)

It's a discipline for computer programs - implemented, maintained, and understood by humans.

2. Makes the case for why Computer Science has failed (a glass-half-empty perspective)

CS has been trying for the last 60 years (FORTRAN dates to 1957) to make software engineering easier with new languages, debuggers, static analysis, documentation, etc., but software is still challenging to write, debug, understand, and maintain. Learning software engineering is expensive, and therefore software engineers are expensive.

3. Shows the alternative

Matt shows examples of how one can use plain English - with today's code copilots - to accomplish feats that used to take days, if not months (depending on individual skill level).

The slide below compares the cost of generating 100 lines of code every day, as of today.

Killer slide of the talk - this is the economics today!
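The slide itself isn't reproduced here, but a back-of-envelope version of the comparison can be sketched. Note that the salary, token-count, and pricing numbers below are my own illustrative assumptions, not the figures from Matt's slide:

```python
# Back-of-envelope cost comparison: a human engineer vs. an LLM,
# each producing 100 lines of code in a day.
# All numbers are illustrative assumptions, not figures from the talk.

ENGINEER_COST_PER_YEAR = 220_000  # assumed fully loaded cost, USD
WORKING_DAYS_PER_YEAR = 250
LINES_PER_DAY = 100

# Human: a day's worth of salary buys 100 lines.
human_cost_per_day = ENGINEER_COST_PER_YEAR / WORKING_DAYS_PER_YEAR

# LLM: assume ~10 generated tokens per line of code,
# at an assumed price of $0.03 per 1,000 generated tokens.
TOKENS_PER_LINE = 10
PRICE_PER_1K_TOKENS = 0.03
llm_cost_per_day = LINES_PER_DAY * TOKENS_PER_LINE / 1000 * PRICE_PER_1K_TOKENS

print(f"Human: ${human_cost_per_day:,.2f} per 100 lines")
print(f"LLM:   ${llm_cost_per_day:,.4f} per 100 lines")
print(f"Ratio: ~{human_cost_per_day / llm_cost_per_day:,.0f}x")
```

Whatever exact numbers one plugs in, the ratio comes out several orders of magnitude in the LLM's favor - which is the point of the slide.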


4. Infancy of the field

It's a bit of a dark art today to prompt models effectively for complex tasks. Techniques like capitalization (CAPS) for emphasis, or saying "think step by step", can make a real difference in accomplishing computational goals. Can we make this more engineering than art? This is one area of focus for many start-ups in the field.
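To make the "dark art" concrete, here is a minimal sketch of those two tricks expressed as a plain prompt-building function. The wording and structure are my own illustrative assumptions, not a recipe from the talk:

```python
# Sketch of two common prompting tricks: CAPS for emphasis and an
# explicit "think step by step" cue. Illustrative only.

def build_prompt(task: str) -> str:
    """Assemble a prompt for an LLM using capitalized emphasis
    and a chain-of-thought cue."""
    return (
        "You are a careful assistant. "
        "ALWAYS show your working and NEVER skip steps.\n\n"  # CAPS emphasis
        f"Task: {task}\n\n"
        "Let's think step by step."  # chain-of-thought cue
    )

prompt = build_prompt("Compute the total cost of 3 items at $4.99 each.")
print(prompt)
```

That a string-formatting function like this can measurably change a program's behavior is exactly why people want to turn prompting from art into engineering.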

There are also scenarios where the model fails, and it's important to be able to address those deficits through additional training or other capabilities. This is another area of focus for many start-ups in the field.

5. Justifies trajectory for growth & capabilities

Neural-network scaling-law papers have shown that the effectiveness of LLMs will keep improving as long as we scale compute and data.
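For concreteness, the "Chinchilla" scaling-law paper (Hoffmann et al., 2022) fits training loss as a sum of power laws in parameter count $N$ and training-token count $D$:

```latex
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```

Here $E$ is an irreducible loss floor, and the two power-law terms shrink as model size and data grow - which is the basis for expecting continued improvement with scale.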

If neural-network scaling laws hold, then we may get to a new kind of computer (like in the movie Her). We simply ask the computer in natural language, and the computer does it (of course, it may access the web, talk to other AIs, use libraries/APIs, use personal data, write a program as an intermediate step, etc., to accomplish the goal).

A new computer

6. Potential long-term effects on CS

In this new world, software is not intended for human consumption or maintenance. In such a world, do we care as much about modularity of code, proper abstractions, documentation, etc.? What happens to the CS department in a university? Matt says that just as CS emerged from EE and Math into a separate discipline without making them obsolete, CS will continue to be a foundational building block - but a new kind of engineering will emerge to build Software 3.0, one quite different from traditional CS.


I thank Matt Welsh for such an articulate talk. I agree: if this is how we democratize programming for all humans, at the expense of relegating CS, so be it.

Sam Velu

Expert/Leader in Complex System Design - High Speed & Power + Mixed-Signal. Xfn teams

1 yr

I hope there will still be a career and role for future SW engineers, and that universities drastically change their syllabi - but the growing headwind is clear.
