ChatGPT won't replace engineers, it'll replace managers
To all the managers I've had: know that I care about you in this turbulent time.
Chris Smith shared an article today from The Guardian on how ChatGPT will make software engineering accessible to the masses. It's not right, but it's not wrong either. I had some commentary, and at the end I suggested that I could write an essay, or just have ChatGPT write it for me.
So I did.
The question and response are below. For what it's worth, I agree with almost everything stated. There's some more nuance to it, but as a high-level, ~1,200-word summary it's pretty on the mark, especially the areas for improvement.
The question
I'm a site reliability engineer with almost 25 years of experience, and over the last 5 years my contemporaries and I have observed a shift in programming culture and ability.
The people who built the current systems and infrastructures at companies since 2000 are older now. They're either retiring, changing careers or being promoted away from developing and maintaining those systems.
They're being replaced by younger, less experienced people. Anecdotally I've found a *lot* of these people are excited to build large complex systems because that work is interesting or exciting to them... and no one is left to tell them to build the simple, boring, stable things instead.
So many companies I've seen or spoken to have witnessed an increase in brittleness in their services recently. It's gone from complex to needlessly complicated.
Smaller companies suffered from this earlier. Larger companies have definitely caught up with the trend too.
At the same time people are suggesting that ChatGPT will eventually replace programmers. I don't believe this assertion is accurate, but I do believe tools like ChatGPT will eventually be able to reliably spit out simple code that works consistently. I'm very grateful for that. But it won't be able to help as much with the other side of the equation: the one about putting things together in simple, boring, stable ways because the people who would ensure that aren't in the right places to be effective any more.
Write an essay to convey this. Draw some conclusions on where we should be going as an industry, where we're actually going, and what course corrections we should make.
Use any examples you're aware of where complex software systems failures in the last 20 years have had a material impact on people.
Also expand, with hypothetical examples, on what real things could happen if we don't reverse this trend.
Give examples of code that AI can write faster and better than humans, but which would still need human involvement to fully implement.
Talk about what could or should happen to software engineering as an industry in two ways: if things go well, and if they don't.
Give examples of specific things which can be done or should be taught, and how.
The response
Title: The Paradox of Progress: The Erosion of Simplicity in the Age of Complexity
Introduction
Over the last few years, the IT industry has undergone significant changes, and one of the most notable is the shift in programming culture and ability. As experienced site reliability engineers and developers retire or move into different roles, they are replaced by a younger and more enthusiastic generation. While this influx of fresh talent and energy is essential for the industry, it has also had an unintended consequence: an increase in the brittleness of services, as simplicity is sacrificed for complexity. This essay explores the implications of this trend by examining real-world examples, discussing the potential consequences of continued complexity, and envisioning the future of software engineering based on two different trajectories.
The Allure of Complexity
Today's young programmers are drawn to the challenge of building large, complex systems. The thrill of conquering the unknown and solving intricate problems is undeniably exciting. Unfortunately, in their pursuit of complexity, these engineers often overlook the importance of simple, stable, and reliable solutions. This has led to a surge in needlessly complicated systems, which can compromise the performance and stability of the services they underpin.
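As a purely hypothetical illustration (the discount tiers, rates, and class names below are invented), this is the shape of "exciting" over-engineering the essay describes, next to the boring version that does the same job:

```python
# Hypothetical example: two ways to look up a discount rate by customer tier.

# The "exciting" version: an abstract strategy, one class per rate,
# and a registry to wire them together.
class DiscountStrategy:
    def rate(self) -> float:
        raise NotImplementedError

class GoldDiscount(DiscountStrategy):
    def rate(self) -> float:
        return 0.20

class SilverDiscount(DiscountStrategy):
    def rate(self) -> float:
        return 0.10

STRATEGIES = {"gold": GoldDiscount, "silver": SilverDiscount}

def complex_rate(tier: str) -> float:
    return STRATEGIES[tier]().rate()

# The boring version: a dict lookup. Same behavior, a fraction of the
# surface area to test, document, and keep stable.
RATES = {"gold": 0.20, "silver": 0.10}

def simple_rate(tier: str) -> float:
    return RATES[tier]
```

Both functions return identical results; the difference is that the first one gives the next engineer five things to understand before they can change a number.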
One example of a complex system failure that had significant consequences occurred in 2012 when Knight Capital Group, a financial services firm, lost over $440 million in a matter of minutes due to a software glitch in their trading algorithms. The complexity of their trading systems led to a catastrophic malfunction that nearly destroyed the company.
Another example is the 2013 Healthcare.gov launch debacle. The website, designed to be a one-stop shop for Americans seeking health insurance, was plagued by technical issues, resulting in widespread frustration and negative publicity. The issues stemmed from a complex and poorly integrated system that failed to meet the demands of millions of users.
The Impact on Companies
Initially, this trend of increased complexity impacted smaller companies with limited resources, as they struggled to maintain the systems they had built. However, larger organizations have not been immune either. As experienced engineers retire or move into management roles, these companies are facing the same challenges, as the institutional knowledge needed to maintain simple, stable systems is lost.
For example, large companies such as Boeing have also faced issues stemming from overly complex systems. The tragic crashes of the Boeing 737 Max in 2018 and 2019 were partially attributed to the complexity of the airplane's software, which led to a series of miscommunications between the automated systems and the pilots.
If this trend continues, we could witness even more catastrophic failures in critical infrastructure and services. Imagine a future where the power grid, transportation systems, or financial institutions suffer from massive outages or collapses due to the inherent fragility of their needlessly complex systems. Such events would have far-reaching consequences on people's lives and the global economy.
The Role of AI and Automation
While AI tools like ChatGPT show promise in being able to generate simple, reliable code, they are not a panacea. The real challenge lies not in producing the code itself, but in ensuring that systems are designed and architected in a simple, efficient, and stable manner. With the industry's current trajectory, AI tools may only serve to exacerbate the issue by enabling the creation of even more complex systems.
For instance, AI can quickly generate code for a basic CRUD (Create, Read, Update, Delete) application, or assist in refactoring and optimization. However, these technologies still require human intervention to ensure proper integration with existing systems, handle edge cases, and consider potential security risks.
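To make that concrete, here is a hypothetical sketch (the in-memory store, field names, and review notes are invented for illustration) of the kind of CRUD code an AI can produce in seconds, annotated with the places a human still has to step in:

```python
# Hypothetical CRUD sketch: an in-memory "users" store with
# create/read/update/delete. The HUMAN REVIEW comments mark where
# generated code like this still needs human judgment.
import uuid

users: dict[str, dict] = {}  # stand-in for a real database table

def create_user(name: str, email: str) -> str:
    # HUMAN REVIEW: validate the email format, enforce uniqueness,
    # and sanitize inputs before anything like this reaches production.
    user_id = str(uuid.uuid4())
    users[user_id] = {"name": name, "email": email}
    return user_id

def read_user(user_id: str):
    # HUMAN REVIEW: access control -- who may read this record?
    return users.get(user_id)

def update_user(user_id: str, **fields) -> bool:
    if user_id not in users:
        return False
    # HUMAN REVIEW: whitelist updatable fields; **fields as written
    # lets a caller overwrite anything, including the email.
    users[user_id].update(fields)
    return True

def delete_user(user_id: str) -> bool:
    # HUMAN REVIEW: soft-delete vs. hard-delete, audit logging,
    # and cascading cleanup in dependent systems.
    return users.pop(user_id, None) is not None
```

The mechanical parts are trivial for a model to emit; the review comments are exactly the integration, edge-case, and security work the paragraph above says remains human.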
The Path Forward
To navigate this paradox of progress, the IT industry must acknowledge the importance of simplicity in the face of complexity. Companies need to invest in mentorship and training programs that emphasize the value of simple, stable solutions, and take deliberate steps to foster a culture of simplicity that counters needlessly complex systems.
By adopting these practices and prioritizing simplicity in software engineering, the IT industry can better navigate the challenges posed by increasing complexity and continue to deliver stable, reliable services that meet the needs of users and stakeholders. The future success of the industry will depend on our ability to strike the right balance between innovation and simplicity, harnessing the power of technology without compromising the fundamental principles that underpin robust, dependable systems.
Security Engineer @ Google | Vulnerability Management + Adjunct Professor @ Sapienza Università di Roma | Cybersecurity, Ethical hacking
Disclaimer: I am a manager. I have mixed feelings about this article. I love where it's going in terms of complexity and the need to reverse the nonsensical trend we have seen (especially with SWEs [sorry SWE folks, I love you, but PE and SRE are generally more cautious about this]). But I don't understand the connection with managers. LLMs are intrinsically and heavily biased by their prompts. If you write a prompt saying managers will rule the world, it will happily say the exact opposite of what you are trying to convey. And even assuming everything written there is 100% accurate, you still need managers to implement these strategies. Yes, it's tactical work, but the strategy has been put there by a human (with a prompt, and with whatever this LLM has been trained on), not by ChatGPT. TL;DR: Avleen, you are more of a manager than you think :)