Who Ate My Cheese?
Arshad Tanveer
Passionate and Hands-on Technologist | Ex Wells Fargo, Morgan Stanley, D. E. Shaw Group, Citi
I generally pride myself on my ability to answer questions of all sorts (except those posed by my wife, of course). As a practicing technologist with professional training in executive coaching, and as someone who helps senior tech executives with interviews and career choices (through www.theiplusplus.com), I understand the importance and variety of questions out there. But when someone recently asked me, "How has the world of software development changed around you in the 20+ years that you have been in the industry?", I don't think I gave a good reply in the short time I had. It took me several minutes to think about it, several more to articulate it well in my mind, and several days to move my lazy bum and put it in words. But, hey, here we are.
"Who Moved My Cheese?" is a wonderful book by Spencer Johnson, which of course you know about, might have read, or have at least shown off on your desk (I have done all three). When I think about how the tech world has changed over my career, I think the proverbial cheese has not just been moved; it has disappeared altogether. What we now have on the table doesn't look like that cheese at all. It is something else entirely, and that is how drastic some of the changes have been.
Alright. Let's come back to the question:
"How has the world of software development changed around you in the 20+ years that you have been in the industry?"
Several things have changed, and I'd like to categorize them under three heads:
- How software architecture and design have changed
- How software professionals, and the ways in which we practice the craft, have changed
- How customers and their take on digital technology has changed
The advent of product management practices in mainstream software development gave rise to processes like Agile and Scrum. Companies like GE heavily promoted systems like Six Sigma, which had a huge influence on how teams practiced the craft. Even seemingly simple changes, like moving from CVS to modern, distributed version control systems like Git, have had a lot of influence on developers. Dependency management, deployment management, and continuous integration and delivery practices significantly changed the way teams worked. Not just development, but operations organizations changed too, moving from multi-level support teams to DevOps and SRE. Talking of teams and organizations, I think people are generally better taken care of today than in the 90s: HR practices as well as regulatory and legal frameworks have evolved. However, for the purpose of this particular article, I would like to focus on the first point: how software architecture and design have changed.
Architecture and Design
I think it was in 1998 that my boss (Srinivas Uppuluri, one of the best mentors I ever had) gave me a book called "Design Patterns" by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. Many concepts I read in there were eye-opening to me, and it formally introduced me to the science of software design. While those core concepts hold to this day, the way they are applied in a world dominated by frameworks, cloud-native applications, mobile apps, and progressive web apps has changed drastically. Let's look at a few changes that have happened in this area (this is NOT an exhaustive list, just a few that come to my mind):
- Architecture and design of applications is a far more formal exercise today than it used to be. In other words, we discuss, debate, and document various operating parameters, the SLAs (and lately the SLIs and SLOs), before working on the architecture. Granted, in practice the design and the parameters per se evolve together, but my point is that they are not an afterthought anymore. And the parameters in consideration have changed too: back then, we made software for PCs with 16 MB or 32 MB of RAM and 512 MB hard drives. Most servers were in the same building. And oh, many monitors were still black and white.
- Generally speaking, in software architecture, scale and availability were secondary considerations. Function was mostly what we cared about back then. People accepted slightly slower applications. Downtime was OK. Enterprise applications that ran batches overnight were the norm. Millions of users, terabytes of data, and nanosecond transaction processing needs were unheard of. Adding lots of data or users often brought us back to the drawing board, once we were done throwing hardware at it. But today, you are expected to architect products that scale out and are nearly always available: when consumer products like Facebook or Amazon go down for a few minutes, it becomes a news headline. Data partitioning, dynamic caching, routing stateless requests across server farms, concurrent processing ... you may employ any of these and more, but your design must meet, even exceed, the design objectives, including aggressive performance, scale, and availability requirements.
- While creating highly modular designs is something we did back then as well, I have seen it evolve to a much better place. Especially with the adoption of concepts like event-driven architecture (think of MQs, for example), interface-centric design (everyone is writing APIs these days), and more standardized inter-process communication (think of web services instead of RPCs), we take several things for granted today that had to be actively considered back then.
- Back then, I remember it was normal to be woken up in the middle of the night by a server process that died because some other service hogged a resource, or because another team released an incompatible library without telling us (or we missed that email last month). I have witnessed how things have changed for the better over the years: from the use of virtual machines to the rapid adoption of containers (Cloud Foundry's Garden, Docker, Kubernetes, etc.) to isolate and insulate applications from one another, while also easing their deployment and maintenance. So, when you adopt modular design techniques, employ scalability techniques, and do so using a container technology, you end up developing significantly powerful applications without sweating as much as we used to.
- Designing distributed applications has evolved extremely fast, thanks to seminal work done at companies like Google, Facebook, and Microsoft. Content Delivery Networks (CDNs) are the norm these days. Some of these developments have helped processes like disaster recovery, where we need to maintain and manage redundancy and quick failover mechanisms.
- The emergence of microservices architecture is another interesting development. Back then, it was normal to encounter highly interdependent applications in an extremely complex enterprise ecosystem. Upstream and downstream dependencies were not restricted to data, but extended to libraries, storage, and sometimes entire applications. I know this is still the case in parts of the industry, but newer thought processes promise to reduce this headache to a large extent.
- Data modeling and design have evolved too. We learnt to optimize the living daylights out of the RDBMS we employed, both in terms of data modeling and SQL design. Outside these databases, we had flat files to work with. Sometime in the early 2000s, I learnt of object databases, and they really looked like the future of data management. Then came document stores and NoSQL. Big Data became a thing. And suddenly, today, it is normal to encounter applications that use highly normalized data models for transactional data (when ACID properties are important), denormalized data for reporting or analysis, and document stores for unstructured data, all at once. And this is great.
- I was taught, from the very beginning, that security must be baked into application design. It is not a protective sleeve that you can shove your application into later on. I am guilty of storing passwords in config files (they were compiled into a binary, so who could read it?) and of building queries that exposed us to SQL injection, among many other mistakes, that took a keen human eye (usually of a scornful senior) to identify. I then learnt about the three As (Authentication, Authorization, and Audit). Today, we don't bat an eyelid when considering Kerberos authentication in a complex enterprise environment, or Auth0 on a mobile product, or using several static analyzers to identify potential security vulnerabilities in our codebase. But still, software security (especially against malicious threats) continues to be a cat-and-mouse chase, a game of one-upmanship between application developers and hackers.
- The rise of frameworks, especially in the way they implemented various design patterns, has been an important development from an architecture and design point of view. The J2EE architecture and related frameworks really brought many of the design concepts to life. BPM products and libraries (like Apache Activiti, Pega, and KissFlow, among several others) have taken the pain out of writing complex workflows. Need to manage complex rules outside your codebase? Use a rules engine like Drools. I still believe that you must architect and design your application without thinking about frameworks or implementation details (like programming language or libraries), but these days they come to the fore much quicker than they used to.
- Talking of design, the rise of user experience as an important science behind application development has been meteoric. I remember some of the applications we built for large banks in the late 90s... a bunch of input fields, labels, and buttons put together on an X Windows screen, which was still better for the users than nothing at all. I once went to a conference, where someone told me over coffee that Microsoft employs psychologists to better design their products, and I was thrilled at the idea. Today, designers conduct detailed user studies, create personas and flow diagrams (and optimize them for user efficiency over several iterations), try to keep screen design simple but effective, use colors effectively, and (most importantly) keep the users deeply involved and invested throughout the process. UX is a lot more critical and difficult these days because of the explosion in the variety of devices, and also because users have come to expect a great experience and a pleasant UI out of the box.
- The cost dynamics of software have changed over the years, and this has had an impact on architectural decisions as well. Nowadays, several IaaS and PaaS products are available to be mixed and matched, along with custom-written code, and designers can't overlook these options. They are generally available on a subscription basis, so the capital cost of building software has gone down significantly, while its operational cost dynamics have changed drastically too.
- All of this is great, and we have moved in the right direction thus far. But has something fallen behind? Yes, I think so. Today, I find that application designers are sometimes less disciplined: they care less about factors like memory and disk utilization, and often do not even bother optimizing complex algorithms. Sometimes, they are also less creative in solving problems, because they don't look beyond existing patterns and frameworks. And oh, I meet an increasing number of "Java" architects, ".NET" architects, or "AWS" architects, and fewer "software" architects who consider Java, .NET, or a certain cloud technology as a tool. Architecture and design are problem-solving endeavors and should not be seen through the limiting lens of specific implementations; those should come later.
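The SQL injection mistake I confessed to above is worth making concrete, because it is exactly the kind of error a keen senior's eye used to catch and static analyzers now flag routinely. Here is a minimal sketch in Python with the built-in sqlite3 module; the `users` table and both lookup functions are hypothetical, invented purely for illustration:

```python
import sqlite3

# Hypothetical in-memory database with a tiny users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_role_unsafe(name):
    # Vulnerable: user input is spliced directly into the SQL text,
    # so the input can rewrite the query itself.
    query = f"SELECT role FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_role_safe(name):
    # Parameterized: the driver treats the input strictly as data,
    # never as SQL syntax.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_role_unsafe(payload))  # the classic payload leaks every row
print(find_role_safe(payload))    # the same payload matches nothing
```

The fix costs nothing at design time, which is the whole point of baking security in rather than bolting it on later.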
Going forward, I think we are in for several transformations. Serverless programming is already gaining a lot of traction. UI frameworks are engaged in an interesting dance of evolution. The adoption of cloud-native application development is fundamentally transforming the way we think about and build software. AI and ML technologies are not only changing consumer digital products, but also improving developer efficiency (check out TabNine, for one such development); I am quite sure we will have software that writes software, making the job of the software developer largely redundant. As users consume digital products through means beyond just a screen (think of smart speakers, smart watches, wristbands, etc.), the art and science of software architecture will continue to evolve rapidly to embrace them. The cheese has been replaced by something else, which will soon be replaced yet again. I guess that's life.
Note: I am sure I have missed many changes, especially because my experience has been in enterprise application and product development. There have been huge strides made in ERP, embedded systems, and IoT, but I don't have exposure to those domains, so I can't comment. From an enterprise application development perspective, what are your thoughts? Are there important points that I missed? Please share them in the comments below.