Exploiting Technical Debt With Modernization and Rediscovery
When you think of paying down technical debt, what comes to mind? Many of us envision patching servers, updating older components in applications, or maybe migrating to a newer version of an API. These are all worthwhile activities. But executing what could be considered a mundane operational procedure, like updating application dependency versions, has the potential to be much more.
This post explores unexploited opportunities for modernization that pay dividends across multiple facets of the development and operational lifecycle. Specifically, I'll share a project I recently completed that turned out to be much more than an exercise in updating components.
Background on the DirectProject specification and reference implementation
First, some relevant history. I'm the unofficial chief architect of a national health IT open source reference implementation for a specification called the DirectProject, and I contribute almost all of its code. The specification itself isn't the most interesting (it's essentially a secure email transport with some additional requirements, built on a protocol that's over 30 years old), but it has its place in today's world of health data sharing.
Along with a few other developers, I started this reference implementation in mid-2010, at the same time the specification was being developed under a do-ocracy (i.e., ruled by those who do the work) approach to standards writing. The underlying platform libraries under consideration were Spring and Google Guice. I was a card-carrying Spring basher at the time, so my vote was for anything but Spring. In the end, owing to the division of work, my colleagues ran with Spring, and I ran in the opposite direction as fast as possible.
Enter Spring: Reluctantly at first
A couple of years later, the project lost its other contributors to shifting priorities at their own companies. That left me with ownership of the entire stack, top to bottom, and forced me to grudgingly embrace a technology I despised. Admittedly, being forced to use Spring gave me a bit of a new perspective, but in 2012 I still wasn't sold. Declarative configuration was appealing, but I found Spring too verbose and hard to follow. And don't get me started on embedding Spring XML across a slew of jars just to complete a single configuration. I did what I had to do to maintain their code, but little more.
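To give a sense of that pain, here is a rough sketch of what the bootstrap code of that era looked like (the XML file names are hypothetical, not the project's actual files): a single logical configuration pieced together from bean definition files buried in separate jars.

```java
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class LegacyBootstrap {
    public static void main(String[] args) {
        // Spring 2.x/3.x-era wiring: one logical configuration scattered
        // across XML files packaged inside separate jars on the classpath.
        // Each file could <import> still more XML, so tracing the full
        // object graph meant cracking open jar after jar.
        ApplicationContext context = new ClassPathXmlApplicationContext(
                "classpath:agentBeans.xml",       // hypothetical: from the agent jar
                "classpath:cryptoBeans.xml",      // hypothetical: from the crypto jar
                "classpath:monitoringBeans.xml"); // hypothetical: from the monitoring jar

        // Beans are looked up by name, with mistakes surfacing only at runtime.
        Object agent = context.getBean("smtpAgent");
    }
}
```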
Fast forward to 2016, and I had an opportunity to move to a cloud-native architecture. I was introduced to Spring Boot and discovered all that had been innovated and improved in the Spring Framework. Remember, my view of Spring was frozen in 2012; this was a whole new world. Sprinkle in Spring Cloud and some other up-and-coming and recently established initiatives, and I was quickly all in. I embraced Spring so hard that in under a year I went from lifelong detractor to evangelist and speaker at numerous conferences and meetups.
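To illustrate how far the programming model had moved (a minimal sketch, not code from the reference implementation): an entire application now boots from a single annotated class, with auto-configuration supplying the wiring that once lived in external XML.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// A minimal Spring Boot entry point. @SpringBootApplication enables
// component scanning and auto-configuration, replacing the external
// XML bean definitions of the earlier Spring era.
@SpringBootApplication
public class ModernBootstrap {
    public static void main(String[] args) {
        SpringApplication.run(ModernBootstrap.class, args);
    }
}
```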
So back to this reference implementation. I had long wanted to update the components of the DirectProject; after all, many of the libraries were approaching a decade old. At a recent developers' conference, I finally found the motivation (and time) to start the task. Where to start? The logical choice was at the bottom of the stack, working my way up: updating POM files (Maven speak) with the latest library versions and making the appropriate code changes as needed. With decade-old libraries, some code is bound to need changing.
By now, I had two years of hard-core, modern Spring development experience and an arsenal of solutions that replaced older techniques. As I began working, it became blindingly obvious, even at the bottom of the stack, that there was an opportunity to reimplement using the updated Spring Framework. I found myself quickly wiring together much simpler solutions and deleting thousands of lines of unnecessary code whose job had been taken over by mature frameworks. The same pattern repeated throughout the stack as a multitude of technologies and implementations were replaced (or, I should say, deleted) in favor of newer libraries that supply the boilerplate for you. In the end, I had not only a completely updated set of libraries and dependencies, but also a contemporary architecture that simultaneously supported the DirectProject's legacy deployment approach and a fully cloud-native (12 factors and all) paradigm.
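As a concrete example of the kind of deletion involved (a hypothetical sketch; the entity and repository below are illustrative, not from the DirectProject codebase): a hand-written DAO layer full of JDBC plumbing collapses into a Spring Data JPA interface, with the framework generating the implementation at runtime.

```java
import java.util.List;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;

// Hypothetical JPA entity standing in for a persisted domain object.
@Entity
class MessageRecord {
    @Id
    @GeneratedValue
    private Long id;
    private String recipientAddress;
    // getters and setters omitted for brevity
}

// Declaring this interface is the entire data-access layer: Spring Data
// generates the CRUD implementation at runtime and derives the query
// below from the method name, replacing hand-written JDBC/DAO classes.
interface MessageRecordRepository extends JpaRepository<MessageRecord, Long> {
    List<MessageRecord> findByRecipientAddressIgnoreCase(String recipientAddress);
}
```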
I said this case held multi-faceted advantages above and beyond just paying down tech debt. If you study cloud-native architectures at all, you will quickly learn there is much more to them than code. Cloud-native architecture encompasses new approaches to software development, continuous integration and continuous deployment (CI/CD), DevOps, cross-functional teams, and foundational changes in corporate culture, to name a few. There are ways to measure the improvements gained in each of these facets, and each has its own interesting dynamics. More on these topics in future posts.
Does this all sound familiar? I'll bet you can identify several areas overdue for a refresh. The short story is that paying down technical debt can lead to numerous unintended (or intended) advantages if you take (and have) the time to approach it as a modernization effort.