Legacy Isn’t a Bad Word
The word ‘legacy’ in the tech world is often treated as a bad word, implying old, outdated systems. It conjures up thoughts of customized systems, possibly built on outdated standards or a programming language like COBOL or FORTRAN. Maybe we think of green screens and command-line interfaces. But don’t dismiss legacy tech because it is old. At the time the system was developed, it could have been state-of-the-art and the best thing going.
Not too long ago, for most IT systems implemented in large organizations, including geographic information systems (GIS), customizing with the tools of the day was the practical way to gain the benefits of automation and enjoy a large return on investment. Many of these systems were first-generation digital systems and required converting thousands of paper documents to digital databases. With GIS, this included paper maps, paper documents, spreadsheets, and disparate databases of all types. These were monumental efforts, and in many organizations more paper-to-digital conversion is still needed today.
Many of these initial projects were ‘digitizing’ the organization, not digitally transforming it. True transformation requires business process reengineering, and much of that is taking place now, in this second wave of moving from legacy systems to current technology.
The very first programs were 100 percent custom code. That’s how it was: there were no software libraries, and the small community using computers required a high level of expertise. Fast forward to today, and minimal customization is needed in most systems. In local government, for example, almost all needed capabilities can be purchased as commercial off-the-shelf (COTS) software and configured without custom code. This pattern will continue: as technology matures, fewer capabilities need to be customized.
GIS also began as custom software and grew into commercial software used by hundreds of thousands of specialists. It then matured into a ‘platform’ that included server, desktop, and web applications, application programming interfaces (APIs), and many extensions that add specialized capabilities. This platform enabled enterprise-wide work, connecting countless organizations.
This GIS platform continues to grow. With cloud computing and storage infrastructure, and with detailed global datasets included in the ArcGIS Living Atlas, it is maturing into an infrastructure that can be configured to many specific needs without customization. This geospatial infrastructure is built on global standards, which enables a kind of ‘mix and match’: you use the components, capabilities, and data you need to solve your problems.
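To make ‘mix and match’ concrete, here is a minimal sketch using the ArcGIS API for Python; the search query and item type below are illustrative assumptions, not specific recommendations. A few lines of configuration-style code discover ready-to-use content, with no custom data conversion or bespoke software involved:

```python
from arcgis.gis import GIS

# Connect anonymously to ArcGIS Online; public content needs no credentials.
gis = GIS()

# Search for ready-to-use layers. The query string is a hypothetical example
# of the kind of authoritative data the Living Atlas provides.
results = gis.content.search(
    query="USA counties",
    item_type="Feature Layer",
    outside_org=True,  # include content published outside your own organization
    max_items=5,
)

for item in results:
    print(item.title, "|", item.type)
```

The particular layer is beside the point; the pattern is that capabilities and data are assembled through configuration and standard APIs rather than written as custom code.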
While we give this legacy tech its due credit, we are in new times. Data breaches and ransomware attacks dominate the news, and cyber risks grow daily. Legacy technology generally lacks the security measures needed today. Upgrading from legacy technology is the single most powerful action an organization can take to improve its cyber resilience. Keeping your organization on the current version of the technology you use takes advantage of new built-in security capabilities; this matters most when the old technology is no longer supported or maintained by its maker (and yes, this includes open-source software).
Legacy isn’t a bad word; it’s just tech that was developed at an earlier time. Back then it was state-of-the-art, and it was cool. We should acknowledge and appreciate the initial efforts to bring the paper world into the digital world, but for the many systems that now carry security risks, it is time to renew, update, and digitally transform. Getting on the path of keeping your systems current means there will be no legacy technology in your organization in the future.
MS at University of Alaska Anchorage
1 year ago: Great article, one that gives a nod to many of the workflows and techniques that built the GIS features we see embedded in a multitude of products, including USGS topos and the vectorized maps common on devices today. I would like to note, though, that the new paradigm is built on the premise that internet cloud services are more secure than, say, a file collected on a stand-alone device in a remote location and transferred via SD card to a computer with security systems in place. Some company-managed cloud services are not certified, say with FedRAMP, creating a situation that limits the transaction from field devices to a company’s cloud solution. Are stand-alone, off-net field solutions more secure or less secure than cloud storage workflows?