The Revolution of Application Development: A look back from Pre-2000 to 2023.

The pre-2000 era was an age of Windows desktops.

Technology: Personal computers were becoming faster with the advent of Intel's 8086 architecture.

Applications in this era resided on the desktop computer. Developers wrote the code on a desktop, compiled it there, and shipped the entire binary to the desktop in one piece. Often these were desktop programs like a text editor or a calculator, and they had to be installed on each user's personal computer.

These technologies shaped the way programmers wrote code. Development was carried out on a desktop and for a desktop. Over the years, the code base grew increasingly complex as more functionality was added, with the side effect that it became challenging to maintain.

When people became aware of this issue, certain best practices for managing huge projects were developed. Instead of cramming everything into a single jumble of a codebase, developers were advised to divide things into smaller modules and then build the application from these modules. Each module concentrated on a narrow slice of functionality.

After development was finished, this sizable code base was assembled via the release procedure into a massive binary that was then installed on desktops. Such an application with a huge code base is called monolithic. The new modular development process did not change the final product: revisiting the earlier example, the result was still a single large binary application (again, something like a text editor) installed on the user's computer. Nobody worried about this at the time. The entire process was manual.

Project Execution: There were not many complex projects, and teams were predominantly small.

Unix was also used for development, but mostly in very large organizations and universities. Apart from Linux, there were hardly any open-source projects of note.



At the start of this millennium, in the good old days of Unix, the technology landscape was far different from today.

Technology: Proprietary software dominated both the infrastructure and development sides of technology. Only a handful of well-known open-source projects such as Linux and MySQL existed, and they were unable to compete with the large proprietary enterprises because of limitations like lack of funding and smaller developer communities. As a result, proprietary players established themselves in over 90% of the market. Everything from design through production was proprietary.

I am only writing about the non-mainframe space; below are examples of the main enterprise products.

Design - Rational Rose (Rational)

Development Environment (Unix) - HP-UX (HP), AIX (IBM), Solaris (Sun Microsystems)

  • Languages - C, C++, and Java, all bundled with Unix
  • Scripts - Shell Scripts, Perl, Python

Database - Oracle, Sybase

Middleware - CORBA, Tuxedo, MQSeries, WLS, WLE

Other tools - Configuration Control - ClearCase, Defect Management - ClearQuest

Microsoft was mainly a player in the Windows desktop market, but it was also quietly building Microsoft C, C++, C#, and VB.

Development happened largely in the vi editor, and shell scripts (on the command line) were used for build and deployment. Regular expressions, awk, sed, and Perl were also used in many places.

This era was also marked by a new technology that revolutionized the way applications were built: web applications. HTML was the basis of the emerging web technologies, and JavaScript was used solely for client-side programming alongside HTML. Web applications were very simple and small.

Project Execution: Projects ran in longer cycles and used the waterfall SDLC methodology.

Almost all activities were manual, and the time to market was extremely long by today's standards because the development life cycle took much longer.



Then came a moment around 2005 that marked the first change – Microsoft's entry.

Technology: Around this period, Microsoft technologies began to have a larger market presence, particularly in Microsoft-specific technology projects.

Later, development evolved and became modular; the new mantra was to make things reusable. With the introduction of Ajax (Asynchronous JavaScript and XML), web applications gained a new programming paradigm centered on user experience, which introduced a much higher level of interactivity. People began to move away from installing apps on the user's system towards online applications installed on a remote server (Unix / Linux) somewhere on the internet.
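
As a rough sketch of the Ajax idea, the snippet below fetches data in the background and updates one part of the page without a full reload. The /api/products endpoint and the product fields are hypothetical, and the modern fetch API is used for brevity; the Ajax applications of that era achieved the same effect with XMLHttpRequest.

    // Fetch product data asynchronously and redraw only the product list,
    // without reloading the whole document.
    async function refreshProductList(): Promise<void> {
      const response = await fetch("/api/products");              // background HTTP call
      const products: { name: string; price: number }[] = await response.json();
      const list = document.getElementById("product-list");
      if (list) {
        list.innerHTML = products
          .map((p) => `<li>${p.name} - ${p.price}</li>`)
          .join("");
      }
    }

    refreshProductList();                                          // no page reload needed

The point is that the browser talks to the server in the background and redraws only what changed, which is what made web applications feel interactive rather than page-by-page.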

Instead of installing the application on every computer, users could now simply point to that server and use the web browser on their own machine to see the result of the application running on the remote system. However, because the same old build and release processes and technologies were still in use, the modularized code was still being mashed together into a single binary that was delivered to the customer's server. The monolithic architecture remained.

By 2005, Linux had become the default choice for servers for three main reasons: first, it was robust enough to surpass the top proprietary systems on performance; second, it was free (or inexpensive if you count support from businesses like Red Hat); and third, its large, diligent, and passionate developer community was eager to make it even better. Several open-source software initiatives were launched in those five years as a consequence of the success of Linux, the high cost of proprietary licensing, and the robust, community-driven development of open-source technology. These communities kept spreading, and new ideas and innovations were improving the software's usability through customization, loose coupling, and simplicity.

The future of IT was going to be completely transformed by some of these open-source initiatives. Already, long-standing favorites like Apache and MySQL were eroding the market dominance of proprietary software. In this period, one of the most critical pieces of software was introduced by Linus Torvalds in 2005: Git.

Shortly after, in 2008, Google released its open-source mobile operating system, Android, which further accelerated open-source development. The open-source community transformed from individual open-source applications into an open-source ecosystem. Numerous new languages, including Clojure, Nim, Go, Idris, Rust, Kotlin, Elixir, Julia, and Elm, also arrived.

This was the era of web services, and real efforts toward distributed computing began.

Project execution: With the introduction of new technologies and more complicated initiatives, project execution became more sophisticated. The significance of best practices, certified project managers, and efficient project processes increased. Project teams grew in size, and implementing these huge projects efficiently demanded highly skilled people.



Another pivotal year, 2010, marked the beginning of the era of the cloud.

Technology: Web applications at this point were more feature-rich and complicated. Web services were ruling the IT world and, at the same time, becoming heavy. XML was the prime mechanism of communication between clients and servers, and between different systems.

AWS was ahead in global acceptance, and Microsoft announced Azure. Web applications now gained load balancing and auto-scaling, thanks to the cloud. This increased the availability of both infrastructure and applications and acted as a catalyst for distributed computing. Businesses started rushing to AWS or Azure, and a big wave of cloud migration followed.

Simultaneously, the open-source ecosystem had evolved dramatically, significantly increasing its share in application development and infrastructure. There was a rising need for high availability, which presented the next big opportunity for the open-source community: designing solutions that would decrease application downtime caused by testing, release, and deployment. That was the start of DevOps.

Innovations in distributed architecture were advancing rapidly, and Node.js was one of them. JavaScript, until then a client-side language, now had an open-source, cross-platform runtime environment. Node.js enabled JavaScript to run on both the client and the server, giving developers a new avenue for building applications quickly.
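
To make that concrete, here is a minimal sketch, assuming Node.js 18+ with TypeScript, of the browser's language serving HTTP requests on the server:

    // A tiny HTTP server written in JavaScript/TypeScript, running on the server.
    import { createServer } from "node:http";

    const server = createServer((req, res) => {
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ message: "Hello from server-side JavaScript" }));
    });

    server.listen(3000, () => {
      console.log("Listening on http://localhost:3000");
    });

One team could now use a single language across the whole stack, which is a large part of why Node.js shortened development time.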

Project execution:

Agile Scrum was spreading its wings, allowing faster development and a shorter time to market. Teams were smaller (7-11 individuals), and work was coordinated through user stories rather than a large SRS (Software Requirements Specification). Jira was the most popular requirements management tool.



By 2015, the open-source community's efforts to develop new technologies for minimal application downtime had borne fruit - DevOps.

Technology: DevOps implementations kicked off worldwide. Microservices emerged as a novel method of web development built on a modular architecture.

Microsoft, which had been the top rival of Linux, had a paradigm shift in its strategy and recognized the importance of Linux, launching WSL (Windows Subsystem for Linux). This was proof that Linux's dominance was undisputed and was set to grow further with emerging technologies such as Artificial Intelligence and Machine Learning.

Container technologies such as Docker, and new container orchestration entrants like Kubernetes, increased the ability to deliver frequent value increments to customers swiftly while maintaining high availability.

By then, web applications were tremendously popular and had grown enormously complex. Furthermore, with increased internet penetration and a wider internet-savvy user base, the number of users was growing exponentially. Serving such large user bases required greater scale, bigger infrastructure, and growth in every other area, including databases designed to handle that level of traffic and data throughput. Capabilities like finding anything on the Internet in milliseconds, or figuring out which cabs are available nearby anywhere in the world, would once have been exceptional accomplishments; now they had become user expectations, and meeting them required incredibly complex code that was harder and harder to maintain.

Program design and coding processes were inherently based on modularity, but that was not enough to manage the complexity even during development, let alone deployment. Such programs needed to address complexity on both the runtime (execution) side and the development side.

Since these were monolithic apps, they had their drawbacks. Deployments were large and difficult. Adding even a small new feature to a monolithic program required extensive testing cycles before each release to ensure the change had no impact on other functionality. This consumed considerable effort.

The second problem with this architecture is scalability. Take an e-commerce site as an example. It gets very unpredictable traffic spikes: when there is a sale on some product, or during the holidays, people rush to the site and the traffic shoots up. Once the sale or holiday is over, the traffic comes back down to normal. The cloud had already brought elastic servers, so as traffic spikes the number of application server instances increases, and when traffic returns to normal the instances decrease again.
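
The elastic-scaling rule itself is simple; the toy sketch below (in TypeScript) derives an instance count from the current load. The capacity, floor, and cap values are purely illustrative and not taken from any real cloud API.

    // Toy autoscaling rule: derive a desired instance count from traffic.
    function desiredInstances(requestsPerSecond: number): number {
      const requestsPerInstance = 500; // assumed capacity of one instance
      const minInstances = 2;          // baseline kept for availability
      const maxInstances = 20;         // cap on the budget
      const needed = Math.ceil(requestsPerSecond / requestsPerInstance);
      return Math.min(maxInstances, Math.max(minInstances, needed));
    }

    console.log(desiredInstances(300));  // 2  - normal traffic
    console.log(desiredInstances(6000)); // 12 - holiday spike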

That is excellent, but consider a case where the complete e-commerce site is a single monolith. It has a shopping feature along with many other features. When traffic spikes on the shopping pages and the server scales up, all the other functionality scales up with it even if nobody is using it. The application unnecessarily needs an excess budget to spin up duplicate instances of the whole application, when only a small portion of it actually needs to scale during the spike.

As a result of such issues, some visionary people began to build applications differently and pragmatically, dividing the functionality into tiny, independent, deployable components. This brand-new and superior approach started to gain traction and opened the door to the improvements in architecture and technology that followed.

Python had been available since the 1990s and quickly became a developer favorite because it requires fewer lines of code to achieve the same effect as other languages, although it consumes more CPU. Meanwhile, two older technologies resurfaced and posed a serious challenge to XML's dominance: JSON and YAML.
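
Part of JSON's appeal is that it maps directly onto the objects developers already work with. A quick TypeScript sketch with made-up data:

    // JSON parses straight into a native object - no schema or parser setup needed,
    // which is one reason it displaced XML as the default web payload format.
    const payload = `{ "id": 42, "name": "keyboard", "price": 19.99 }`;

    const product: { id: number; name: string; price: number } = JSON.parse(payload);
    console.log(product.name);            // "keyboard"

    // Serializing back out is just as direct.
    console.log(JSON.stringify(product)); // {"id":42,"name":"keyboard","price":19.99}

YAML offers a similar data model with a more human-readable syntax, which is why it became a favorite for configuration files.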

Not every application's requirements fit the traditional relational database. With the new variety of web applications and their usage spikes, databases also needed to be horizontally scalable. This triggered a surge in the use of a new breed of databases called "NoSQL" databases, which are well suited to new, unstructured, unsorted data.

These technologies would be among the most widely used in the period that followed.

Despite such innovations, it was still not possible to concurrently release, test, and deploy the many applications or "modules" without harming the company's operations.

Project Execution:

Teams swelled. Larger teams required multilayer scrums, which were a little difficult to handle. Better methods were required to manage larger projects or portfolios and deliver the solution incrementally. SAFe (Scaled Agile Framework) was an up-and-coming standard that was becoming widely popular.


By 2018, a new strategy to fix the old problems had grown in popularity - microservices.

Technology: Microservices were a novel technique. Unlike previous best practices, this architecture breaks the application down into smaller mini-applications that can be deployed individually on multiple machines, communicate with each other over the network, and collectively function as the larger application. The deployment process changed dramatically, and as a result many of the earlier issues were resolved.

Let us use the same e-commerce site as an example. The team can develop a shopping catalog application containing only the catalog functionality and deploy it on one server, while another server handles order processing and a third hosts a profile application. When the user wants to browse the shopping catalog, the view application, which is a completely separate application, sends a request to the catalog service for the list of products to display; the API returns the list, and the view application renders HTML for it. In effect, all these small applications interact with each other across the network by calling each other's REST APIs to obtain whatever they need.
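
The sketch below shows the shape of that interaction, assuming Node.js 18+ (for the built-in fetch) with TypeScript and hypothetical port numbers and product data: a catalog service that owns the product list, and a separate view service that calls it over HTTP and renders HTML.

    import { createServer } from "node:http";

    // Catalog service: owns the product data and exposes it as a REST endpoint.
    createServer((req, res) => {
      if (req.url === "/products") {
        res.writeHead(200, { "Content-Type": "application/json" });
        res.end(JSON.stringify([{ id: 1, name: "keyboard" }, { id: 2, name: "mouse" }]));
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(4000);

    // View service: a separate application that fetches the catalog over the
    // network and renders HTML for the browser.
    createServer(async (_req, res) => {
      const reply = await fetch("http://localhost:4000/products");
      const products: { id: number; name: string }[] = await reply.json();
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end(`<ul>${products.map((p) => `<li>${p.name}</li>`).join("")}</ul>`);
    }).listen(5000);

In a real deployment the two services would run on separate machines and find each other through service discovery rather than hard-coded ports; the sketch only illustrates the communication pattern.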

This makes deployment easy: because each piece is a separate application, the developer adds value, tests, and deploys just that application. There is no need to test and deploy the others (unless they depend on the added functionality).

This also addresses the scaling issue: during the holidays, only the shopping catalog application needs to scale up. These mini-applications are known as microservices.

Microservice architecture does have a few drawbacks, such as architectural complexity, service discovery, and so on. We will not go into specifics.

Microservices use the cloud and the previously developed DevOps technologies, integrating them to deliver a single, seamless, automated pipeline for application development, testing, release, and deployment. There are several tools for managing operations.

Project Execution: The waterfall approach is rarely employed. Agile Scrum or SAFe is the preferred method, where work is done in ARTs (Agile Release Trains) and sub-teams. The scope is planned for a PI (Program Increment) by all ARTs and delivered as the solution.


Today, in 2023, IT is vastly different from what it was before the year 2000.

Today, open source dominates the IT sector. Automation is increasing efficiency and improving the quality of every process. For example, code editors like VS Code, with the aid of thousands of plugins, have made it incredibly simple to set up workspaces and projects. Dependencies are inserted automatically. Increasingly, boilerplate code such as import statements and package metadata is generated automatically, and the spacing, alignment, and readability of the code are improved automatically as well. Linting extensions catch code errors during coding itself, allowing developers to focus solely on the core logic. GitHub, a cloud-based service that helps developers store and manage code, is now widely used for version control, branching, merging, and code management. All of these factors have contributed to a significant decrease in development time.

Once the code is committed to GitHub, the release process is triggered automatically: test case execution, environment configuration, and deployment all complete without manual steps.

For the environment, tools like Ansible and Terraform are used to install prerequisites and then deploy the build.

Once the application is deployed on the cloud or on different servers, container orchestration platforms like Kubernetes manage the containers and scale them up and down without human intervention. This has immensely increased availability and responsiveness and reduced response times.

The use of AI and ML to reduce effort is compounding every day.

Blockchain is also being embraced by corporations for fast and accurate processing and results.

The Road Ahead: A tremendous amount of effort is being invested right now in automating actions; there are millions of pull requests on GitHub daily. The next stage will begin in the coming years, when decisions and choices will also be automated with the assistance of AI, further decreasing human intervention.

Even though open source has captured most of the new market in the last 15 years, significant proprietary technology is still in use by huge organizations throughout the world, executing billions of lines of code every day. These programs will gradually be phased out and replaced by sleek, lean, agile, and lightweight applications. That will be another wave of huge change waiting to flood global repositories.

After seeing this transformation, I remembered two Sanskrit Quotes:

"The only constant is change."

"Nothing is permanent."
