Architecture in times of rapid technology evolution - Part I
Yatin Kulkarni
Enabling Digital Transformation across the enterprise via process and technology disruptions.
Technology advances were once driven by “academically sound and battle-tested” architectural principles. Now the relationship is inverted: architecture continually chases new technologies, often without pausing to revisit those principles.
Architecture, as an academic discipline, is the study of the design and construction of buildings and other structures. It encompasses a wide range of fields, including history, theory, technology, and practice. The discipline seeks to understand the relationship between buildings and their social, cultural, and environmental contexts.
Obtaining a bachelor's degree in Architecture typically involves a blend of theoretical and practical subjects spanning art, mathematics, geography, material science, mechanical concepts pertaining to load distribution and, finally, civil engineering. Students may delve into the history of architecture, exploring significant movements, styles, and influential figures. They may also study architectural theory, examining concepts such as form, function, and aesthetics. Additionally, architecture programs often include design studios, where students gain hands-on experience in creating architectural plans, models, and drawings. The academic study of architecture is essential for developing skilled professionals who can design and build structures that are both functional and aesthetically pleasing. Architects play a crucial role in shaping the built environment, and their work can have a profound impact on communities and society as a whole.
While architecture is most commonly associated with the design and construction of buildings and bridges, its principles and practices extend to a wide range of fields. The principles of design, functionality, and aesthetics that are central to architecture carry over to many other creative endeavours.
Just as traditional architecture shapes physical spaces, software architecture provides the blueprint for digital systems. It defines the high-level structure, components, and interactions of a software application. Good software architecture is essential for building scalable, maintainable, and reliable systems.
As defined in Software Architecture in Practice (2nd edition) by Bass, Clements, and Kazman: “The software architecture of a program or computing system is the structure or structures of the system, which comprise software elements, the externally visible properties of those elements, and the relationships among them. Architecture is concerned with the public side of interfaces; private details of elements—details having to do solely with internal implementation—are not architectural.”
Grady Booch, Philippe Kruchten, Rich Reitman, Kurt Bittner, et al. offered an alternative definition:
"Software architecture encompasses the significant decisions about: the organization of a software system; the selection of the structural elements and their interfaces by which the system is composed; their behavior, as specified in the collaborations among those elements; the composition of these structural and behavioral elements into progressively larger subsystems; and the architectural style that guides this organization.
Software architecture is not only concerned with structure and behavior, but also with usage, functionality, performance, resilience, reuse, comprehensibility, economic and technological constraints and tradeoffs, and aesthetics."
Distilled from these definitions, the key concepts in software architecture are the structural elements of a system, the externally visible interfaces through which they interact, the relationships and collaborations among them, and the quality attributes, constraints, and tradeoffs that shape the significant design decisions.
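To make the distinction between externally visible properties and purely internal details concrete, here is a minimal sketch in Python; the names (OrderRepository and its in-memory variant) are illustrative assumptions, not taken from any of the texts quoted above.

```python
from abc import ABC, abstractmethod
from typing import Optional

# The externally visible interface is an architectural element:
# other components depend only on this contract.
class OrderRepository(ABC):
    @abstractmethod
    def save(self, order_id: str, payload: dict) -> None: ...

    @abstractmethod
    def find(self, order_id: str) -> Optional[dict]: ...

# The details below are *not* architectural: swapping the in-memory
# dictionary for a database changes no other component in the system.
class InMemoryOrderRepository(OrderRepository):
    def __init__(self) -> None:
        self._orders: dict = {}

    def save(self, order_id: str, payload: dict) -> None:
        self._orders[order_id] = payload

    def find(self, order_id: str) -> Optional[dict]:
        return self._orders.get(order_id)
```

In the language of the Bass, Clements, and Kazman definition, only the abstract class above is "architectural"; the dictionary inside the concrete class is private implementation.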
The term “computer” comes from the Latin word “computare”, which means "to calculate," "to count," "to sum up," or "to think together". “Computare” combines the Latin “com-”, meaning "with," and “putare”, meaning "to reckon" or "to think". The word "computer" was first used in the 1640s to refer to a person who performs calculations, and then in 1897 to refer to a calculating machine. In 1936 Alan Turing described a hypothetical computing device that came to be known as the “universal Turing machine”, and his wartime work led to the Bombe, which was instrumental in turning the tide of World War II. After the war, Turing’s design for the Automatic Computing Engine (ACE) was the first complete specification of an electronic stored-program all-purpose digital computer. Through the rest of the twentieth century, the term came to describe a variety of programmable digital electronic computers used for business computing as well as for scientific computing such as digital signal processing and image analysis.
Interestingly, the foundations of all modern computing are applied mathematical constructs that allowed humans to use Newtonian calculus to solve real-world problems to a level of precision that enabled humans to land on the moon, to diagnose medical conditions through digital imaging, and to redefine financial products and services based on economic theories from Nobel Prize winners such as Dr John Nash. These mathematical foundations of Computer Science have helped shape the software architecture concepts described above.
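As a toy illustration of that "calculus turned into computation" heritage, the sketch below integrates a falling body's motion with Euler's method; every number in it is an arbitrary assumption chosen only to show the idea.

```python
# Toy illustration: Euler integration of a body falling under constant gravity.
# All values are arbitrary; this only shows calculus expressed as computation.
g = 9.81          # m/s^2, gravitational acceleration near Earth's surface
dt = 0.01         # s, integration time step
velocity = 0.0    # m/s, initial velocity
height = 100.0    # m, initial height
t = 0.0

while height > 0.0:
    velocity += g * dt        # dv/dt = g
    height -= velocity * dt   # dh/dt = -v
    t += dt

print(f"Impact after roughly {t:.2f} s at {velocity:.1f} m/s")
```

The same stepwise approximation of differential equations, at vastly greater scale and precision, underlies trajectory planning, medical imaging reconstruction, and quantitative finance.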
Business computing obfuscates these mathematical foundations and allows “mathematically challenged” software engineers to use simplified data structures and algorithms to develop code that is primarily meant to digitise and automate the processing of non-scientific information in corporate value streams. As business systems evolved to match the enterprise structure of large corporations - business units with independent profit-and-loss accounts and shared corporate functions such as Finance and Human Resources - enterprise software architecture evolved to include common “enterprise” architectural patterns of its own.
Lower-level architectural patterns such as the “Gang of Four” patterns, which served as the twenty-three commandments for developing robust and extensible application architectures, were replaced with complex patterns “atomised” for distributed and eventually cloud-native computing, culminating in the “Well-Architected” frameworks promoted by all the major hyperscalers, especially Amazon Web Services and Microsoft Azure. Today, choosing the right software architecture depends on factors such as the application's requirements, the development team's skills, and the project's constraints, rather than on a thorough understanding of the underlying mathematical concepts that would allow the problem at hand to be decomposed and solved in the most efficient manner. The only measure of “a well-designed architecture” is the on-time and on-budget delivery of a software project, while a poorly designed architecture leads to technical debt, increased costs, and decreased quality.
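For readers who never met the Gang of Four catalogue, here is a minimal sketch of one of those twenty-three patterns, Observer, which decouples an event source from the parties interested in it; the order-processing names are illustrative assumptions.

```python
# Observer pattern: the subject knows nothing about its subscribers
# beyond the fact that they can be called back with an event.
class OrderPlaced:
    def __init__(self, order_id: str, amount: float) -> None:
        self.order_id = order_id
        self.amount = amount

class OrderEvents:
    """Subject: keeps a list of callbacks and notifies each one per event."""
    def __init__(self) -> None:
        self._observers = []

    def subscribe(self, callback) -> None:
        self._observers.append(callback)

    def publish(self, event: OrderPlaced) -> None:
        for callback in self._observers:
            callback(event)

events = OrderEvents()
events.subscribe(lambda e: print(f"Invoice drafted for {e.order_id} ({e.amount})"))
events.subscribe(lambda e: print(f"Warehouse notified: ship {e.order_id}"))
events.publish(OrderPlaced("A-42", 99.5))
```

The same intent - publish without knowing your subscribers - is what the cloud-native "event-driven" and messaging patterns later atomised across process and network boundaries.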
Similarly, in the realm of Machine Learning and classifier design, the use of mathematical methods such as Principal Component Analysis, eigenvectors, and even good old linear regression to reduce the dimensionality of large data sets, determine the features that yield optimal decision boundaries in high-dimensional space, and thereby discover and differentiate between different data sets, has been replaced with a brute-force “trial and error” approach. A large dataset of labelled and unlabelled data is simply “processed” using a multitude of models, ranging from k-Nearest Neighbour to Bayesian classifiers, so that the data “engineer” can pick the “best performing” model, thereby negating the need for data “scientists” with PhDs from prestigious and semi-prestigious institutes of higher learning. Business executives can simply pick a model and the associated analysis that helps them make the best case for allocating financial and other resources to “pet” projects. Even venture capitalists have chosen to rely on making “thematic” investments instead of “wasting time” on truly understanding the technology into which they are pouring billions of dollars.
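A minimal sketch of the two styles contrasted above, assuming scikit-learn is available and using its bundled digits dataset purely for illustration: a “white box” dimensionality-reduction step with PCA, followed by the brute-force “run several classifiers and keep the best scorer” shortcut.

```python
# Contrast: principled dimensionality reduction vs. brute-force model picking.
# Assumes scikit-learn is installed; dataset and model shelf are illustrative.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)

# "White box" step: project the 64 pixel features onto the principal
# components that explain 95% of the variance before fitting anything.
X_reduced = PCA(n_components=0.95).fit_transform(X)

# "Trial and error" step: run a shelf of classifiers and keep the best scorer.
models = {
    "k-Nearest Neighbour": KNeighborsClassifier(),
    "Gaussian Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=2000),
}
for name, model in models.items():
    score = cross_val_score(model, X_reduced, y, cv=5).mean()
    print(f"{name}: {score:.3f}")
```

The point of the contrast is not that model comparison is wrong, but that without the analytical step one learns nothing about why the winning model wins.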
Today, the new kid on the block is Agent-Based Architecture, wherein each “agent” is a custom-trained Large Language Model that uses natural-language data structures to communicate and coordinate with other agents, on the premise that the resulting inter-agent “chatter” will be easy for us puny humans to understand and agree or disagree with. Yet again, the union of “mathematically and historically challenged, yet entitled” software developers is eager to sweep under the rug the challenges posed by the verbosity of XML-based SOAP-RPC and SOAP Document inter-“service” communication - challenges that necessitated the return to compact binary wire formats such as gRPC (which encodes messages as Protocol Buffers rather than as verbose text) to deliver the cost-to-benefit ratio that justified the reduction in the human workforce that once facilitated information exchange in the mega-corporations of the early twentieth century.
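To make the verbosity argument tangible, here is a small standard-library sketch comparing the size of one and the same record as a SOAP-style XML envelope, as JSON, and as a hand-packed binary struct. The field names and layout are assumptions for illustration only, and real gRPC uses Protocol Buffers rather than this hand-rolled packing.

```python
# Size of one illustrative trade record in three encodings.
import json
import struct

record = {"account_id": 1042, "symbol": "ACME", "quantity": 500, "price": 19.75}

xml_payload = (
    "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/'>"
    "<soap:Body><PlaceOrder>"
    f"<AccountId>{record['account_id']}</AccountId>"
    f"<Symbol>{record['symbol']}</Symbol>"
    f"<Quantity>{record['quantity']}</Quantity>"
    f"<Price>{record['price']}</Price>"
    "</PlaceOrder></soap:Body></soap:Envelope>"
)
json_payload = json.dumps(record)
# Fixed binary layout: unsigned int, 4-byte symbol, unsigned int, double.
binary_payload = struct.pack(
    "!I4sId",
    record["account_id"],
    record["symbol"].encode(),
    record["quantity"],
    record["price"],
)

print("XML bytes:   ", len(xml_payload.encode()))
print("JSON bytes:  ", len(json_payload.encode()))
print("Binary bytes:", len(binary_payload))
```

Multiply the difference by billions of messages a day and the motivation for leaving text-based envelopes behind becomes obvious - which is precisely the lesson the new wave of natural-language inter-agent chatter risks unlearning.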
Large Language Models (LLMs) are, at bottom, mathematical constructs that extend the Deterministic Finite Automaton - a finite state machine that accepts or rejects a given string of symbols by running through a state sequence uniquely determined by the string, with formal roots tracing back to the artificial-neuron model of Warren McCulloch and Walter Pitts in 1943 - combined with Markov Chains and Hidden Markov Models that build on the work of the Russian mathematician Andrey Markov in 1906. Even the specialised form of LLM, the Generative Pre-Trained Transformer that is touted as the computing revolution of the twenty-first century, builds on ideas whose precursors were proposed as early as 1992.
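As a reminder of how modest the underlying machinery can be, the sketch below builds a first-order Markov chain over words and samples from it. It is a deliberate caricature of a language model - a handful of counted transitions, not a claim about how any particular LLM is implemented - and the toy corpus is invented for the example.

```python
# First-order word-level Markov chain: count transitions, then sample a walk.
import random
from collections import defaultdict

corpus = (
    "architecture shapes systems and systems shape architecture "
    "and architecture rewards discipline"
).split()

# Record, for every word, the words that have followed it in the corpus.
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

random.seed(7)
word = "architecture"
generated = [word]
for _ in range(8):
    choices = transitions.get(word)
    if not choices:          # dead end: the word never had a successor
        break
    word = random.choice(choices)
    generated.append(word)

print(" ".join(generated))
```

Everything an LLM adds - enormous context windows, learned attention weights, billions of parameters - sits on top of this same idea of predicting the next symbol from what came before.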
It is only because the manufacturers of computing hardware were so desperate to overcome the demise of Moore’s Law that obscene amounts of computing resources and energy are being applied to train these LLMs so that “ordinary” human beings can entertain themselves in more creative ways and large corporations can once again find reasons to downsize the human workforce so as to squeeze out more profits and benefit the large institutional investors disproportionately as compared to the average retail investor. The true benefits of using these LLMs in solving simple business problems that were more than adequately being solved by simple deterministic machines might just remain a dream.
To be fair, the scientific computing required in drug research, medical diagnosis, and the simulation of complex systems such as weather and nuclear fusion does benefit from the massive increase in parallel floating-point operations made possible by modern GPUs and other custom-built hardware, but that is hardly news. The use of Quantum Computing to tackle the class of NP-Hard problems is yet another frontier in computing that we can aspire to cross in the near future. But the indiscriminate use of LLMs in futile attempts to make business computing more “intuitive” and to allow for “serendipitous” discoveries in the vast unrefined reservoirs of unstructured enterprise data should be reined in. Well-established mathematical models for “scientific” analysis of unstructured data have existed for almost fifty years, and the increase in computational power and fast storage will allow this rather “boring” and “white box” mathematics to be applied at global scales.
In conclusion, Enterprise Software Architecture - spanning people, processes, and technology - in these times of rapid technology “pseudo-innovation” needs to be tempered by some good old academic discipline and rigour, which have proved valuable ever since the advent of number systems and arithmetic a few millennia ago.