Excel in the Enterprise?
Dr Nicolas Figay, HDR
Let's prepare and build the continuous operational interoperability supporting end-to-end digital collaboration
A recent post by Marco Wobben about the improper use of Excel by an organisation (https://www.dhirubhai.net/posts/wobben_formula-1-chief-appalled-to-find-team-using-activity-7176510855675543552-0dPH ) raised the recurring question of Excel usage within the enterprise. It is also an opportunity to describe some current trends and challenges related to digitalisation.
Addressing Excel limitations with new architectural practices
Excel is widely used in companies due to its familiarity, versatility, and cost-effectiveness. However, it has limitations such as scalability issues, data integrity concerns, and lack of robust collaboration features.
To overcome these limitations and ensure continuous evolution and integration of interconnected applications, companies should adopt architectural practices.
This involves designing systems with modularity in mind, breaking them into smaller, reusable components for easier integration and evolution. Standardized interfaces should be utilized to facilitate seamless communication between different tools and applications. Automated testing practices need to be implemented to ensure reliability and stability across interconnected systems.
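As a minimal sketch of what such a standardized interface and its automated test might look like (all names here are hypothetical and purely illustrative, not drawn from any specific toolchain):

```python
# Hypothetical sketch: a standardized interface shared by several applications,
# plus an automated test that guards the contract.
from dataclasses import dataclass
from typing import Protocol, List


@dataclass
class CostLine:
    """One cost entry, the shared data structure all modules agree on."""
    project: str
    amount_eur: float


class CostProvider(Protocol):
    """Standardized interface: any reporting module must expose this method."""
    def monthly_costs(self, month: str) -> List[CostLine]: ...


class ExcelImportProvider:
    """One reusable component; others could read from an ERP or a database."""
    def __init__(self, rows: List[dict]):
        self._rows = rows

    def monthly_costs(self, month: str) -> List[CostLine]:
        return [
            CostLine(r["project"], float(r["amount"]))
            for r in self._rows
            if r["month"] == month
        ]


def test_contract():
    provider: CostProvider = ExcelImportProvider(
        [{"month": "2024-03", "project": "A350", "amount": "1200.50"}]
    )
    assert provider.monthly_costs("2024-03") == [CostLine("A350", 1200.50)]
```

Because every component is written against the same contract, a spreadsheet-based importer can later be replaced by a database-backed one without touching the modules that consume the data.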
Continuous integration and deployment (CI/CD) practices allow for rapid updates and responsiveness to changing business needs. Microservices architecture promotes flexibility, scalability, and resilience. Data governance ensures the integrity, security, and compliance of data across interconnected applications.
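For example, a simple data-governance rule can be expressed as a script that a CI/CD pipeline runs on every change; the check below is a hypothetical sketch and is not tied to any particular platform or product:

```python
# Hypothetical data-governance check that a CI/CD pipeline could run on every
# commit: each published record must carry an owner and a classification level.
import json
import sys

REQUIRED_FIELDS = {"owner", "classification"}
ALLOWED_LEVELS = {"public", "internal", "restricted"}


def check_records(records: list) -> list:
    """Return a list of human-readable violations (an empty list means compliant)."""
    violations = []
    for i, record in enumerate(records):
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            violations.append(f"record {i}: missing fields {sorted(missing)}")
        elif record["classification"] not in ALLOWED_LEVELS:
            violations.append(f"record {i}: unknown classification {record['classification']!r}")
    return violations


if __name__ == "__main__":
    # Usage in a pipeline step: python check_governance.py datasets.json
    with open(sys.argv[1]) as f:
        problems = check_records(json.load(f))
    for p in problems:
        print(p)
    sys.exit(1 if problems else 0)
```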
In addition, product and enterprise modeling should be employed to ensure consistency with enterprise and product-related processes. By integrating these architectural practices, companies can create a more efficient, resilient, and continuously evolving ecosystem within their organization.
New architectural practices leading back to Excel
Implementing these architectural practices may incur significant costs for maintaining and continuously upgrading systems, especially in a complex and ever-evolving environment. This includes investments in resources such as skilled personnel, advanced tooling, and infrastructure. Additionally, ongoing maintenance and support activities can add to the overall expenses.
In a dynamic business landscape where requirements and technologies evolve rapidly, staying ahead of the curve requires continuous investments in updating and enhancing interconnected applications. Failure to do so may result in outdated systems, increased technical debt, and diminished competitiveness in the market.
Therefore, while adopting architectural practices for continuous evolution is essential for keeping pace with the changing business environment, it's crucial for companies to carefully consider and budget for the associated costs to ensure sustainable growth and innovation.
If not managed properly, the focus on maintaining and upgrading complex interconnected systems can divert resources away from addressing the day-to-day needs of users. This can lead to frustration among operational teams who are under pressure to deliver results promptly. In such scenarios, users may resort to using readily available office tools like Excel, even if they are not the most suitable or efficient option for the task at hand.
This trend is becoming increasingly common, particularly as Information Management (IM) departments find themselves overwhelmed with the demands of maintaining and mastering the complexity of various tools and infrastructure. With limited resources, IM teams often struggle to keep up with the pace of technological advancements and user requirements. Consequently, operational teams may perceive IT as a hindrance rather than an enabler of their goals.
Managing the identified issues
To address this challenge, organizations must prioritize resource allocation to strike a balance between maintaining existing systems and meeting the evolving needs of users. This may involve investing in user-friendly tools, providing adequate training and support, and fostering a culture of collaboration between IM and operational teams. By recognizing IT as a strategic enabler rather than just a cost center, organizations can better leverage technology to drive innovation and achieve their business objectives.
To effectively manage the complexities of interconnected systems and meet evolving user needs, organizations rely on a diverse set of skilled resources and capabilities. These include technical expertise in software development, systems architecture, database management, and network administration. Data management skills are also crucial, encompassing roles such as data scientists, analysts, and engineers.
Project managers play a key role in coordinating and overseeing architectural changes, ensuring alignment with business objectives and timely delivery. Additionally, change management specialists facilitate smooth transitions during system upgrades, maximizing user adoption and mitigating resistance.
Strong communication and collaboration skills are essential for fostering cooperation between Information Management (IM) and operational teams. Professionals must possess a mindset of continuous learning to keep pace with technological advancements. Moreover, enterprise architects provide strategic guidance, aligning IT initiatives with business goals and ensuring the long-term scalability and sustainability of interconnected systems.
Recognizing the scarcity of skilled resources in the IT industry, organizations invest in talent development initiatives like training programs and mentorship. They may also leverage external expertise through outsourcing or consultancy services to bridge skill gaps and drive digital transformation effectively. Ultimately, building a diverse and resilient workforce equipped with the necessary skills and capabilities is essential for navigating the complexities of interconnected systems and achieving strategic objectives.
The role of the visual modeling standards UML, BPMN and ArchiMate
In navigating the complexities of interconnected systems, adopting standardized system modeling languages is crucial. These languages provide a common framework for representing system architecture, processes, and interactions, facilitating communication and ensuring consistency across teams and projects.
Among the standardized visual modeling languages, UML (Unified Modeling Language) stands out as one of the most widely adopted and accurate. UML offers a comprehensive set of diagram types for modeling different aspects of a system, including structure (e.g., class diagrams, component diagrams) and behavior (e.g., activity diagrams, sequence diagrams). Its versatility and richness make it suitable for modeling various types of systems, from software applications to complex enterprise architectures.
Another notable standardized visual modeling language is BPMN (Business Process Model and Notation). BPMN focuses specifically on modeling business processes and workflows, offering standardized symbols and notation to represent activities, events, and decisions. BPMN diagrams provide a clear and intuitive way to visualize and analyze business processes, making them invaluable for process improvement initiatives and enterprise architecture planning.
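To make the link with implementation concrete, the same process logic that a BPMN diagram makes visible can be kept equally explicit in code; the sketch below shows a deliberately simplified, hypothetical approval flow and is not generated from any BPMN tool:

```python
# Hypothetical sketch of a small approval process, with states and transitions
# kept explicit, much as a BPMN diagram shows activities, gateways and flows.
from enum import Enum, auto


class State(Enum):
    SUBMITTED = auto()
    UNDER_REVIEW = auto()
    APPROVED = auto()
    REJECTED = auto()


# Allowed transitions, the textual counterpart of the diagram's sequence flows.
TRANSITIONS = {
    State.SUBMITTED: {State.UNDER_REVIEW},
    State.UNDER_REVIEW: {State.APPROVED, State.REJECTED},
    State.APPROVED: set(),
    State.REJECTED: set(),
}


def advance(current: State, target: State) -> State:
    if target not in TRANSITIONS[current]:
        raise ValueError(f"Illegal transition {current.name} -> {target.name}")
    return target


if __name__ == "__main__":
    state = State.SUBMITTED
    state = advance(state, State.UNDER_REVIEW)
    state = advance(state, State.APPROVED)
    print(state.name)  # APPROVED
```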
Additionally, ArchiMate is gaining traction as a standardized visual modeling language specifically designed for enterprise architecture. ArchiMate provides a holistic framework for modeling different aspects of an organization, including business processes, applications, data, and infrastructure. Its alignment with TOGAF (The Open Group Architecture Framework) makes it a popular choice for organizations seeking to develop and communicate comprehensive enterprise architecture models.
Overall, adopting standardized visual modeling languages such as UML, BPMN, and ArchiMate is essential for accurately capturing and communicating the complexities of interconnected systems. These languages provide a common vocabulary and notation for system modeling, enabling stakeholders to collaborate effectively and make informed decisions throughout the development and evolution of interconnected applications.
Alternatives coming from defence and other sectors
The defense ecosystem has traditionally relied on specialized frameworks and standards to model and manage complex architectures. These frameworks include DoDAF (Department of Defense Architecture Framework), MODAF (Ministry of Defence Architecture Framework), UAF (Unified Architecture Framework), NAF (NATO Architecture Framework), and MOSA (Modular Open Systems Approach).
However, it's essential to note that these frameworks are highly intricate and tailored specifically for the defense domain. Their complexity can make adoption challenging for organizations outside the defense sector. While the principles and concepts they embody may be applicable across domains, adapting these frameworks to suit the unique needs and contexts of other industries requires careful consideration and effort. Nonetheless, the structured approach to architecture modeling and management provided by these frameworks can offer valuable insights for organizations navigating complex systems and ecosystems, albeit with a potentially steep learning curve.
Other industry sectors have also developed dedicated frameworks and standards to address the unique challenges and requirements of their domains. This proliferation of frameworks across industries contributes to what some refer to as the "Babelization" of the digital world, where a multitude of standards and frameworks exist, each tailored to a specific sector or niche.
For example, in the healthcare industry, there are frameworks like HL7 (Health Level Seven) and FHIR (Fast Healthcare Interoperability Resources) for standardizing the exchange, integration, sharing, and retrieval of electronic health information. Similarly, in finance, there are frameworks like ISO 20022 for standardizing financial messaging and SEPA (Single Euro Payments Area) for harmonizing payment transactions within Europe.
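To give a concrete flavour of what such standards fix, a FHIR resource is exchanged as a plain JSON document with a declared resourceType; the fragment below is a deliberately minimal, illustrative Patient record rather than a production message:

```python
# Minimal, illustrative FHIR Patient resource (JSON), showing how the standard
# fixes field names and structure so that different systems can exchange it.
import json

patient = {
    "resourceType": "Patient",
    "id": "example-001",            # illustrative identifier
    "name": [{"family": "Durand", "given": ["Marie"]}],
    "gender": "female",
    "birthDate": "1980-04-12",
}

# Serialized form, as it would travel between systems over a FHIR REST API.
print(json.dumps(patient, indent=2))
```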
In manufacturing and engineering, there are frameworks such as ISA-95 (International Society of Automation - 95) for integrating enterprise and control systems in manufacturing, and ISO 55000 for asset management.
While these sector-specific frameworks are valuable for addressing the unique needs and challenges of their respective industries, the proliferation of standards and frameworks can also lead to fragmentation, interoperability issues, and increased complexity, especially when organizations operate across multiple domains.
Efforts to promote interoperability and convergence between different frameworks, as well as the development of overarching standards and interoperability protocols, can help mitigate some of these challenges. However, navigating the diverse landscape of industry-specific frameworks remains a significant consideration for organizations seeking to develop interconnected systems and ecosystems in today's digital world.
Capabilities for achieving interoperability in such a context
Achieving interoperability between diverse frameworks demands a multifaceted skill set encompassing technical expertise, interoperability standards, data mapping proficiency, middleware knowledge, testing practices, and collaboration abilities. Furthermore, successful implementation often requires specialized competencies in ontology, model-driven approaches, enterprise modeling, and digital organizational architecture.
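Much of the data-mapping proficiency mentioned above comes down to translating between schemas; the sketch below, with hypothetical column and field names, maps a spreadsheet-style export onto a canonical record and rejects anything that does not fit:

```python
# Hypothetical data-mapping sketch: rows exported from a spreadsheet are mapped
# onto a canonical record used by the rest of the interconnected applications.
from dataclasses import dataclass


@dataclass
class CanonicalPart:
    part_number: str
    description: str
    mass_kg: float


# Mapping between the spreadsheet's column names and the canonical fields.
COLUMN_MAP = {"P/N": "part_number", "Designation": "description", "Mass (kg)": "mass_kg"}


def map_row(row: dict) -> CanonicalPart:
    data = {COLUMN_MAP[k]: v for k, v in row.items() if k in COLUMN_MAP}
    missing = {"part_number", "description", "mass_kg"} - data.keys()
    if missing:
        raise ValueError(f"Row cannot be mapped, missing {sorted(missing)}")
    return CanonicalPart(data["part_number"], data["description"], float(data["mass_kg"]))


if __name__ == "__main__":
    print(map_row({"P/N": "A123", "Designation": "Bracket", "Mass (kg)": "0.45"}))
```

Testing such mappings on representative samples, rather than discovering mismatches in production, is where the testing practices and collaboration abilities mentioned above come together.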
By integrating these skills and practices, organizations can prepare and build continuous operational interoperability between frameworks, ensuring seamless data exchange and integration across systems and domains.
Some emerging technologies to consider
In addition to the previously mentioned skills and practices, addressing emerging trends such as data governance, data mesh, sovereign data platforms (e.g., Gaia-X), and emerging AI (Artificial Intelligence) and xAI (Explainable Artificial Intelligence) technologies is crucial for future interoperability efforts. Each of these trends brings its own interoperability requirements and its own skill demands, on top of the capabilities already listed.
By integrating these skills and addressing emerging trends, organizations can prepare for future interoperability challenges and capitalize on the opportunities presented by evolving technologies and data ecosystems. This multidimensional approach enables organizations to develop interoperable systems that are not only technically robust but also secure, transparent, and ethically responsible.
Interoperability: a utopia?
Introducing and mastering all of these technologies and practices is indeed a significant challenge, and widespread adoption may seem utopian at first glance. The task is formidable, but it is not insurmountable.
While introducing and mastering these technologies and practices may seem daunting, organizations can navigate the challenges by adopting a strategic and pragmatic approach. By prioritizing initiatives, fostering collaboration, nurturing a learning culture, leveraging partnerships, and responding to regulatory and market forces, organizations can gradually integrate emerging technologies and practices into their day-to-day operations, ultimately realizing the benefits of interoperability, innovation, and competitiveness in the digital age.
Coming back to Excel
Excel continues to hold a significant place in the landscape of technology and business operations, even amidst the emergence of more advanced tools and practices. Its versatility and familiarity make it a widely used tool for various tasks, including data analysis, reporting, and simple calculations. However, its role in the context of interoperability and emerging technologies is nuanced.
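In practice, this often means treating the spreadsheet as one data source rather than as the system of record. The sketch below assumes the pandas library and uses an illustrative file name and column names; it pulls a workbook into a governed, machine-readable form that other applications can consume:

```python
# Hedged sketch: read an Excel workbook with pandas, apply a basic sanity check,
# and export the result to a neutral format that other applications can consume.
# "costs.xlsx" and the column names are illustrative, not taken from the article.
import pandas as pd

df = pd.read_excel("costs.xlsx", sheet_name="2024")

# Minimal governance: reject duplicated keys and missing amounts instead of
# letting them silently propagate into downstream reports.
if df["project_id"].duplicated().any():
    raise ValueError("Duplicate project_id rows in the workbook")
if df["amount_eur"].isna().any():
    raise ValueError("Missing amounts in the workbook")

# Publish a machine-readable copy for the interconnected applications.
df.to_json("costs_2024.json", orient="records", indent=2)
```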
In summary, Excel continues to play a role in many aspects of business operations, but its place in the context of interoperability and emerging technologies is evolving. It may still serve as a valuable tool for certain tasks and workflows, yet organizations should also explore and invest in more advanced tools, practices, and governance frameworks to address the challenges and opportunities of the digital age.
Conclusion
Should enterprises stop using Excel altogether, considering the damage it can cause when not properly used?
To answer this question, we pointed out that Excel usage stems from its genuine qualities, but also from shortcomings in the way the other pieces of the information system are architected and put in place.
This led us to explore the current challenges of digitalisation, in particular the continuous maintenance and evolution of the digital enterprise. Meeting them requires specific capabilities, including visual modeling, in order to master the growing complexity of what must be put in place ever more quickly. Even with these new practices, Excel will continue to play a role in business operations; its place in the context of interoperability and emerging technologies will evolve, but it will have to be complemented with more advanced tools, practices, and governance.
Continuous operational interoperability, as required by digitalisation, is the topic of expertise I have been exploring in my articles on LinkedIn. Don't hesitate to reach out if you want to learn more about the capabilities mentioned in this article, such as standardized modeling languages and frameworks, and about how to master the underlying complexity.
Skywise Ontology Product Owner at Airbus
8 months ago
Excel is simple but not efficient at large scale. Model-based management is more complex, but it could also bring more benefits. How far are we really from efficient model-based management of data, processes, and the enterprise (including their interoperability)? When I say "how far", that includes the languages, the tools, the recovery of the legacy, ... but above all the change of mindset.
CIO at NHIndustries
8 months ago
Where Excel-based applications are concerned, we have to realize that some organizations rely on Excel to develop very complex applications in VBA because it is the only development platform at their disposal. How many organizations in French industry are not equipped with real development environments and real training (especially in cyber security)? Too many. In the aerospace and banking sectors, we face similar problems. Too many managers still don't realize/understand/accept that their staff cannot survive without developing programs. If only Excel is available as a development platform, then Excel is used. If we accepted that programming is a required skill in the daily life of many engineers, financial analysts, etc., even if it is not listed in their job profiles and they are not flagged as "IT", we could train people to code secure programs, protect the IP of their company, and protect innovation. Without acceptance that programming is a core activity of many employees, and without IT putting in place the tools and training for employees to develop properly, Excel will keep on ruling and the dependencies on it will be almost impossible to remove.
President at Institut Fredrik R. Bull
8 months ago
The situation is desperate. I am currently contacted by several bodies asking for a way to manage heterogeneous datasets simply and intuitively, for personal or shared use, a concern we all deal with daily. I explain to them that we have only 'nos yeux pour pleurer' ('our eyes to cry with'). We had nice tools 25 or more years ago, but they simply disappeared and were never replaced. And the 'just prompt' reflex will indeed worsen the situation. That is what computer science and information systems are all about.
In short, the justification for incurring short-term easily-quantifiable costs, in return for long-term hard-to-quantify savings, will always be difficult. Technical debt is thus inevitable.
President at Institut Fredrik R. Bull
8 months ago
Excel everywhere: we are simply missing a true information theory! No real progress since 1981. And indeed, LLMs will further postpone the need for personal interaction with corporate information systems, databases, and data models.