Excel in the Enterprise?

The recent post by Marco Wobben about the improper use of Excel by an organisation (https://www.dhirubhai.net/posts/wobben_formula-1-chief-appalled-to-find-team-using-activity-7176510855675543552-0dPH) raised the recurring question of Excel usage within the enterprise. It is also an opportunity to examine some current trends and challenges related to digitalisation.

Addressing Excel limitations with new architectural practices

Excel is widely used in companies due to its familiarity, versatility, and cost-effectiveness. However, it has limitations such as scalability issues, data integrity concerns, and lack of robust collaboration features.

To overcome these limitations and ensure continuous evolution and integration of interconnected applications, companies should adopt architectural practices.

This involves designing systems with modularity in mind, breaking them into smaller, reusable components for easier integration and evolution. Standardized interfaces should be utilized to facilitate seamless communication between different tools and applications. Automated testing practices need to be implemented to ensure reliability and stability across interconnected systems.
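
To make this concrete, here is a minimal Python sketch (all component and function names are hypothetical) of a standardized interface shared by small, replaceable modules, together with the kind of automated test such a contract enables:

    from abc import ABC, abstractmethod

    class ReportSource(ABC):
        """Standardized interface: any data source a reporting module consumes."""

        @abstractmethod
        def fetch_rows(self) -> list[dict]:
            """Return records as a list of plain dictionaries."""

    class CrmExport(ReportSource):
        """One small, replaceable component behind the shared interface."""

        def fetch_rows(self) -> list[dict]:
            # A real implementation would call the CRM's API here.
            return [{"customer": "ACME", "revenue": 1200}]

    def total_revenue(source: ReportSource) -> int:
        """Works with any component that honours the interface."""
        return sum(row["revenue"] for row in source.fetch_rows())

    # A simple automated test that any conforming component must pass.
    assert total_revenue(CrmExport()) == 1200

Because every component honours the same contract, a new data source can be swapped in without touching the reporting logic or its tests.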

Continuous integration and deployment (CI/CD) practices allow for rapid updates and responsiveness to changing business needs. Microservices architecture promotes flexibility, scalability, and resilience. Data governance ensures the integrity, security, and compliance of data across interconnected applications.
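
As an illustration of the CI/CD point: each deployment is typically gated behind automated checks. The following minimal Python smoke test is one sketch of such a gate; the health endpoint URL and its JSON payload are assumptions, not a prescribed API:

    import json
    import urllib.request

    SERVICE_URL = "http://localhost:8080/health"  # hypothetical endpoint

    def smoke_test() -> None:
        """Fail the pipeline if the freshly deployed service is not healthy."""
        with urllib.request.urlopen(SERVICE_URL, timeout=5) as resp:
            assert resp.status == 200, "service did not answer 200 OK"
            body = json.load(resp)
        assert body.get("status") == "ok", f"unexpected health payload: {body}"

    if __name__ == "__main__":
        smoke_test()
        print("deployment check passed")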

In addition, product and enterprise modeling should be employed to ensure consistency with enterprise and product-related processes. By integrating these architectural practices, companies can create a more efficient, resilient, and continuously evolving ecosystem within their organization.

New architectural practices leading back to Excel

Implementing these architectural practices may incur significant costs for maintaining and continuously upgrading systems, especially in a complex and ever-evolving environment. This includes investments in resources such as skilled personnel, advanced tooling, and infrastructure. Additionally, ongoing maintenance and support activities can add to the overall expenses.

In a dynamic business landscape where requirements and technologies evolve rapidly, staying ahead of the curve requires continuous investments in updating and enhancing interconnected applications. Failure to do so may result in outdated systems, increased technical debt, and diminished competitiveness in the market.

Therefore, while adopting architectural practices for continuous evolution is essential for keeping pace with the changing business environment, it's crucial for companies to carefully consider and budget for the associated costs to ensure sustainable growth and innovation.

If not managed properly, the focus on maintaining and upgrading complex interconnected systems can divert resources away from addressing the day-to-day needs of users. This can lead to frustration among operational teams who are under pressure to deliver results promptly. In such scenarios, users may resort to using readily available office tools like Excel, even if they are not the most suitable or efficient option for the task at hand.

This trend is becoming increasingly common, particularly as Information Management (IM) departments find themselves overwhelmed with the demands of maintaining and mastering the complexity of various tools and infrastructure. With limited resources, IM teams often struggle to keep up with the pace of technological advancements and user requirements. Consequently, operational teams may perceive IT as a hindrance rather than an enabler of their goals.

Managing the identified issues

To address this challenge, organizations must prioritize resource allocation to strike a balance between maintaining existing systems and meeting the evolving needs of users. This may involve investing in user-friendly tools, providing adequate training and support, and fostering a culture of collaboration between IM and operational teams. By recognizing IT as a strategic enabler rather than just a cost center, organizations can better leverage technology to drive innovation and achieve their business objectives.

To effectively manage the complexities of interconnected systems and meet evolving user needs, organizations rely on a diverse set of skilled resources and capabilities. These include technical expertise in software development, systems architecture, database management, and network administration. Data management skills are also crucial, encompassing roles such as data scientists, analysts, and engineers.

Project managers play a key role in coordinating and overseeing architectural changes, ensuring alignment with business objectives and timely delivery. Additionally, change management specialists facilitate smooth transitions during system upgrades, maximizing user adoption and mitigating resistance.

Strong communication and collaboration skills are essential for fostering cooperation between Information Management (IM) and operational teams. Professionals must possess a mindset of continuous learning to keep pace with technological advancements. Moreover, enterprise architects provide strategic guidance, aligning IT initiatives with business goals and ensuring the long-term scalability and sustainability of interconnected systems.

Recognizing the scarcity of skilled resources in the IT industry, organizations invest in talent development initiatives like training programs and mentorship. They may also leverage external expertise through outsourcing or consultancy services to bridge skill gaps and drive digital transformation effectively. Ultimately, building a diverse and resilient workforce equipped with the necessary skills and capabilities is essential for navigating the complexities of interconnected systems and achieving strategic objectives.

The role of visual modeling standards: UML, BPMN and ArchiMate

In navigating the complexities of interconnected systems, adopting standardized system modeling languages is crucial. These languages provide a common framework for representing system architecture, processes, and interactions, facilitating communication and ensuring consistency across teams and projects.

Among the standardized visual modeling languages, UML (Unified Modeling Language) stands out as one of the most widely adopted and precise. UML offers a comprehensive set of diagram types for modeling different aspects of a system, including structure (e.g., class diagrams, component diagrams) and behavior (e.g., activity diagrams, sequence diagrams). Its versatility and richness make it suitable for modeling various kinds of systems, from software applications to complex enterprise architectures.

Another notable standardized visual modeling language is BPMN (Business Process Model and Notation). BPMN focuses specifically on modeling business processes and workflows, offering standardized symbols and notation to represent activities, events, and decisions. BPMN diagrams provide a clear and intuitive way to visualize and analyze business processes, making them invaluable for process improvement initiatives and enterprise architecture planning.

Additionally, ArchiMate is gaining traction as a standardized visual modeling language specifically designed for enterprise architecture. ArchiMate provides a holistic framework for modeling different aspects of an organization, including business processes, applications, data, and infrastructure. Its alignment with TOGAF (The Open Group Architecture Framework) makes it a popular choice for organizations seeking to develop and communicate comprehensive enterprise architecture models.

Overall, adopting standardized visual modeling languages such as UML, BPMN, and ArchiMate is essential for accurately capturing and communicating the complexities of interconnected systems. These languages provide a common vocabulary and notation for system modeling, enabling stakeholders to collaborate effectively and make informed decisions throughout the development and evolution of interconnected applications.

Alternatives coming from defence and other sectors

The defense ecosystem has traditionally relied on specialized frameworks and standards to model and manage complex architectures. These frameworks include DoDAF (Department of Defense Architecture Framework), MODAF (Ministry of Defence Architecture Framework), UAF (Unified Architecture Framework), NAF (NATO Architecture Framework), and MOSA (Modular Open Systems Approach).

However, it's essential to note that these frameworks are highly intricate and tailored specifically for the defense domain. Their complexity can make adoption challenging for organizations outside the defense sector. While the principles and concepts they embody may be applicable across domains, adapting these frameworks to suit the unique needs and contexts of other industries requires careful consideration and effort. Nonetheless, the structured approach to architecture modeling and management provided by these frameworks can offer valuable insights for organizations navigating complex systems and ecosystems, albeit with a potentially steep learning curve.

Other industry sectors have also developed their own dedicated frameworks and standards to address the unique challenges and requirements of their domains. This proliferation of frameworks across industries contributes to what some refer to as the "Babelization" of the digital world, where a multitude of standards and frameworks exists, each tailored to a specific sector or niche.

For example, in the healthcare industry, there are frameworks like HL7 (Health Level Seven) and FHIR (Fast Healthcare Interoperability Resources) for standardizing the exchange, integration, sharing, and retrieval of electronic health information. Similarly, in finance, there are frameworks like ISO 20022 for standardizing financial messaging and SEPA (Single Euro Payments Area) for harmonizing payment transactions within Europe.
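
To make the healthcare example more concrete: a FHIR resource is essentially a standardized JSON document. The following minimal Python sketch builds a FHIR R4 Patient resource; the field names follow the FHIR standard, while the patient values are invented for illustration:

    import json

    # A minimal FHIR R4 Patient resource: the field names follow the FHIR
    # standard, the values are invented for illustration.
    patient = {
        "resourceType": "Patient",
        "id": "example-001",
        "name": [{"family": "Doe", "given": ["Jane"]}],
        "gender": "female",
        "birthDate": "1980-04-12",
    }

    # Serialized this way, the record can be exchanged between any two
    # FHIR-compliant systems regardless of their internal data models.
    print(json.dumps(patient, indent=2))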

In manufacturing and engineering, there are frameworks such as ISA-95, from the International Society of Automation, for integrating enterprise and control systems in manufacturing, and ISO 55000 for asset management.

While these sector-specific frameworks are valuable for addressing the unique needs and challenges of their respective industries, the proliferation of standards and frameworks can also lead to fragmentation, interoperability issues, and increased complexity, especially when organizations operate across multiple domains.

Efforts to promote interoperability and convergence between different frameworks, as well as the development of overarching standards and interoperability protocols, can help mitigate some of these challenges. However, navigating the diverse landscape of industry-specific frameworks remains a significant consideration for organizations seeking to develop interconnected systems and ecosystems in today's digital world.

Capabilities for achieving interoperability in such a context

Achieving interoperability between diverse frameworks demands a multifaceted skill set encompassing technical expertise, interoperability standards, data mapping proficiency, middleware knowledge, testing practices, and collaboration abilities. Furthermore, successful implementation often requires specialized competencies in ontology, model-driven approaches, enterprise modeling, and digital organizational architecture.

  1. Technical Expertise: Deep understanding of technical specifications, data models, and integration protocols is essential.
  2. Interoperability Standards: Knowledge of XML, JSON, REST, SOAP, and other standards facilitates seamless data exchange.
  3. Data Mapping and Transformation: Proficiency in reconciling data format discrepancies ensures accurate integration (see the sketch after this list).
  4. Middleware and Integration Tools: Familiarity with integration platforms streamlines connection and data flow orchestration.
  5. Testing and Validation: Rigorous testing ensures reliability, accuracy, and performance of interoperable systems.
  6. Continuous Integration and Deployment (CI/CD): Adopting CI/CD practices enables rapid iteration and evolution of interoperable solutions.
  7. Collaboration and Communication: Effective collaboration aligns stakeholders and resolves issues throughout the process.
  8. Ontology Skills: Defining common vocabularies and semantic coherence facilitates meaningful data exchange.
  9. Model-Driven Approaches: Utilizing UML, ArchiMate, BPMN, and similar languages helps capture and visualize complex system designs.
  10. Enterprise Modeling: Understanding organizational structures and processes identifies integration opportunities.
  11. Digital Organizational Architecture: Designing and optimizing digital ecosystems aligns interoperability initiatives with broader business objectives.
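
As a small illustration of point 3 above, data mapping often comes down to an explicit, testable translation between two schemas. In the following Python sketch, the legacy and target field names are hypothetical:

    # Hypothetical mapping between a legacy export and a target API schema.
    FIELD_MAP = {
        "CUST_NM": "customer_name",
        "ORD_DT": "order_date",
        "AMT": "amount",
    }

    def to_target_schema(legacy_record: dict) -> dict:
        """Rename fields and reconcile formats (here, amounts become floats)."""
        record = {FIELD_MAP[key]: value
                  for key, value in legacy_record.items() if key in FIELD_MAP}
        record["amount"] = float(record["amount"])
        return record

    print(to_target_schema({"CUST_NM": "ACME", "ORD_DT": "2024-03-01", "AMT": "1250.00"}))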

By integrating these skills and practices, organizations can prepare and build continuous operational interoperability between frameworks, ensuring seamless data exchange and integration across systems and domains.

Some new emerging technologies to consider

In addition to the previously mentioned skills and practices, addressing emerging trends such as data governance, data mesh, sovereign data platforms (e.g., Gaia-X), and leveraging emerging AI (Artificial Intelligence) and xAI (Explainable Artificial Intelligence) technologies is crucial for future interoperability efforts. Here's how these trends relate to interoperability and the skills needed to address them:

  1. Data Governance: With the increasing volume and complexity of data exchanged between systems, strong data governance practices are essential for ensuring data quality, integrity, security, and compliance. Skills in data governance frameworks, policies, and technologies enable organizations to establish robust data governance mechanisms that support interoperability initiatives (a minimal example follows this list).
  2. Data Mesh: Data mesh architecture decentralizes data ownership and processing, facilitating data democratization and scalability. Proficiency in data mesh principles and technologies allows organizations to design and implement decentralized data architectures that promote interoperability while ensuring data sovereignty and autonomy.
  3. Sovereign Data Platforms (e.g., Gaia-X): Sovereign data platforms like Gaia-X aim to create trusted and secure data ecosystems that enable data sharing while respecting data sovereignty and privacy regulations. Skills in sovereign data platform technologies and governance models enable organizations to participate in federated data ecosystems, fostering interoperability and collaboration while maintaining data sovereignty and control.
  4. Emerging AI and xAI: As AI and machine learning technologies become increasingly pervasive, mastering emerging AI techniques and ensuring explainability (xAI) is essential for building trust and transparency in AI-driven interoperable systems. Skills in AI development, model interpretation, and xAI techniques enable organizations to deploy AI-powered solutions that enhance interoperability while mitigating risks associated with algorithmic bias, fairness, and transparency.
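
To illustrate the data governance point, much of it amounts to enforcing explicit, automated quality rules before data crosses a system boundary. The rules in this Python sketch are purely illustrative:

    # Illustrative governance rules applied before records leave a system.
    RULES = {
        "customer_id": lambda v: isinstance(v, str) and v.strip() != "",
        "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    }

    def validate(record: dict) -> list[str]:
        """Return the fields of one record that violate a governance rule."""
        return [field for field, rule in RULES.items()
                if not rule(record.get(field))]

    issues = validate({"customer_id": "C-42", "amount": -5})
    print(issues)  # ['amount'] -> the record is quarantined, not propagated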

By integrating these skills and addressing emerging trends, organizations can prepare for future interoperability challenges and capitalize on the opportunities presented by evolving technologies and data ecosystems. This multidimensional approach enables organizations to develop interoperable systems that are not only technically robust but also secure, transparent, and ethically responsible.

Interoperability: a utopia?

Introducing and mastering all of these technologies and practices is indeed a significant challenge, and achieving widespread adoption may seem utopian at first glance. However, while the task is formidable, it is not insurmountable. Here's why:

  1. Incremental Progress: Organizations can make incremental progress by prioritizing initiatives based on their strategic importance and feasibility. By focusing on high-impact areas and taking a phased approach, organizations can gradually introduce and integrate new technologies and practices into their day-to-day operations.
  2. Collaborative Ecosystems: Collaboration within ecosystems, such as industry consortia, standards bodies, and open-source communities, can accelerate innovation and adoption. By pooling resources, sharing best practices, and collaborating on interoperability initiatives, organizations can leverage collective expertise and address common challenges more effectively.
  3. Adaptive Learning Culture: Cultivating an adaptive learning culture within organizations encourages continuous experimentation, learning, and improvement. By fostering a culture of innovation, curiosity, and resilience, organizations can adapt to evolving technologies and market dynamics more effectively, enabling them to master new technologies and practices over time.
  4. Partnerships and Alliances: Partnerships with technology providers, consulting firms, and research institutions can provide access to expertise, resources, and capabilities that organizations may lack internally. By forming strategic alliances and leveraging external expertise, organizations can accelerate their journey toward mastering new technologies and practices.
  5. Regulatory and Market Forces: Regulatory mandates, market pressures, and customer expectations can drive organizations to adopt emerging technologies and practices more rapidly. By aligning with regulatory requirements, market trends, and customer preferences, organizations can position themselves competitively while mitigating risks associated with technological obsolescence.

While introducing and mastering these technologies and practices may seem daunting, organizations can navigate the challenges by adopting a strategic and pragmatic approach. By prioritizing initiatives, fostering collaboration, nurturing a learning culture, leveraging partnerships, and responding to regulatory and market forces, organizations can gradually integrate emerging technologies and practices into their day-to-day operations, ultimately realizing the benefits of interoperability, innovation, and competitiveness in the digital age.

Coming back to Excel

Excel continues to hold a significant place in the landscape of technology and business operations, even amidst the emergence of more advanced tools and practices. Its versatility and familiarity make it a widely used tool for various tasks, including data analysis, reporting, and simple calculations. However, its role in the context of interoperability and emerging technologies is nuanced:

  1. Data Handling and Analysis: Excel remains a popular choice for handling and analyzing data, especially for smaller datasets or ad-hoc analyses. It can be used to perform basic data manipulation, visualization, and statistical analysis, making it a valuable tool for individuals and teams across different domains.
  2. Interim Solution: In some cases, Excel may serve as an interim solution for organizations transitioning to more advanced data management and analysis tools. While not ideal for large-scale or complex analyses, it can bridge the gap until more specialized tools and practices are implemented.
  3. Integration with Other Tools: Excel can be integrated with other tools and platforms to complement existing workflows and systems. For example, data from Excel spreadsheets can be imported into more advanced analytics platforms for further analysis or incorporated into business intelligence dashboards for reporting purposes (a minimal sketch follows this list).
  4. Data Exchange and Compatibility: Despite its limitations, Excel remains a widely compatible format for exchanging data between different systems and stakeholders. Many organizations still rely on Excel files for sharing data internally and externally, making it necessary to consider its compatibility and interoperability within broader data ecosystems.
  5. Data Governance and Control: Excel's flexibility and ease of use can pose challenges from a data governance perspective, as it may lead to issues such as data silos, version control problems, and security risks. Organizations must implement appropriate controls and governance practices to mitigate these risks while still leveraging the benefits of Excel's flexibility.
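
To illustrate point 3 above: pulling a workbook into a programmatic pipeline is often a one-liner with a library such as pandas. In this sketch the file, sheet, and column names are hypothetical, and reading .xlsx files also requires the openpyxl package:

    import pandas as pd

    # Read an existing workbook into a DataFrame (.xlsx files need openpyxl).
    df = pd.read_excel("sales_report.xlsx", sheet_name="Q1")

    # From here the data joins the wider ecosystem: it can be aggregated,
    # validated, pushed to a database, or fed into a BI dashboard.
    summary = df.groupby("region")["revenue"].sum()
    print(summary)

Handled this way, a spreadsheet becomes an entry point into the governed data ecosystem rather than an isolated silo.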

In summary, while Excel continues to play a role in various aspects of business operations, its place in the context of interoperability and emerging technologies is evolving. While it may still serve as a valuable tool for certain tasks and workflows, organizations should also explore and invest in more advanced tools, practices, and governance frameworks to address the challenges and opportunities presented by the digital age.

Conclusion

Should enterprises stop using Excel altogether, considering the damage it can cause when improperly used?

To answer this question, we pointed out that Excel usage stems partly from its genuine qualities, but also from practice issues in architecting the other pieces of the information system that should be put in place.

This led us to explore the current challenges of digitalisation, in particular the continuous maintenance and evolution of the digital enterprise. Meeting them requires specific capabilities, including visual modeling, in order to master the growing complexity of what must be put in place ever faster. Even with these new practices, Excel will continue to play a role in many aspects of business operations; its place in the context of interoperability and emerging technologies will evolve, but it will have to be complemented by more advanced tools, practices, and governance.

Continuous operational interoperability, which digitalisation requires, is the area of expertise I have been exploring in my articles on LinkedIn. Don't hesitate to reach out if you want to learn more about the capabilities mentioned in this article, such as standardized modeling languages and frameworks, and about how to master the underlying complexity.


Comments

Laurent Saint-Marc

Skywise Ontology Product Owner at Airbus

8 months ago

Excel is simple but not efficient at large scale. Model-based management is more complex, but could also bring more benefits. How far are we really from efficient model-based management of data, processes, and the enterprise (including their interoperability)? When I say "how far", that includes the languages, the tools, the recovery of the legacy, ... but above all the change of mindset.

Olivier REY

CIO at NHIndustries

8 months ago

As far as Excel-based applications are concerned, we have to realize that some organizations rely on Excel to develop very complex applications in VBA because it is the sole development platform at their disposal. How many organizations in French industry are not equipped with real development environments and real training (especially in cybersecurity)? Too many. In the aerospace and banking areas, we face similar problems. Too many managers still don't realize/understand/accept that their staff cannot survive without developing programs. If only Excel is available as a development platform, then Excel is used. If we accepted that programming is a required skill in the daily life of many engineers, financial analysts, etc., even if it is not listed in their job profiles and they are not flagged as "IT", we could train people to write secure programs, protect the IP of their company, and protect innovation. Without the acceptance that programming is a core activity of many employees, and without IT putting in place the tools and training for employees to develop properly, Excel will keep on ruling and the dependencies on it will be almost impossible to remove.

Jean Rohmer

President at Institut Fredrik R. Bull

8 months ago

The situation is desperate. I am currently contacted by several bodies asking for a way to manage, simply and intuitively, heterogeneous datasets for personal or shared use - a concern we all deal with daily. I explain to them that we just have 'nos yeux pour pleurer' ('our eyes to cry'). We had nice tools 25 or more years ago. But these tools just disappeared and were never replaced. And the 'just prompt' reflex will indeed worsen the situation. That is what computer science and information systems are all about.

In short, justifying short-term, easily quantifiable costs in return for long-term, hard-to-quantify savings will always be difficult. Technical debt is thus inevitable.

Jean Rohmer

President at Institut Fredrik R. Bull

8 months ago

Excel anywhere: we just miss a true information theory! No real progress since 1981. And indeed, LLMs will further postpone the need for personal interaction with corporate information systems, databases, and data models.
