Data Architecture Challenges in Banking and Financial Services Environments
Watch out for these; this is where things can go wrong…
The more complex a business is, the more sophisticated its data architecture becomes, and we know that the banking and financial services sector is one of the most tangled industries from a data management perspective. It is a multi-disciplinary, heavily regulated and technology-intensive environment that relies strongly on data and information management. Data management helps financial sector businesses identify risks and opportunities, understand and serve their customers better, define products that are tailored to their customers' needs, and make informed decisions about their operations.
A financial organization's target enterprise data architecture shapes its data management strategies, identifies the functional and non-functional requirements, sets the design guidelines for the underlying data management components, and establishes the priorities and roadmap for changing the data management environment. It does so with clarity and a vision of the organization's data assets, so that the performance of the business, customers, processes and services can be measured, critical business decisions can be made in a timely manner, and various value drivers can be enabled. So it goes without saying that the banking business requires competent enterprise data management capabilities, and these cannot be delivered without a solid underlying enterprise data architecture.
On the other hand, building efficient data management capabilities within the banking/financial services business environment comes with a set of challenges. Most of these challenges are present in other business domains, but they can be more pronounced in the financial industry because of the characteristics mentioned above. This article aims to articulate a few of these challenges based on my personal experience delivering such capabilities for the financial sector over several years.
By all means, this is not expected to be a full or comprehensive list of what can go wrong when trying to design and build an enterprise data management architecture; the intention is to provide discussion starters and general awareness of some of the most impactful challenges and risks:
(1) The sponsor challenge: Not having a business sponsor, or having a sponsor who does not possess enough influence, or who lacks a complete vision of what is expected from the target architecture and from each phase of the transformation roadmap. Some organizations initiate the target data architecture definition as an information technology initiative, or based on a recommendation from external consultancy providers, without on-boarding enough business stakeholders. In such cases there is no guarantee that building the target data management architecture and its capabilities will deliver the corresponding business value; the utilization and sustainability of the target architecture may also be doubtful.
(2) Engaging enterprise data architecture design as part of a limited-scope implementation initiative: Enterprise data architecture should transcend partial engagements and tactical initiatives. It should provide the strategy, governing principles and design for the various data management capabilities, not be limited to the scope or vision of a single use case or project; even a large-scale data management project such as building a data lake or an enterprise data warehouse should not be the basis upon which an enterprise data architecture is defined. Ideally, architecture design should be a separate stage carried out by the enterprise architecture/data architecture community with the involvement of cross-business-domain stakeholders. The enterprise data management architecture, and the change roadmap emerging from it, should be the basis for planning future projects and initiatives, not the other way around.
(3) The deficient perspective challenge: Building the target architecture around "What capabilities and technologies to implement?" without emphasizing "Why do we need it?" and "How can it help our business?". This style is very common; many organizations engage in large data management transformation projects only because they are "trendy" or "commonly implemented in our industry", without sensibly planning the value proposition of each change. Every environment has influential enthusiasts who believe specific approaches, components, capabilities or technologies are "must-haves" or "the next cool thing", and it is one of the hardest tasks of the data architecture and governance communities to rationalize such excitement, making sure that every building block of the target data architecture is required, fit for purpose and not creating redundancy.
(4) The short-sighted planning challenge: Designing the target data management architecture exclusively around an initial set of business requirements or use cases. This approach might be useful for building a data management component within a single business discipline, such as a departmental data mart, but it is very awkward to follow at the enterprise level of data management architecture. Target data management architecture must be approached with a holistic vision of the requirements and the promised value proposition, combined with knowledge of industry-specific strategies and technology offerings; combining these knowledge and know-how areas safeguards the architecture's stability and supports its future-proofing.
(5) Hastily decided changes: Rushing toward the demise, replacement or change of perfectly usable systems and capabilities. A good data management architecture should always capitalize on existing capabilities; a change should only be proposed in the target architecture if a certain function cannot be efficiently provided using the existing systems or capabilities. Examples of improper reasons for proposing change are "because we want it on the cloud", "because it is too old" or "because it is not the modern way of doing it" - if it works, don't fix it!
(6) Overcomplicating the data architecture: Most data architecture designs begin fairly neat, with a straightforward implementation and a small set of functions that can be easily supported. But during the build phase, new stakeholders are on-boarded to implement the design, and they usually come from a diversity of backgrounds and have their own opinions; without firm architecture governance, each addition can push the design toward complexity that no longer serves a clear purpose.
(7) Disconnected data management islands: In a large banking environment it is common to see the data management practice divided vertically (e.g., by business lines: commercial banking, retail, SME, …) or horizontally by technology or data management domain (e.g., the on-prem data lake, cloud data management, data warehousing and BI, etc.). It is not uncommon to see various levels of isolation between these islands of data management, each designing its own target architecture and its own solution portfolios. This isolation can induce redundancy, incompatibility, or poor alignment of architecture targets and data availability timeframes, and it gets even more problematic if the architecture governance authority is also scattered across multiple business disciplines.
(8) The time-to-market challenge: Designing and building a fully-fledged data management architecture is typically a large-scale engagement, especially in the financial sector. A typical data management platform covers data acquisition, preparation and transformation, storage, curation, processing, and finally exploration and utilization. This means value realization comes at the last stage, after several prolonged activities, lengthening time to value and probably conflicting with business stakeholder expectations. The project team might find it difficult to secure funds, or face unsatisfied stakeholders, unless measures such as phased, incremental delivery of early high-value use cases are implemented.
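To make the stage chain concrete, here is a toy Python sketch of the sequence described above; all stage names and data are invented for illustration. The point is simply that business value only materializes at the final exploration step, which is why a strictly sequential, platform-wide build prolongs time to value:

```python
# Toy pipeline mirroring the stages named in the text (illustrative only).
# Value (the summary at the end) appears only after every prior stage runs.

def acquire():            # pull raw records from a hypothetical source feed
    return [" 42 ", "17", "bad", "8 "]

def prepare(raw):         # cleanse/normalize
    return [r.strip() for r in raw]

def curate(rows):         # keep only records that pass quality checks
    return [r for r in rows if r.isdigit()]

def process(rows):        # transform into usable types
    return [int(r) for r in rows]

def explore(values):      # the only stage that yields business-facing value
    return {"count": len(values), "total": sum(values)}

result = explore(process(curate(prepare(acquire()))))
```

Delivering thin end-to-end slices per use case, rather than completing each stage platform-wide before moving on, is one common way to pull that final value-bearing step forward.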
(9) Underestimating the operational cost: In many cases an enterprise data management platform is designed with a limited set of use cases in mind, but if the platform shows an acceptable level of success for the early-adopted use cases, the demand for utilizing the platform and for onboarding additional use cases will grow rapidly. This demand will saturate the data management resources (infrastructure and teams), and adding more resources can significantly impact the operational cost plans.
(10) Choosing the wrong data architecture patterns or frameworks: An enterprise data architecture design and implementation project is not a "research and development" activity and should not be subject to trial and error; it is an expensive and complex project that needs careful weighing of options. Selecting architecture design patterns for the banking industry based on their success in other business sectors, such as telco or social media, can be a major mistake. The benefits and limitations of the key design decisions need to be carefully understood, and success/failure stories within the business domain need to be studied; decisions such as "what will be the impact of converting my batch data pipelines into event streams?" or "should we use serverless functions or PaaS functions?" are not to be taken lightly.
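As a minimal illustration of one such decision, the sketch below (invented names, standard-library Python only, no real streaming framework) contrasts a batch computation with an event-driven equivalent for a hypothetical transaction feed. Both arrive at the same answer; the real trade-off is latency versus the operational concerns (ordering, replay, exactly-once delivery) that an event stream makes the team responsible for:

```python
from collections import defaultdict

# Stand-in for a day's transaction file: (account_id, amount) pairs.
transactions = [
    ("acc-1", 100.0), ("acc-2", 50.0), ("acc-1", -30.0),
]

def batch_balances(txns):
    """Batch style: process the whole day's feed in one scheduled pass."""
    totals = defaultdict(float)
    for account, amount in txns:
        totals[account] += amount
    return dict(totals)

class StreamingBalances:
    """Event style: state is updated per event, so results are always
    current - but ordering, replay and duplicate handling become the
    platform team's problem rather than the scheduler's."""
    def __init__(self):
        self.totals = defaultdict(float)

    def on_event(self, account, amount):
        self.totals[account] += amount

# Feed the same data through the streaming variant, one event at a time.
stream = StreamingBalances()
for account, amount in transactions:
    stream.on_event(account, amount)
```

The functional result is identical either way; what the architecture decision actually buys (or costs) only shows up in the non-functional dimensions, which is why the text argues these choices deserve domain-specific study rather than pattern-copying.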
(11) Undervaluing the importance of metadata management and data modelling: Data has very low utilization if it is low in discoverability and "consumability" (the consumability term comes from IBM literature). Both qualities are enhanced if the data is well defined and modelled in a way that suits consumer needs. For example, if the main focus of a big data management initiative is "ingesting data" without worrying about the data model at an early stage (a common malpractice in many big data projects), the gap between the ingested data and the useful data will grow significantly, leading to write-only data and a knowledge gap that is very hard to bridge with later, dedicated data exploration and modelling activities.
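As a sketch of the alternative to "ingest first, model later", the Python fragment below declares a minimal schema with business descriptions and validates records against it at ingestion time. The FieldDef structure and every field name are invented for illustration; real platforms would use a catalog or schema registry for this:

```python
from dataclasses import dataclass

@dataclass
class FieldDef:
    name: str
    dtype: type
    description: str  # the business meaning -> this is what makes data discoverable

# A declared-up-front model for a hypothetical customer feed.
CUSTOMER_SCHEMA = [
    FieldDef("customer_id", str, "Unique customer identifier"),
    FieldDef("segment", str, "Business segment: retail / SME / corporate"),
    FieldDef("balance", float, "Current total balance in account currency"),
]

def validate(record: dict, schema) -> list:
    """Return a list of problems; an empty list means the record is consumable."""
    problems = []
    for field in schema:
        if field.name not in record:
            problems.append(f"missing field: {field.name}")
        elif not isinstance(record[field.name], field.dtype):
            problems.append(f"bad type for {field.name}")
    return problems

good = {"customer_id": "C1", "segment": "retail", "balance": 120.5}
bad = {"customer_id": "C2"}  # ingested blindly, this lands as write-only data
```

The cost of declaring the model is paid once at ingestion; the cost of skipping it is paid by every future consumer who has to reverse-engineer what the data means.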
(12) Incorrect level of detail: Engaging in a massive data management architecture project without an adequate level of detail and well-thought-out functional and non-functional requirements can lead to all sorts of issues during and after the execution and delivery of the data management strategy and functions. It is not adequate to have a "requirement" at a very high level of abstraction (for example, a one-line statement such as "build us an enterprise data lake"). Requirements with such an inadequate level of detail can direct massive investment in the wrong direction and might cause loss of confidence in the target solution if the business and technical expectations are not carefully managed.
(13) Inflexibility, being too rigid about design decisions and alternative choices: When building something large and complex, we sometimes need to be less idealistic and more pragmatic to achieve the required results efficiently. Everyone in the architecture community understands that some design guidelines are not written in stone, and it is not the end of the world if an alternative decision is found "more suitable" in certain cases. To give an example: you might be advised against a monolithic data management platform (depending on your definition of a monolith), but we know that any data warehouse or data lake is a large, relatively rigid system with many interwoven functions and limited room for modularity or microservices. We always try to design such solutions for maximum modularity and integrability, but there are limitations of feasibility and practicality that we should always consider.
(14) Heavy reliance on coding: Every single operational, IT or data management function can be implemented through intensive coding if you have enough resources and time. However, a bank is not a software company, and building things from scratch comes with its own set of challenges, including (but not limited to) long-term maintainability, dependency on key individuals, and the overhead of testing, documentation and support. To avoid such complications, a closer look into alternatives, such as established off-the-shelf or low-code data management tooling, might be required.
(15) Poor planning for staffing and resourcing: Data architecture and data management resources are genuinely scarce, and resources with banking/financial services experience are even scarcer. Additionally, most data architecture and data management engagements are resource-intensive, and the availability of such resources should not be taken for granted.
(16) Prolonged and complex data and architecture governance: Data and architecture governance are gateways to ensuring correct data management and protecting the organization's investment. But if the governance process takes a long time, it will impact the efficiency and responsiveness of the corresponding activities.
(17) The unsuitable partner selection: We talked about the scarcity of qualified resources with financial and data architecture experience, and this is one of the reasons most financial organizations choose to partner with a consultancy/service provider to guide, design, implement or supervise these activities. When done by a third party, the selection of such a partner is a sensitive matter: the partner should possess the technical know-how for designing and building the data architecture, and should also have enough domain knowledge of the banking/financial business to allow utilization and mapping of the functions/capabilities to potential use cases and business value.
This was an initial set of challenges; the conversation is open, so please tell me what you think and what additional challenges should be added.
Mohammed Othman, August 1, 2022