Dynamic Data Governance in Banking, with Data Piloting.


“In God we trust. All others must bring data,” as W. Edwards Deming, the eminent scholar and total quality management guru, famously said. And how well his words resonate within the modern banking industry, where data is veritably the oil that keeps it running. Not just any data, though: it is trusted, governed data that banks need to survive, thrive, and transform.


In this article, I explain why dynamic data governance matters in banking and how it ties in with the legal requirements placed on organisations in this sector. I also explore the core operating mechanisms banks use to create revenue, and what the consequences are when no governance is in place to mitigate risk.

Next, I'll share why global banks chose Vokse Data Piloting to attain an elusive yet essential goal: bridging the gap between business context and technical data.




Data is core to the banking system, and banks have access to huge amounts of it. The potential sources are vast: financial data, client data, mobile interactions, social media, sales data, market reports, and so on. Almost every decision is made with the help of data. These critical decisions range from loan approvals and liquidity reserve management to offering the superior products, financial services, and experiences that help customers fulfil their material needs through timely access to finance. Harnessed properly, these goldmines of data provide invaluable insights, and that actionable insight helps every department and function within the bank. But harnessing this data properly is half the challenge. Ultimately, banking mechanics depend on trustworthy, reliable access to data, which is why data governance is key.

One of the primary objectives of data governance in banking is to enhance the accuracy and reliability of data for use in effective risk management. Taking risks is the business of banking (as of any corporate entity). Risk management involves a set of tools, techniques, and processes that focus on optimising risk-return trade-offs.

The aim is to use trustworthy data to measure risks, and to monitor and control them. While data informs every banking area, managing risk is one of the most important use cases, and it relies heavily on maintaining a single source of truth. Thus, it is no overstatement to say that the readily available, high-quality, relevant data that good governance ensures can spell the difference between a successful bank and one destined to fail.


Risk Assessment 101

Banks are unusual in the corporate world in that they rely on assets such as loans, securities, and stocks to produce income. Companies in other sectors, such as software, rely on product sales to generate revenue, whereas banks make most of theirs through interest earned on the loans they issue.

Thus, the loans a bank provides to its borrowers are its assets, while the deposits that you and I place with it are its liabilities. Ultimately, banking is a delicate balancing act between assets and liabilities: enhancing revenue-earning potential while managing credit risk, liquidity risk, and other risks.

For example, each loan has a different risk profile. Banks must determine loan loss provisions based on the cumulative risk profile of their entire loan portfolio. Lending to government agencies carries a far lower risk than lending to early-stage start-ups, so it calls for a smaller provision, as the simplified sketch below illustrates. Striking this balance is both an art and a science in which regulation plays a heavy role, ensuring executives do not take extreme risks in pursuit of short-term profits.
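As a simplified illustration of how provisions track a portfolio's risk mix, one common building block is expected loss, EL = PD × LGD × EAD (probability of default times loss given default times exposure at default), summed across exposures. All figures below are invented, and real provisioning regimes such as IFRS 9 or CECL are far more involved:

```python
# Simplified portfolio expected-loss calculation. All figures are
# invented; real provisioning (e.g. under IFRS 9 / CECL) goes far
# beyond this single-formula sketch.

loans = [
    # (borrower type, exposure at default, prob. of default, loss given default)
    ("government agency",    5_000_000, 0.002, 0.10),
    ("established retailer", 2_000_000, 0.020, 0.45),
    ("early-stage start-up",   500_000, 0.150, 0.80),
]

def expected_loss(ead: float, pd: float, lgd: float) -> float:
    """Expected loss for one exposure: EL = PD * LGD * EAD."""
    return pd * lgd * ead

provision = 0.0
for name, ead, pd, lgd in loans:
    el = expected_loss(ead, pd, lgd)
    provision += el
    print(f"{name:22s} EL = {el:>10,.0f}")
print(f"{'portfolio provision':22s}      {provision:>10,.0f}")
```

Note how the government exposure, ten times the size of the start-up loan, contributes only a fraction of the provision. That is the cumulative risk profile at work.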


Calculating Risk 101

Risk is calculated and monitored using various metrics, such as the Debt-to-Income Ratio (DTI), Loan-to-Value Ratio (LTV), Debt Service Coverage Ratio (DSCR), Probability of Default (PD), and Loss Given Default (LGD).

Crucially, each metric must be defined clearly and consistently across the organisation, with an emphasis on trustworthy data. Otherwise, there is a danger that different divisions will calculate the same metrics differently, causing confusion and skewing the results; the sketch after this paragraph shows one way to rule that out.
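One common remedy, sketched below with hypothetical field names and figures, is to define each metric exactly once in a shared, governed module so that every division computes the same number from the same inputs:

```python
# Illustrative single source of truth for risk-metric definitions.
# All field names and figures are hypothetical.

from dataclasses import dataclass

@dataclass
class LoanApplication:
    monthly_debt_payments: float
    gross_monthly_income: float
    loan_amount: float
    appraised_collateral_value: float

def debt_to_income(app: LoanApplication) -> float:
    """DTI: total monthly debt payments / gross monthly income."""
    return app.monthly_debt_payments / app.gross_monthly_income

def loan_to_value(app: LoanApplication) -> float:
    """LTV: loan amount / appraised value of the collateral."""
    return app.loan_amount / app.appraised_collateral_value

app = LoanApplication(2_100, 7_000, 240_000, 300_000)
print(f"DTI = {debt_to_income(app):.0%}, LTV = {loan_to_value(app):.0%}")
```

If the retail and commercial divisions both import these definitions rather than re-implementing them, the "same metric, different formula" problem disappears by construction.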

Calculating individual risk profiles across the entirety of a bank's customer base is a challenging feat. It cannot be done in a spreadsheet; it requires data warehouses and other dedicated data management tools.



Ultimately, to ensure long-term financial viability, banks must run numerous, continual stress tests that simulate the strength of their balance sheet under varying interest rate and credit risk scenarios; a toy version follows below. For these simulations to work consistently, given the dynamic nature of market variables, the correct definitions must be in place and the data must hold valid values and be of high quality.
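To make the idea concrete, here is a deliberately toy rate-shock scenario loop using a first-order duration approximation. The portfolio value, duration, and scenarios are all assumed figures:

```python
# Toy interest-rate stress test over a fixed-rate loan book.
# Uses the first-order duration approximation dV ~= -D * V * dr.
# Book value, duration, and shock scenarios are assumed figures.

book_value = 1_000_000_000    # fixed-rate loan portfolio value
duration_years = 4.2          # assumed modified duration of the book

scenarios_bps = [-100, 0, 100, 200, 300]   # parallel rate shocks

for shock in scenarios_bps:
    dr = shock / 10_000                       # basis points -> decimal
    delta = -duration_years * book_value * dr
    print(f"rate shock {shock:+5d} bps -> value change {delta:+15,.0f}")
```

A real stress test would layer credit migration, prepayment behaviour, and funding costs on top, but even this toy version only produces consistent answers if the inputs feeding it are governed and correctly defined.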

Beyond this, banks must ensure the right people can access the right data. All of these actions fall under data governance. The same level of governance underpins another aspect of banking regulation: compliance. Compliance rules directly benefit customers, and they also oblige banks to follow the procedures that keep their calculations correct and their data solid. In other words, banks must, by law, operate in a way that benefits both the institution and its customers.

Effective risk assessment and monitoring are necessary to the health of any financial institution, as recent bank failures have brought back into sharp focus. Established enterprise risk management techniques, the gold standard for examining risk holistically and systematically, depend heavily on reliable, governed data.


Data Piloting, a new paradigm.

Put simply, banks can't function without data. And while many tout the flashy expected results of their data science journeys, data management within a bank is a huge task. It requires teams of data experts who can build data warehouses, mine data, understand the complexities of the financial landscape, and do all this while developing novel approaches to working with it. Data engineers and data architects are vital to effective financial data management.

To manage data, data teams integrate a complex stack of applications. Today's data management tools were designed when data was much more static; in fact, that is why most data governance, cataloguing, and cartography solutions are metadata-centric. Observability solutions monitor data pipelines from a technical point of view and alert when changes occur in the data landscape, pipeline structure, or data stack. These tools require complex integrations and must be updated continuously, manually or at best semi-automatically. This manual maintenance is time-consuming and prone to error.

Today, data is increasingly dynamic: it is growing exponentially and comes in all shapes and sizes. While most of this data is digitised, much of it lacks any structure at all. And with real-time data constantly changing or streaming in, bringing order to this chaos with existing tools designed for near-static data is a headache. Governing and observing alone no longer cut it in a dynamic data landscape.




While governance and observability brought some temporary relief to the data management challenge, they fail to address the central requirement, trust, holistically. Analysts widely report that not knowing what data a bank (or any organisation) holds, and a lack of trusted data, are the top two reasons data initiatives fail.

Growth brings pressure to operate efficiently. Banks are expected to do all of this with fewer resources and at lower cost. Consistent and accurate data becomes a competitive advantage and a catalyst for growing in an efficient, cost-effective way. To drive adoption of data initiatives across functions and departments, teams need to be confident that the data can be trusted. This requires building and maintaining the link between business data and technical data.

The costs of inconsistent data are high. They include the tools and systems used to manage it, and all the labour required to design, connect, and maintain those systems. The largest costs are unquantifiable: the inability to design exceptional customer experiences, failing operations, convoluted decision-making, compliance risk, and an inability to innovate. Banks with trusted data make faster, better decisions and can simplify their supporting infrastructure.


Vokse Data Piloting.

Piloting dynamic data is essential if banks (or any organisation) are to remain relevant. Knowing instantly what data you have, and assessing instantly whether it can be trusted, is paramount to driving business value.

Having a real-time, up-to-date map of all your data and metadata is a great start. Vokse Data Piloting goes beyond traditional tools by automatically creating and maintaining the link between business and technical data (a generic sketch of such a link follows below). It also creates and maintains your data cartography instantly, and lets you browse your data from any relevant point of view: technical, business, organisational, or by application. So whether you are a data, business, or IT professional, you can understand what data you are working with and know whether it can be trusted.
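Vokse's internal model isn't public here, so purely to make the idea tangible, the sketch below shows one generic way a business glossary term could be linked to the technical columns that implement it. Every system, table, and column name in it is hypothetical:

```python
# Generic illustration (not Vokse's actual data model) of linking a
# business definition to the technical assets that implement it.
# All system, table, and column names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class TechnicalAsset:
    system: str     # source application or warehouse
    table: str
    column: str

@dataclass
class BusinessTerm:
    name: str
    definition: str
    owner: str      # accountable business steward
    implemented_by: list[TechnicalAsset] = field(default_factory=list)

ltv = BusinessTerm(
    name="Loan-to-Value Ratio",
    definition="Loan amount divided by appraised collateral value.",
    owner="Credit Risk",
    implemented_by=[
        TechnicalAsset("core_banking", "loans", "principal_amount"),
        TechnicalAsset("collateral_db", "valuations", "appraised_value"),
    ],
)

# A business user sees the definition; an engineer sees the lineage.
for asset in ltv.implemented_by:
    print(f"{ltv.name} <- {asset.system}.{asset.table}.{asset.column}")
```

Keeping links like these current by hand is exactly the manual maintenance burden described earlier; the point of a piloting platform is to create and maintain them automatically.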

It also allows you to analyse and audit your data continuously, so you can identify data quality issues, data silos, and risks, and maintain regulatory compliance. It alerts the relevant stakeholders so that data and pipeline issues can be dealt with well before they impact the business or the customer experience.



To summarise: Vokse sits at the intersection of data governance, observability, and data catalogues, offering a holistic approach to data management. It centralises 10+ data functions within a single cloud-native platform, drastically decreasing data stack complexity while increasing project success and business value. It offers a unified, real-time view of dynamic enterprise data and metadata by building the link between business context and technical data, enabling advanced automation that identifies issues proactively, before they impact the business.


Learn more by visiting www.vokse.eu | www.dhirubhai.net/company/vokse-dpa or give me a follow!
