Modern Operational Risk Modeling & the Need for Model Risk Management
Amal Merzouk
Principal Product Manager | AI & SaaS Product Innovation | Digital Transformation | Finance, Risk & Compliance | Data-Driven Growth, PRM CAMS RWRI MBA
Introduction: Current State of Operational Risk
Financial firms have spent millions of dollars on operational risk measurement and quantification, but with limited success, largely because they have generally chosen a traditional, audit-based approach. Operational risk is inherent in all banking products, activities, processes and systems, and its effective management has always been a fundamental element of a financial institution's risk management program. Sound operational risk management is therefore a reflection of how effectively the board and senior management administer their portfolio of products, activities, processes and systems. Through its published guidance, the Basel Committee [1] seeks to promote and enhance the effectiveness of operational risk management throughout the banking system.
Traditional operational risk measurement and quantification provides structure, governance standards and an intuitive approach to risk identification and assessment, but it mathematically equates risk with the average, or expected, loss. Risk, however, is more appropriately characterized by the unexpected, or even "worst case," loss. This inconsistency has enormous implications. Specifically, traditional operational risk management depends heavily on the RCSA process, which fails to reveal the real risk and focuses instead on common threats and control weaknesses related to predictable losses. Organizations using this approach frequently become over-controlled in the areas where they have the least risk and remain significantly under-controlled in the areas where they have the greatest risk. Traditional operational risk management is also designed to be applied at a granular, process level; because the assessment metrics are not additive (likelihood cannot be aggregated), the resulting "risk" figures cannot be rolled up. For all these reasons, most managers in financial institutions find it very difficult to use traditional operational risk indicators in a systematic way.
In contrast, modern operational risk management is a top-down approach: it focuses first on the main risks, within a complete and mutually exclusive risk architecture, and drills down only into those risk areas where more granularity is required. Modern operational risk management can support both tactical and strategic decision making. While best practices and regulatory guidelines are readily available for both the qualitative and quantitative elements of traditional operational risk, many financial institutions are still struggling with the practical implementation of traditional frameworks, mainly for operational risk measurement. Yet measuring operational risk in order to mitigate it is becoming a central concern and a high priority for financial institutions that want to avoid penalties, reduce the likelihood of regulatory scrutiny and repair damaged reputations.
Operational risk measurement
This priority is even higher for fintech companies, where the use of vendors, suppliers and other third parties raises unique challenges for operational risk management. While activities and controls may be outsourced, operational risks are not: the firm still owns the risk. It is therefore necessary to ensure that a robust due diligence and monitoring process is in place. This is prompting a step change in the way operational risk is perceived and managed at fintech companies, moving away from silos and towards a technical culture in which many experiments are under way on AI, blockchain, predictive data analytics, the Internet of Things and cloud computing to exploit disparate structured and unstructured operational data. Many of these are intended to support understanding of operational risk modeling and measurement.
The most obvious use for risk modeling is to calculate regulatory capital under the advanced measurement approach (AMA), the Basel Committee on Banking Supervision's own-models approach to operational risk capital. Unfortunately, in March 2016 the committee disclosed plans to scrap the AMA and replace it with a standardized measurement approach [2]. However, AMA-style modeling may have other uses, not limited to banking but extending to fintech institutions as well. Moreover, the standardized approach is backward-looking, is not intended to recognize the behavior and drivers of risk, and does not reflect risk-mitigating business decisions such as insurance. Because losses remain part of the historical data for 10 years, any reduction in regulatory capital requirements owing to enhanced risk management practices will come only after a long lag. Regarding operational risk modeling, PwC observes: "such techniques increasingly include analytical tools and models designed to support management decisions, rather than regulatory capital calculations. Hence, while the demise of the AMA (Advanced Measurement Approach) may spell the end of internal models for capital purposes, it may well free up analytical capabilities to develop different internal models that could arguably be far more useful to the management of operational risk than the AMA." In essence, PwC's message is that, with the limitations of the AMA exposed, the industry is now free to explore new modeling approaches better suited to the needs of financial institutions. The focus must be on better supporting business decisions, promoting risk awareness and understanding, setting the right key risk indicators for staff, and facilitating "what if" analysis, so as to avoid repeating Deutsche Bank's experience in 2015, when insufficient compliance procedures in its overseas offices led to a $258m fine for flouting US sanctions laws.
Historically, operational risk management was founded on recording loss events and reserving the right amount of capital to keep the financial institution safe if those events recurred. Having finished their front-to-back assessments, financial services institutions now need to look forward and embed predictive analytics within their target-state operating model, driving management decisions with massive amounts of disparate structured and unstructured operational data, with the goal of managing key risks: specifically, optimizing risk-reward, risk-control and risk-transfer in the context of cost-benefit analysis. Incorporating risk sensitivity into the capital charge calculation is therefore very important. It is vital for financial institutions to quantify operational losses appropriately in order to determine the loss component accurately. Yet when quantifying operational risk, most issues revolve around the sparseness, quality and reliability of data. Much of the use of data analysis for audit, risk and compliance starts with procedures that are relatively ad hoc: often one-off explorations or profiling of data to determine risk exposure and identify compliance problems around a specific business process area. Risk analysis must extend beyond collecting sketchy facts and presenting them in simplistic heat maps that only occasionally motivate convincing decision making. Rather, a combination of statistical analysis and data science techniques can be used to overcome these issues; the appropriate methods depend on a financial institution's business lines and include:
- Process approach (Bayes nets and statistical approaches such as conditional probabilities; these may be based on scorecard and/or historical data from the trading book or balance sheet, and can be used to model loss distributions or the distributions of key risk indicators)
- Factor approach (risk indicators, forecasts based on economic models, the CAPM)
- Actuarial approach (parameterization using historical data, empirical loss distributions, extreme value theory, scenario analysis, stress testing and backtesting for validity, with models expected to continue evolving as experience develops and to meet validation and verification requirements)
These approaches draw on many of the analytical and statistical methods used in operational risk, such as aggregate loss modeling, extreme value modeling, dependency modeling using copulas, and statistical methods for model selection and calibration, applied mainly to the disparate structured and unstructured operational data an institution captures; a minimal sketch of aggregate loss modeling follows below.
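To make the actuarial approach concrete, the following is a minimal sketch of aggregate loss modeling: a Monte Carlo frequency-severity simulation with a Poisson frequency and a lognormal severity, from which the expected loss and a high-quantile unexpected loss are read off. The parameter values are purely hypothetical and would in practice be calibrated to internal and external loss data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters; in practice calibrated from loss data
LAMBDA = 25            # Poisson: expected number of loss events per year
MU, SIGMA = 10.0, 2.0  # lognormal severity parameters (log scale)

def simulate_annual_losses(n_years: int) -> np.ndarray:
    """Monte Carlo frequency-severity simulation of total annual loss."""
    counts = rng.poisson(LAMBDA, size=n_years)  # loss events per simulated year
    return np.array([
        rng.lognormal(MU, SIGMA, size=n).sum() if n > 0 else 0.0
        for n in counts
    ])

losses = simulate_annual_losses(100_000)
expected_loss = losses.mean()
var_999 = np.quantile(losses, 0.999)  # the 1-in-1,000-year benchmark used under the AMA

print(f"Expected annual loss: {expected_loss:,.0f}")
print(f"99.9% VaR:            {var_999:,.0f}")
print(f"Unexpected loss:      {var_999 - expected_loss:,.0f}")
```

The same simulation also illustrates why the confidence level matters so much: replacing 0.999 with 0.99 in the quantile call shows how sharply the capital figure depends on the chosen benchmark.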
Still, preventing a repeat of the 2008 global financial crisis will require a paradigm shift in risk management practices. Risk models can deliver valuable insight into complex problems, but the quality of these models and their critical assumptions need to be corroborated by objective and independent professionals. In particular, these models must be able to incorporate both empirical data and expert opinion in a credible, transparent and theoretically valid manner. Modern operational risk models will better address the requirements of financial institutions through two main characteristics:
- Decreased model uncertainty, thanks to better alignment between the available historical data and the confidence interval of the estimation: for example, simplifying the required methods by using a lower confidence level, given that the 1-in-1,000-year benchmark under the AMA has proved impractical.
- Increased model accuracy, together with improved data science techniques, such as backtesting, for checking the predictive accuracy of models; a minimal backtesting sketch follows this list.
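As an illustration of backtesting, the sketch below applies Kupiec's proportion-of-failures test, a standard unconditional coverage check, to a hypothetical backtest of a 99% VaR model; the counts used are illustrative only.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(exceedances: int, n_obs: int, alpha: float) -> float:
    """Kupiec proportion-of-failures test.

    Returns the p-value for H0: the observed exceedance rate is
    consistent with the model's stated coverage (1 - alpha)."""
    x, n, p = exceedances, n_obs, alpha
    phat = x / n
    # Likelihood ratio statistic, asymptotically chi-square with 1 d.o.f.
    log_l0 = x * np.log(p) + (n - x) * np.log(1 - p)
    log_l1 = x * np.log(phat) + (n - x) * np.log(1 - phat) if 0 < x < n else 0.0
    lr = -2 * (log_l0 - log_l1)
    return 1 - chi2.cdf(lr, df=1)

# Hypothetical backtest: 250 observation periods, 99% VaR, 6 exceedances observed
p_value = kupiec_pof(exceedances=6, n_obs=250, alpha=0.01)
print(f"Kupiec POF p-value: {p_value:.3f}")  # a low p-value rejects the model's coverage
```

A model whose VaR is exceeded more often than its stated confidence level implies fails this kind of check, which is exactly the early warning a Model Risk Management process should surface.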
As the use of models increases to support modern operational risk, this is an area in which the potential impact of faulty modeling can be great, and it is also an area receiving more supervisory consideration and examination. Organizations can benefit from a structured approach to model risk that incorporates both a Model Risk Management framework for developing and testing models and effective governance of models, including model validation, inventory and audit. When correctly designed and implemented, models should be a valuable asset for financial institutions, but firms need well-conceived programs to realize models' utility while identifying, quantifying and mitigating their potential risk. Therefore, to ensure clear, enterprise-level oversight throughout the model life cycle, one needs to keep executive management and supervisors "au fait" with model status across the organization, and to gain valuable insight for making accurate risk-reward decisions by centralizing model inventory and model assessment capabilities in a common library, one that supports internal policies and procedures across all business units and into which one can import model attributes and metadata, e.g., model validation results, from any kind of model built in any technology (R, SAS, Python or others).
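To sketch what such a common library might look like, the snippet below defines a minimal model inventory keyed on metadata; the field names, statuses and the register/import functions are hypothetical, chosen only to illustrate the idea of centralizing model attributes and validation results regardless of the technology the model was built in.

```python
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    """Hypothetical metadata for one entry in a model inventory."""
    model_id: str
    owner: str
    technology: str                     # e.g. "R", "SAS", "Python"
    purpose: str                        # e.g. "op-risk capital", "KRI forecasting"
    validation_status: str = "pending"  # "pending" | "approved" | "rejected"
    validation_results: dict = field(default_factory=dict)

class ModelInventory:
    """Common library of models across all business units."""

    def __init__(self) -> None:
        self._models: dict[str, ModelRecord] = {}

    def register(self, record: ModelRecord) -> None:
        self._models[record.model_id] = record

    def import_validation(self, model_id: str, results: dict) -> None:
        # Results can be exported from a validation run in any technology
        rec = self._models[model_id]
        rec.validation_results.update(results)
        rec.validation_status = "approved" if results.get("passed") else "rejected"

    def pending_review(self) -> list[str]:
        """Models awaiting validation, for executive and supervisory reporting."""
        return [m.model_id for m in self._models.values()
                if m.validation_status == "pending"]

inventory = ModelInventory()
inventory.register(ModelRecord("OR-VAR-001", "op-risk team", "Python", "op-risk capital"))
inventory.import_validation("OR-VAR-001", {"passed": True, "kupiec_p_value": 0.42})
print(inventory.pending_review())  # empty once every model has been validated
```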
CONCLUSION
By adopting a modern operational risk modeling approach that seeks to better capture and reproduce the drivers of operational risk while limiting implementation costs, the modeling methodology improves the balance between accuracy and simplicity. Banks and fintechs cannot properly manage operational risk in this new era without measuring it, and the industry should converge on a new set of modeling best practices that create a more stable and useful set of tools for operational risk measurement. In order to mitigate possible evolving risks, Model Risk Management (MRM) must be put in place to support evolving supervisory and business agendas.
[1] Basel Committee on Banking Supervision, “Basel II: International Convergence of Capital Measurement and Capital Standards: A Revised Framework Comprehensive Version”, June 2006
[2] Basel Committee on Banking Supervision, "Standardised Measurement Approach for operational risk - consultative document", March 2016