Parametric Thinking in Provisioning (from the Science of IFRS 9 and the Art of Basel III series)
The article series was published in the Suits the C-Suite column of BusinessWorld in June 2018. It discussed at length the evolution and rise of parameter-based provisioning for financial reporting and capital planning purposes, which is still the prevailing mental model underlying the expected credit loss calculations made by financial institutions. The article series suggested a gradual shift from parametric thinking to the use of coding drivers, specifically through the adoption of machine learning and the monitoring of progress in 'artificial intelligence' for risk management and provisioning calculations. This shift has been accelerated by the COVID-19 pandemic, which has crushed models and methodologies developed and trained under benign conditions.
IFRS 9 is an International Financial Reporting Standard (IFRS) promulgated by the International Accounting Standards Board on July 24, 2014. It addresses the accounting for financial instruments and features three main topics: classification and measurement of financial instruments; impairment of financial assets; and hedge accounting. It became effective in 2018 and replaces International Accounting Standard (IAS) 39 Financial Instruments: Recognition and Measurement and all previous versions of IFRS 9. In this article, IFRS 9 is referred to as a “science” because of its systematically organized body of information and measurements on specific topics.
Basel III (or the Third Basel Accord or Basel Standards) is a global, voluntary regulatory capital and liquidity framework agreed upon by the members of the Basel Committee on Banking Supervision (BCBS) in 2010–11. It was scheduled to be introduced from 2013 until 2015; however, implementation was extended to March 31, 2019. Another round of changes was agreed upon in 2016 and 2017 (informally referred to as Basel IV), and the BCBS is proposing a nine-year implementation timetable, with a “phase-in” period to commence in 2022 and full implementation expected by 2027. Basel III was developed in response to the deficiencies in financial regulation that came to light after the financial crisis of 2007–08. It is intended to strengthen banks’ capital requirements and to tighten the rules on liquidity, maturity profiles, and leverage. It also introduced macroprudential elements and capital buffers designed to improve the banking sector’s ability to absorb shocks from financial and economic stress, and to reduce spillover effects from the financial sector to the real economy. Basel is an “art” in the sense that skillful planning and creative visualization are needed to fully comprehend its dynamic processes and uncertainties.
Part 1
Financial institutions recognize that provisioning and stress testing need to go together, allowing, at any given time, the determination of the credit cost and capital usage of an account, transaction or portfolio. This desired state poses complex and tremendous challenges. It is helpful to frame at the onset that these exercises can be broadly classified into two types, as an adaptation of Daniel Kahneman’s two systems of thinking: System 1 for fast, intuitive and unconscious views, and System 2 for slow, calculating and conscious thoughts. At the risk of oversimplifying, we do not yet know which exercise will become which system, but what is clear is the emergence of parametric thinking to grapple with the foreseeable function required for calculating expected credit loss (ECL) provisions under IFRS 9 and the related capital usage that will be highlighted with the implementation of the stress testing rules under BSP Circular 989.

Here is a sample illustration of how exposures will be viewed in the coming months (stripped of technical assumptions): assume a corporate exposure with a moderate quality rating, belonging to an industry that is exhibiting concentration risk, within a benign macroeconomic scenario. If the recovery experience is 65% and the overlay-adjusted probability of default (PD) is 1%, the ECL provisioning cost is 0.35% (the PD multiplied by the 35% loss implied by the recovery) and the capital usage is 5%. If recovery experience falls to 55%, the provisioning cost rises to 0.45% and capital usage to 7%. If the macroeconomic scenario deteriorates — assume PD at 3% — the provisioning cost is 1.35% and the capital usage is 10%.
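The arithmetic behind the illustration can be sketched in a few lines. This is a minimal sketch assuming a one-period ECL of PD multiplied by LGD (where LGD is 1 minus the recovery rate) on a unit exposure; the capital usage figures come from a separate capital model and are not derived here.

```python
# Minimal sketch of the illustration's arithmetic, assuming a one-period
# ECL of PD x LGD on a unit exposure (EAD = 1). Capital usage comes from
# a separate capital model and is not reproduced here.

def ecl_rate(pd_rate: float, recovery_rate: float) -> float:
    """One-period ECL as a fraction of exposure: PD x (1 - recovery)."""
    return pd_rate * (1.0 - recovery_rate)

scenarios = [
    ("benign, 65% recovery",       0.01, 0.65),  # ECL = 0.35%
    ("benign, 55% recovery",       0.01, 0.55),  # ECL = 0.45%
    ("deteriorated, 55% recovery", 0.03, 0.55),  # ECL = 1.35%
]

for label, pd_rate, recovery in scenarios:
    print(f"{label}: ECL = {ecl_rate(pd_rate, recovery):.2%}")
```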
The illustration may make computational sense, but note the gap between the ECL and the capital usage. At some point, the ECL will increase to reflect the “transmission” from the macroeconomic assessment to the credit risk pertaining to the obligor, and this scenario is likely to happen as the IFRS 9 and stress testing exercises become clearly linked in the next 12 to 15 months.
This scenario requires adaptive yet rigorous models and estimation approaches, but the current situation is an irreversible progression from the historical, incurred-loss orientation of IAS 39 models to Basel-based techniques that are being extended to meet the expected loss criteria and forward-looking view of IFRS 9. As the techniques undergo development or enhancement, it is helpful to view the provisioning exercise as consisting of parameter drivers — namely the base parameters for Exposure at Default (EAD), Loss-Given Default (LGD) and PD, adjusted for the overlay mechanism and the discounting process (these same parameter drivers can be used as inputs for portfolio management and capital planning, adjusted for horizon, confidence interval and other properties). There is ample literature on these parameters, so we will skip the introductory discussion and instead cover three areas that strengthen the parametric approach to provisioning — clarity on the definition of default, strengthening the staging assessments, and plumbing the overlay mechanism.
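To make the parameter drivers concrete, here is a hedged sketch of how the base parameters and the discounting process might combine into a lifetime ECL. The period structure, inputs and names are illustrative assumptions rather than a prescribed formula; discounting uses the effective interest rate, consistent with IFRS 9.

```python
# Illustrative lifetime ECL built from the base parameter drivers:
# marginal PD per period, LGD and EAD, discounted at the effective
# interest rate (EIR). All inputs below are hypothetical.

def lifetime_ecl(marginal_pds, lgds, eads, eir):
    """Sum of discounted marginal losses: PD_t * LGD_t * EAD_t / (1 + EIR)^t."""
    return sum(
        pd_t * lgd_t * ead_t / (1.0 + eir) ** t
        for t, (pd_t, lgd_t, ead_t) in enumerate(zip(marginal_pds, lgds, eads), start=1)
    )

# Three-year amortizing exposure with a hypothetical PD term structure.
print(lifetime_ecl(
    marginal_pds=[0.010, 0.012, 0.015],
    lgds=[0.45, 0.45, 0.45],
    eads=[100.0, 80.0, 60.0],
    eir=0.06,
))
```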
Default and staging assessments should be clear, both operationally and in principle. The definition of credit-impaired defines what falls under Stage 3 for IFRS 9 and loosely corresponds to our understanding of non-performing loan exposures. This definition is key for both modeling and estimation approaches, as well as for disclosure purposes. The definition of default should be consistent with internal credit risk management practices and, for purposes of assessing significant deterioration, could differ from that used for regulatory models — not all default events are immediately considered credit-impaired. However, in practice, what we are seeing is that the definition of default is shaped mainly by regulatory requirements, and we would not be surprised by an alignment between financial and regulatory reporting for consistency and simplicity, especially for modeling purposes. This means that the 90-day definition looks like a prescription that will be generally observed, although the 30-day backstop does not automatically mean an exposure is considered in default — at most, it would attract a lifetime ECL until the default state is concluded, in which case there is already an outcome (i.e., PD is 100%) and the situation shifts to a recovery strategy issue. This is where financial institutions are advised to regularly stress test those staging jumps or non-linear increases in ECL, on top of strengthening the governance around the staging assessments, ranging from the default tagging and classification process to early warning indicators and quantitatively supported risk assessments that supplement a financial institution’s credit evaluation process.
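The days-past-due backstops described above can be expressed as a simplified staging rule. The sketch below assumes that only the 90-day prescription and the 30-day backstop drive the stage; actual staging policies layer in qualitative early warning indicators and significant-deterioration tests.

```python
# Simplified staging sketch: 90 days past due as the default (Stage 3)
# prescription and the 30-day backstop as a significant-deterioration
# trigger (Stage 2, lifetime ECL). Real policies also apply qualitative
# indicators; this encodes only the days-past-due backstops.

def ifrs9_stage(days_past_due: int, credit_impaired: bool = False) -> int:
    if credit_impaired or days_past_due >= 90:
        return 3  # credit-impaired: lifetime ECL, outcome known (PD at 100%)
    if days_past_due >= 30:
        return 2  # significant deterioration presumed: lifetime ECL
    return 1      # performing: 12-month ECL

for dpd in (0, 35, 95):
    print(f"{dpd} days past due -> Stage {ifrs9_stage(dpd)}")
```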
We previously mentioned that institutions that adopted the now-replaced IAS 39 regime and are immersed in the internal ratings-based (IRB) approaches of Basel will feel a collective déjà vu as quantitative and statistical techniques start to dominate the methodology discussions. There are actually three mental models that need to be fused and redesigned, with iteration through time, in coming up with an operationally rigorous IFRS 9 — IAS 39, Basel IRB, and stress testing. We expect a few of the IAS 39 models to be extended as interim measures under IFRS 9, before eventually being discarded or even mutating if proxy factors become the norm, especially in micro and retail exposures. But most of the changes — especially for corporate and institutional exposures — will be borrowed from the IRB approaches, which would include adaptation of the capital requirement parameters of Basel, requiring high standards around governance, model development and validation. In their capital adequacy form, IRB models are generally conservative: they use downturn assumptions and scenarios, a 12-month horizon, and a cost-of-capital discounting treatment (rather than the effective interest rate).
In the second part of this article, we will continue the discussion on IFRS 9 and Basel, looking at the parameters relevant to the base Expected Credit Loss model.
Part 2
The spectrum of methodologies depends on the attributes of the segments and the degree of accuracy expected. These range from estimating expected and lifetime loss assumptions using historical loss rates, roll rates (at either the aggregate or account level) and vintage curves, to developing models for the Probability of Default (PD) and Loss-Given Default (LGD) parameters. For governance reasons, the technical aspects, features and assumptions of the models and estimation approaches should be thoroughly documented, along with the points at which human judgment and intervention will take place. The limitations should also be described, along with a discussion of how they will be addressed moving forward, what interim solution is in place (whether through a placeholder number or a proxy assumption), and whether the resulting model risk is within tolerable thresholds.
For instance, loss rates, vintage curves and roll rates (e.g., Markov chains) are generally favored for the retail portfolio, as these can be practically aligned with current risk management practices and provide an intuitive portfolio and term structure, especially for banks that are used to monitoring via segmentation and aging-based measures. The obvious drawbacks — a backward-looking view, the assumption of consistency in transition or delinquency movements, no capture of seasoning effects, slow reaction to changes in the portfolio mix and risk characteristics, and recovery expectations that are difficult to incorporate (often forcing a 100% loss assumption once the default stage is reached) — can be addressed with multiple overlays and dynamic simulations, which improve accuracy but also increase estimation risk.
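As a brief illustration of the roll-rate idea, the sketch below treats delinquency buckets as states of a Markov chain. The transition matrix is hypothetical, and default is modeled as absorbing, which is exactly the 100% loss assumption flagged above as a drawback.

```python
# Roll-rate sketch: a Markov chain over delinquency buckets
# (current, 1-29, 30-59, 60-89 days past due, default). The matrix is
# hypothetical; default is absorbing, i.e., the 100%-loss assumption
# noted above.

import numpy as np

# Rows: state this month; columns: state next month. Each row sums to 1.
T = np.array([
    [0.95, 0.05, 0.00, 0.00, 0.00],  # current
    [0.60, 0.20, 0.20, 0.00, 0.00],  # 1-29 dpd
    [0.30, 0.10, 0.30, 0.30, 0.00],  # 30-59 dpd
    [0.10, 0.05, 0.10, 0.35, 0.40],  # 60-89 dpd
    [0.00, 0.00, 0.00, 0.00, 1.00],  # default (absorbing)
])

start = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # all exposure current today
dist = start @ np.linalg.matrix_power(T, 12)  # roll forward 12 months
print(f"12-month default rate: {dist[-1]:.2%}")
```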
In cases where models are built to explicitly calculate the PD and LGD parameters at the account, portfolio and facility levels, the more accurate models can be used for risk management purposes and even for decision-support activities like pricing. Philippine financial institutions (FIs) that adopted models for certain exposures are aware of the “start-up” and continuing cost and investment required — building models takes significant effort, resources and time. Models also require rigorous maintenance, governance and validation. At this stage, the models that have been built may have produced quantitative results, but the real challenge is to allow these models to stabilize, learn and iterate. We estimate that FIs that have implemented models for IFRS 9 and Basel purposes need another 12 to 15 months before gaining conclusive results.
Ensuring thorough documentation also helps move institutions towards the full-scale use of machine learning. As the models and estimation approaches “learn” through time, complex computations will consolidate into pockets of decisions and will respond directly to the raw data footprint, which could range from sensor and mobility data used to evaluate logistical and supply chain-oriented customers to flow-based financial variables (as opposed to ratios). The implication is significant — the modeling and estimation approaches will bypass the stage of structured data and calculation parameters and enable the codification of decisions. It is just a matter of time before the parametric thinking approach to calculating expected credit loss (ECL) provisions and economic capital is dislodged by the rise of “coding drivers.” Future-proofing exercises should therefore be applied, and we will come back to this with an illustration for corporate and institutional exposures.
What we have covered so far are the developments at the base ECL model — the composite PD, LGD and Exposure at Default (EAD) parameters — that reflect idiosyncratic or specific risks pertaining to the exposures. The other element that needs scrutiny and improvement in the coming months is the overlay mechanism, which, in IFRS 9, is intended to capture the forward-looking view and the interdependent relationships within the wider economy. To be specific, the overlay mechanism represents an institution’s own economic reading, which makes the IFRS 9 ECL process a foreseeing exercise of marking-to-model and marking-to-view.
This is where stress testing will be useful for FIs in plumbing the overlay mechanism. Stress testing includes macroeconomic forecasting models that have evolved out of the need to support internal stress testing for financial and capital plans, as opposed to regulatory stress tests, which are currently designed to be uniform and tend to be blunt (think of the real estate stress testing exercise). By design, stress testing covers both immediate and long-term horizons and incorporates forward-looking scenarios and interdependent factors. These properties — adjusted for downturn scenarios — are what would help strengthen the overlay mechanism.

The stress testing approaches we are seeing in the industry are first-generation models that have at least served the purpose of informing the IFRS 9 modeling and estimation approaches. They are currently aggregations of calculations and processes that require a lot of manual intervention and judgment, ranging from the work-in-progress integrated stress testing used for strategic and corporate planning, financial and capital planning, and enterprise and business risk assessments, to the resilience planning that underlies capital adequacy and recovery planning. This naturally leads to confusion on the application of the forward-looking economic view and the probability-weighting of scenarios. The stress testing models we have seen in the industry need to be repurposed to be dynamic and agile, and we expect another 12 to 15 months for development and strengthening. This improvement is timely, given the full implementation required for the stress testing and macro-prudential regulations by 2019 at the latest.
In the third part of this article, we will continue with what FIs can expect in the next 12 to 15 months.
Part 3
The timing of BSP Circular 989 and the adoption of IFRS 9 will be of equal interest to regulators as well as boards and management, who are keen on understanding the impact on financial institutions’ (FIs) loss-absorbing capacity under stressful conditions and the implications for macro-prudential policy on one hand, and the strengthening of strategic plans on the other. FIs are expected to deal with expected credit loss (ECL) and time series data sets and calculation templates at granular and portfolio levels, and to draw upon multiple scenarios using their own expanded methodologies. They will need to achieve clarity on which scenarios would be considered base case and which stressful, in order to help establish the range that would feed into the overlay mechanism of IFRS 9. When this development happens, the top-down and bottom-up approaches to adjusting the Probability of Default (PD) for the overlay mechanism will become manifest in the coming months, so it is helpful to understand these two simultaneous processes that may converge (with corporate and institutional exposures in mind).
Under the bottom-up approach, the PD is determined from the base credit risk model that accounts for idiosyncratic properties before it is adjusted for industry-level factors. The final adjustment is the overlay of the macroeconomic scenarios. Methodology-wise, this process involves recalibrating the rating or scoring PD models to incorporate macroeconomic factors. In practice, and for communication purposes, it is helpful to distinguish between the base PD and the corresponding overlay adjustment, which could be illustrated as a scalar or multiplier of 1 to 1.2 given an intense view of the economy.
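As a hedged sketch of that framing, the overlay can be expressed as multipliers applied to the base PD; the industry factor and macro scalar below are hypothetical values, with the macro scalar drawn from the 1 to 1.2 range mentioned above.

```python
# Bottom-up overlay sketch: base (idiosyncratic) PD, adjusted first for
# an industry-level factor and then for a macroeconomic overlay scalar.
# All values are hypothetical; the macro scalar sits in the 1.0-1.2
# range cited in the text for an intense view of the economy.

def overlay_pd(base_pd: float, industry_factor: float, macro_scalar: float) -> float:
    return min(base_pd * industry_factor * macro_scalar, 1.0)  # cap at 100%

base_pd = 0.01           # from the base credit risk model
industry_factor = 1.05   # industry-level adjustment (hypothetical)
macro_scalar = 1.2       # macro overlay at the intense end of 1.0-1.2

print(f"overlay-adjusted PD: {overlay_pd(base_pd, industry_factor, macro_scalar):.3%}")
```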
On the other hand, the top-down approach is influenced by macroeconomic modeling that may involve auto-regression and would use a combination of an underlying Basel PD model and a portfolio model associated with stress testing. The Basel PD model produces a long-run or through-the-cycle PD that requires scaling, such that the portfolio average PD matches the PD predicted by the stress testing model. Forward-looking macroeconomic factors are applied in this exercise, with a scalar derived through optimization when linking the two models. In addition to regression, single-factor models and credit index approaches may also be employed for top-down approaches.
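The scaling step can be sketched as solving for a single scalar so that the exposure-weighted average of the through-the-cycle PDs matches the portfolio PD predicted by the stress testing model. This is an assumption-laden simplification: a plain multiplicative scalar admits a closed form, whereas practical implementations often optimize in a transformed (e.g., probit or logit) space.

```python
# Top-down scaling sketch: find scalar k so that the exposure-weighted
# average of the scaled through-the-cycle (TTC) PDs matches the portfolio
# PD predicted by the stress testing model. A multiplicative scalar has a
# closed form; real implementations often optimize in probit/logit space.

ttc_pds   = [0.005, 0.010, 0.020, 0.040]  # hypothetical long-run PDs by grade
weights   = [0.40, 0.30, 0.20, 0.10]      # exposure weights (sum to 1)
target_pd = 0.018                         # portfolio PD from the stress model

avg_ttc = sum(w * p for w, p in zip(weights, ttc_pds))
k = target_pd / avg_ttc                   # scalar linking the two models
scaled_pds = [min(k * p, 1.0) for p in ttc_pds]

print(f"scalar k = {k:.3f}")
print("scaled PDs:", [f"{p:.3%}" for p in scaled_pds])
```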
Currently, we are observing more bottom-up approaches being employed by the industry as it improves its base credit risk models and refines the relevant industry factors for each exposure segment. With the introduction of BSP Circular 989, we would expect top-down approaches to be revisited.
At some point within the next 12 to 15 months, we would expect a ‘VaR to VAR’ methodology connection between IFRS 9 and stress testing. From Value at Risk models to Vector Autoregressive models and back, this development could usher in the second generation of overlay and stress testing models that would allow economic forecasting (and potentially reduce the probability-weighting exercise to a sense check rather than the main input) and incorporate lifetime and transition criteria. Regardless of the advancements to be implemented by financial institutions, there are helpful operational guidelines to be noted. The first point is that the exercise could result in an unintended front-loading of losses, leading to capital erosion. The second is for any stress testing methodology and overlay mechanism to be connected to internal risk management, notwithstanding the regulatory floors that may be imposed for capital adequacy purposes. The final point is for the Board to be directly involved in the identification and evaluation of stress scenarios and the stress test interrelationship map, and in the oversight of the macroeconomic projections and their linkage to the institution’s resilience plan. Readers may refer to our July 26, 2010 article in this column, “Stress Testing as a governance tool,” for more guidance.
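For reference, the probability-weighting exercise mentioned above is typically a weighted sum of scenario-conditional ECLs. The sketch below assumes three scenarios with hypothetical probabilities and parameters.

```python
# Probability-weighting sketch: ECL computed under each macroeconomic
# scenario and combined by scenario probability. All scenario weights,
# PDs and LGDs below are hypothetical.

scenarios = [
    # (label, probability, PD, LGD)
    ("upside",   0.25, 0.008, 0.40),
    ("base",     0.50, 0.010, 0.45),
    ("downside", 0.25, 0.030, 0.55),
]

ead = 100.0  # exposure at default
weighted_ecl = sum(prob * pd * lgd * ead for _, prob, pd, lgd in scenarios)
print(f"probability-weighted ECL: {weighted_ecl:.3f} per 100 of exposure")
```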
While the overlay mechanism prepares us for the foreseeing function and expansive view, let’s not lose sight of the tightening of the data, systems and processes within the base ECL model. In particular, for the PD determination for corporate and institutional exposures, we recommend the following steps, which should be viewed as loops and iterations rather than as a finite set of linear steps:
1. Segmentation process — covering the traditional data processing and management, risk profiles and internal risk rating system, with a subset for emerging and unstructured data capture assessments, “Big Data Small Data” initiatives, and clustering of observed attributes and properties.
2. Credit evaluation — covering mainly the financial condition, industry assessment and outlook, and management quality of the corporate and institutional customers.
3. Assessment of factors and variables — covering single-factor and multifactor analysis, binning, and other approaches used to relate the data points to the intuition and judgment of experts, to be used for the model selection step.
4. Model selection — covering model runs that will result in candidate integrated models (composed of main and sub-models); these models initially start with an optimization algorithm of instructions and eventually “learn” over time; it may take another two to three learning rounds over the next 12 to 15 months to help stabilize the PD models.
5. PD transformation — covering the derivation of the through-the-cycle and point-in-time PD from the models chosen (a sketch of one common transformation follows this list).
6. Portfolio analytics — assessment of the results against the internal policies and portfolio management, which then feeds back to the segmentation process.
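On step 5, one common way to move between through-the-cycle and point-in-time PDs is the Vasicek one-factor model, which conditions the long-run PD on a systematic factor. The sketch below is an assumption: the article does not prescribe this transformation, and the asset correlation and factor value are hypothetical.

```python
# TTC-to-PIT sketch using the Vasicek one-factor model (an assumed,
# commonly used transformation; not prescribed by this article). The
# asset correlation rho and systematic factor z are hypothetical.

from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def pit_pd(ttc_pd: float, rho: float, z: float) -> float:
    """Point-in-time PD conditional on systematic factor z (negative = downturn)."""
    threshold = N.inv_cdf(ttc_pd)  # default threshold implied by the TTC PD
    return N.cdf((threshold - (rho ** 0.5) * z) / ((1.0 - rho) ** 0.5))

# A 1% TTC PD in a mild downturn (z = -1) with asset correlation 0.12.
print(f"point-in-time PD: {pit_pd(0.01, rho=0.12, z=-1.0):.2%}")
```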
This “future-proofing” recommendation will help FIs transition from parametric thinking to the rise of coding drivers — specifically on the adoption of machine learning while monitoring any progress on artificial intelligence for risk management and provisioning calculations.
At this point, the emerging parametric thinking underlying the ECL calculation has established the boundaries of PD and Loss-Given Default (LGD) to reflect both idiosyncratic properties and to a certain extent — through the overlay mechanism — the systematic risk that the obligor, or broadly the portfolio, is exposed to. But what is this systematic risk factor? Are we still talking about the generic market or financial economy? Or should this now be expanded to include “funding the real economy” discussions?
The connection between Basel and IFRS 9 has been limited so far to excessive concentration, contagion and spillover risks. What has not been covered are the network and transmission risks that arise from stagnation. In an upcoming article, we will examine how these apply to the areas that have the strongest potential to break inertia and have an impact on the economy — agriculture and infrastructure.
Christian G. Lauron is a Partner of SGV & Co.