FP&A For Financial Services Companies: Beyond the Basics
The world of financial services is notoriously complex, relying on a web of interactions between departments, tools, and data sources. Success depends on this system managing risk and reward at scale while still adhering to the stringent regulatory restrictions that govern the industry.
None of this is possible without a robust and sophisticated FP&A technology stack that allows for the complexity to be harnessed and leveraged as a competitive advantage. In this whitepaper, we are going to explore what this looks like practically and how the right decisions in terms of technology, workflows, and processes can make a world of difference to the outcomes.
The structure of a financial services company
If you take a peek behind the curtain of any financial services company’s FP&A activities, you’ll tend to see several common components that work together to enable sophisticated planning and analytics at scale. The list that follows is not exhaustive, of course, because every company operates with a slightly different structure, but it’s a good starting point for understanding how these companies integrate information on products, import real data from day-to-day operations, and then run analytics to make informed forecasts about the future.
By breaking the components down into data sources, core models, and outputs, we can better understand the challenges, risks, and potential bottlenecks that arise – and where technology can solve them.
Wrangling data sources into a single system
At the heart of all FP&A systems is a base layer of infrastructure that brings all data and information into one place, from which it can be processed and leveraged. In a financial services company, it is typically the financial ERP system that plays this role. The ERP system contains your standard accounting records, but also a range of ancillary financial information relating to invoicing, expense reporting, staffing, and much more. There are tremendous benefits to combining these into a single data warehouse from which advanced analytics can be extracted. This is the heart of modern, data-driven decision-making.
However, it isn’t only collected data that is useful here. Your data sources can also include models that the company creates internally, as we see with asset and liability models and funds transfer pricing models. These calculations use first-hand data as their inputs, but produce more context-specific, forward-looking information at an aggregated level that can then feed more advanced models in other parts of the business.
All of this is to say that the first port of call for any financial services business trying to get its FP&A in order is to wrangle its data sources effectively and efficiently: moving away from a conglomeration of spreadsheets and toward a single integrated view of the data that enables true leverage of the information surging through the business. This is commonly understood in the industry, of course, but the challenge comes in making the transition through the muddy waters of legacy systems that were not built for these purposes. That’s a topic for another paper, but it’s worth mentioning because it’s often not for lack of trying that large financial services firms struggle here; rather, it’s difficult to slow down organizational momentum and re-imagine what you might have built today if you could start from first principles.
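To make that consolidation step concrete, here is a minimal Python sketch using pandas; the file names and column layouts are hypothetical, purely for illustration:

```python
import pandas as pd

# Hypothetical extracts from the ERP and an expense system; the file
# names and column layouts are illustrative only.
gl = pd.read_csv("gl_extract.csv")            # columns: account, period, amount
expenses = pd.read_csv("expense_report.csv")  # columns: account, period, amount

# Tag each record with its source so data lineage survives the merge.
gl["source"] = "general_ledger"
expenses["source"] = "expense_system"

# One long table is the simplest "single place" for downstream models.
warehouse = pd.concat([gl, expenses], ignore_index=True)

# A single consistent view: total amount by account and period.
summary = warehouse.pivot_table(
    index="account", columns="period", values="amount", aggfunc="sum"
)
print(summary.head())
```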
The complexities of a driver-based revenue model
Predicting future revenue is one of the most important and challenging aspects of running any financial services company. All of your decisions about risk management, growth, hiring, product development, and the like are shaped by where you think revenue will end up in the next month, the next quarter, or the next year. It’s here that companies must turn to a driver-based revenue model that can build predictions from the ground up.
A driver-based revenue model seeks to identify key variables that are correlated with revenue, understand what level of causality is plausible there, and then build predictions based on that real data. As opposed to taking an intuitive guess at what revenue might be, it makes a lot more sense to look at underlying quantitative drivers that allow for more accurate and nuanced revenue forecasts.
In the world of financial services, this can look very different depending on what type of product you’re talking about. Trading accounts will need to have different prediction algorithms than savings accounts or asset management accounts, based on the unique customer behavior and market circumstances that play a role in those different product verticals.
To give an example, in a trading account it might make sense to look at the average number of trades per day per account multiplied by the expected number of trading accounts. Under this simple heuristic, a company with 100 accounts averaging 20 trades per day each would forecast 2,000 trades per day. This is very different from how you might think about savings accounts, where customer lifetime value might be more indicative of how well you’re going to leverage those lazy deposits over time – the true marker of where your revenue is coming from.
Building on the trading example, one nice approach for calculating your KPIs is to look at historical averages. For example, you might use a 3-month or 6-month rolling average for your trades-per-day calculation. With the right system, this can be built right in so that the analyst can choose the duration and exclude any outlier months – making for more prudent use of statistical trending.
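As a rough illustration of that heuristic, the Python sketch below computes a rolling-average trades-per-day driver with an analyst-flagged outlier excluded; all of the monthly figures are invented:

```python
# Rolling-average driver for a trading product: average trades per day
# per account over a chosen window, with analyst-flagged outliers excluded.
monthly_trades_per_day = {
    "2024-01": 21.0, "2024-02": 19.5, "2024-03": 55.0,  # March spike
    "2024-04": 20.5, "2024-05": 19.0, "2024-06": 20.0,
}
excluded = {"2024-03"}  # months the analyst has flagged as outliers

window = [
    value for month, value in sorted(monthly_trades_per_day.items())[-6:]
    if month not in excluded
]
trades_per_day = sum(window) / len(window)  # 6-month rolling average, outliers removed

expected_accounts = 100
forecast = trades_per_day * expected_accounts
print(f"Driver: {trades_per_day:.1f} trades/day per account "
      f"-> {forecast:,.0f} total trades/day")
```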
These judgment calls are where the real work lies for FP&A professionals as they seek to investigate cause and effect without being misled by spurious correlations. And as these algorithms are tweaked to include other important variables, it can be surprising how in-depth the calculations become. Once you integrate all the inputs, adjustments, growth percentages, and so on, you start to understand why having a single system that can manage all this data makes such a big difference.
We saw this firsthand with one of our clients who initially thought that their calculations were pretty straightforward, but eventually ended up having dozens of variables contributing to the final output. Business is complex and being able to see all the components of a calculation helps to improve the accuracy of the model and ensure that the assumptions made are the correct ones for the forecast cycle.
Understanding where adjustments are made in the model is an important capability, making it possible to justify the forecast both internally and to regulators. And as the complexity increases, so does your reliance on software that can deliver the revenue forecasting capabilities you need when you’re at scale.
Leveraging driver-based expense planning
With similar logic to what we explored in the previous section, driver-based expense planning is another key aspect of running an effective financial services provider. The simple truth is that the most dependable route to profitability for financial services providers is to focus on the spread between what they bring in and what they spend – finding small pockets of optimization along the way.
In this context, drivers might include things like the number of accounts, the number of households, inflationary pressures, and numerous other factors that influence cash outflows. And once again, the ability to choose the correct drivers for each type of cost and each product is what is going to determine how accurate those forecasts are.
Expense calculations should also incorporate data that are not first-order drivers such as account seasonality, year-on-year growth, and other top-end adjustments that might impact how expenses grow over time. Of course, not all drivers apply in all circumstances, so your technology must have the capacity to handle exception-based planning for certain products. This flexibility is key for a nuanced and sophisticated projection that has the best chance of approximating reality.
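To show the mechanics, here is a minimal Python sketch of a driver-based expense calculation with seasonality, a growth adjustment, and an exception-based override; every driver, unit cost, and factor is invented:

```python
# Driver-based expense sketch: cost = driver volume x unit cost, adjusted
# for seasonality and year-on-year growth. All figures are invented.
drivers = {"accounts": 120_000, "households": 45_000}
unit_costs = {"accounts": 1.80, "households": 4.20}   # monthly cost per unit
seasonality = {"accounts": 1.05, "households": 1.00}  # e.g. onboarding bump
yoy_growth = 0.03                                     # inflationary adjustment

# Exception-based planning: some products override the standard formula.
overrides = {"households": 200_000.00}  # fixed contract cost for this product

expenses = {}
for product, volume in drivers.items():
    if product in overrides:
        expenses[product] = overrides[product]
    else:
        expenses[product] = (
            volume * unit_costs[product] * seasonality[product] * (1 + yoy_growth)
        )

print(expenses)  # {'accounts': 233604.0, 'households': 200000.0}
```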
A driver-based expense model can also be very helpful for setting realistic targets and managing expectations along the way. If your model includes iterative calculations for EPS, this can assist with performance management to reach certain targets. Small changes can have large implications, and it’s important to understand that internally first so that external communication is more precise and backed by data.
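As a simple illustration of that iterative logic, the hypothetical goal-seek below bisects to find the expense budget consistent with a target EPS, holding revenue, tax rate, and share count fixed (all figures invented):

```python
# Goal-seek sketch: what expense budget is consistent with a target EPS?
# Revenue, tax rate, share count, and the target are all hypothetical.
revenue, tax_rate, shares, target_eps = 500e6, 0.25, 100e6, 2.50

def eps(expense_budget: float) -> float:
    return (revenue - expense_budget) * (1 - tax_rate) / shares

# Bisect over the expense budget until EPS converges on the target.
lo, hi = 0.0, revenue
for _ in range(60):
    mid = (lo + hi) / 2
    if eps(mid) > target_eps:
        lo = mid   # still above target: there is room to spend more
    else:
        hi = mid   # below target: the budget must come down
print(f"Max expense budget for ${target_eps:.2f} EPS: ${lo / 1e6:,.1f}m")
# -> roughly $166.7m
```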
The power of a robust Net Interest Margin Model
As we’ve alluded to above, the profitability of a financial services company depends on a strong net interest margin between its deposits and its lending activities. It is incumbent on any company in the space to have a deep understanding of the yields, spreads, and rates that drive that business, as well as where they are likely to go in the future. In fact, almost all large companies have dedicated teams whose sole job is to make these predictions, because of how important they are for long-term capital planning and risk management.
The Net Interest Margin Model is how these predictions come to life – pulling in disparate data sources and allowing for nuanced projections to be built that can be utilized by different teams within the organization. Typically, the FP&A team will work closely with the Group Treasury department to deliver a model that is tied to internal data about where the deposit and lending bases are at present and the economic indicators model that predicts the macroeconomic factors that are likely to impact things moving forward.
By building forecasts based on the Federal funds rate and expected mortgage rates, combined with drivers for trades and households, a financial services provider aims to arrive at a net interest margin model that is aligned with the risk-reward mandate of shareholders, the affordability of the consumer base, and the long-term macroeconomic prospects of the relevant jurisdictions.
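For clarity on the core arithmetic, here is the standard net interest margin calculation in a short Python sketch, followed by a simple funding-cost sensitivity; the balance-sheet figures are invented:

```python
# Net interest margin from a simplified balance-sheet view; all figures invented.
interest_income = 820e6      # yield on loans and investments
interest_expense = 310e6     # cost of deposits and other funding
avg_earning_assets = 18.5e9

nim = (interest_income - interest_expense) / avg_earning_assets
print(f"Net interest margin: {nim:.2%}")  # ~2.76%

# Sensitivity: a 25bp rise in funding costs on rate-sensitive liabilities.
rate_sensitive_liabilities = 9.0e9
shocked_expense = interest_expense + 0.0025 * rate_sensitive_liabilities
shocked_nim = (interest_income - shocked_expense) / avg_earning_assets
print(f"NIM after a +25bp funding shock: {shocked_nim:.2%}")  # ~2.64%
```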
This is an incredibly important model to get right because it feeds many other models, with the two most important being:
- The Funds Transfer Pricing Model, which allows the institution to forecast both investments and loans, to understand how additional funding might impact the organization; and
- The Asset Liability Model, which integrates key data in order to forecast the balance sheet, focusing on key aspects like liquidity, solvency, and managing the various regulatory requirements.
The most effective way to manage these interrelated components is to have a single system that brings all these models into one place, so that reporting and analysis are standardized, complete, and contextual for everyone who uses that information. Here at Cubewise, we’ve implemented IBM Planning Analytics (TM1) for this exact reason and have focused a lot of effort on helping clients integrate Microsoft Excel, SPSS, and a variety of other systems into one place. The resulting combined TM1 models can then be used during large acquisitions to integrate financials and perform cost allocations, delivering fast and standardized financials for the entire organization, as well as per-product cost allocations that give a better picture of profitability.
Planning for Adverse Scenarios
Financial services providers are more exposed to external market risks than perhaps any other industry, because of their interactions with the financial system at large and their reliance on consumer behavior to keep profitability high. As such, building stress testing into the company’s workflows is crucially important. Maintaining stability in difficult times requires the foresight to know that the company, in its current configuration, could withstand a dramatic shock to the system.
What this means in practice is that financial services companies need to perform adverse testing consistently, based on catastrophic events that could crash the market temporarily, push inflation to very high levels, or produce a variety of other detrimental outcomes. In times of crisis – war, climate events, natural disasters, and other large-scale shocks – proactive stress testing is what allows the company to protect investor money and remain in business.
Typically these stress tests are done in three categories, each at a different level of severity:
- Baseline: Understanding the economic realities of a normal state of affairs.
- Adverse: Exploring the implications of a serious negative externality that significantly disrupts normal business.
- Severely Adverse: Imagining the worst-case scenario, one in which the company might be placed at severe risk of collapse, and how it responds to this adversity.
To build this level of stress testing into regular workflows, companies need multiple active versions of the same numbers to run all the different scenario analyses, especially when different teams are running different simulations. IBM Planning Analytics shines here thanks to its powerful RAM-based calculation engine, which allows updates to be performed in near real-time. It also makes it practical to work with different statistical models and tools, providing more robust and diversified scenario analysis.
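Here is a minimal sketch of that versioning idea in plain Python: each scenario is a parallel copy of the same baseline drivers with its own shock factors. The multipliers are illustrative only, not actual supervisory scenarios:

```python
# Scenario versions as parallel copies of the same baseline drivers.
# The shock multipliers are illustrative, not actual supervisory scenarios.
baseline = {"fed_funds_rate": 0.045, "unemployment": 0.040, "trades_per_day": 20.0}

scenarios = {
    "baseline":         {"fed_funds_rate": 1.00, "unemployment": 1.00, "trades_per_day": 1.00},
    "adverse":          {"fed_funds_rate": 1.50, "unemployment": 1.75, "trades_per_day": 0.80},
    "severely_adverse": {"fed_funds_rate": 2.00, "unemployment": 2.50, "trades_per_day": 0.55},
}

# Each team reads its own named version without disturbing the others.
versions = {
    name: {driver: baseline[driver] * shock for driver, shock in shocks.items()}
    for name, shocks in scenarios.items()
}
for name, drivers in versions.items():
    print(name, drivers)
```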
Once these scenario tests have been run, the onus is on the leaders in those departments to make future-proofing decisions that build enough slack into the system to weather potential storms that may come unannounced.
Getting regulatory reporting right
As the heartbeat of the financial system, financial institutions face a tremendous level of regulatory scrutiny. Depending on the jurisdiction and the size of the organization, financial services providers must comply with an array of regulations that demand robust, accurate, and audited filings in a timely fashion. As such, a lot of effort goes into making the process more secure, streamlined, and efficient.
Having a unified financial system like IBM Planning Analytics can make these regulatory reporting workflows much smoother and more accurate, all while allowing for flexibility and adaptation as the regulations change. To illustrate this, let’s look at two different regulatory reports and how the right technology can make a difference.
Dodd-Frank Act Stress Testing (DFAST)
Every bank with more than $50b in assets has to comply with this stress testing, which has proven to be notoriously difficult and bureaucratic. The submission is highly scrutinized and goes to the OCC, meaning that financial services providers have to take it very seriously even though the data submission process is awkward and outdated. As a result, it ends up being an expensive exercise because of the additional legwork involved – never mind the fines and other penalties that follow if there are any errors in your submission.
IBM Planning Analytics / TM1 thrives in these environments because it can collate data from a range of disparate sources and combine it in a way that can be meticulously reviewed before submission. Team members can inspect and understand data at any level of granularity to ensure that everything is above board, which significantly reduces the chances of an error creeping into the submission. Additionally, the platform can connect directly to all the models that feed the DFAST submission, further streamlining the process and workflow.
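As a simple illustration of the kind of pre-submission review this enables, the sketch below ties granular detail back to reported totals before filing; the schedule names and figures are hypothetical rather than the actual DFAST schema:

```python
# Pre-submission tie-out: granular detail must sum to the reported totals
# before filing. Schedule names and figures are hypothetical, not the
# actual DFAST schema.
granular = {
    ("schedule_A", "loans"): [410.0, 95.5, 12.5],
    ("schedule_A", "deposits"): [300.0, 150.0],
}
reported = {
    ("schedule_A", "loans"): 518.0,
    ("schedule_A", "deposits"): 450.0,
}

for key, rows in granular.items():
    detail_total = sum(rows)
    if abs(detail_total - reported[key]) > 0.01:
        print(f"BREAK in {key}: detail {detail_total} vs reported {reported[key]}")
    else:
        print(f"{key} ties out at {detail_total}")
```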
Comprehensive Capital Analysis and Review (CCAR)
Another regulatory reporting requirement for larger banks is to submit various data to the Federal Reserve. This particular report is meant to ensure that the internal processes of financial institutions are sufficient to handle all of the relevant risks. By reviewing the quality of the controls in place, the Federal Reserve discharges its responsibility to protect ordinary citizens who rely on a stable and robust financial system.
This filing also allows capital levels to be evaluated across institutions and compared to one another as the Federal Reserve continues to tweak its strategies to align with how the financial system is evolving. The reporting requirements are similar to DFAST but the filing has its own cycle, approval paths, and technical hurdles.
Once again, IBM Planning Analytics / TM1 offers powerful regulatory reporting capabilities that make this process a lot easier and less time-consuming, regardless of the level of complexity involved.
Integrating economic indicators
An FP&A system is incomplete (and sometimes dangerously so) if it is unable to integrate key economic indicators that influence the performance of the financial services company and the customers that it serves. These are core variables that feed many different models and forecasts within an organization and the ability to manage these effectively is a crucial driver of success.
IBM Planning Analytics / TM1 allows for a large in-built metrics cube that ensures all groups doing forecasting and analysis are using consistent, approved rates. This cube can have additional security and business processes wrapped around it to control these fundamental drivers, so that the economic outlook of the organization is aligned across all departments.
The metrics included here vary by organization, but for financial services companies they will typically include things like the following (a short automation sketch follows the list):
- The Federal Funds Rate;
- The LIBOR rate (across different time periods);
- Unemployment figures;
- Market appreciation;
- Housing market indicators; and
- Inflation assumptions.
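As a rough sketch of how such a cube might be maintained programmatically, the snippet below uses TM1py (the open-source Python library for TM1) to write approved indicator values into a central cube; the cube name, dimension layout, and credentials are hypothetical:

```python
# Writing approved indicator values into a central metrics cube with
# TM1py. The cube name, dimension layout, and credentials are hypothetical.
from TM1py.Services import TM1Service

indicators = {
    ("2025-Q1", "Fed Funds Rate"): 0.0450,
    ("2025-Q1", "Unemployment"): 0.0410,
    ("2025-Q1", "Inflation"): 0.0280,
}

with TM1Service(address="localhost", port=12354, user="admin",
                password="secret", ssl=True) as tm1:
    for (period, metric), value in indicators.items():
        # One write per approved rate; downstream models reference these cells.
        tm1.cells.write_value(value, "Economic Indicators", (period, metric))
```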
Having a single place to input these variables, and having them flow through every model in the organization in real time, is a significant accelerant for stronger analysis and planning across a wide range of contexts. It may look like a small quality-of-life improvement, but it delivers a far more distinct advantage than it first appears.
Improving decision-making through better management reporting
Management reporting is a significant job in any organization, and its scope is ever-extending. This type of reporting must be in sync with regulatory reporting, but it must also be geared toward the decisions being made to run the business on a day-to-day basis. In many cases, analysts will spend three weeks out of every month preparing a board book of financial performance, with both forecasted and historical trends, for the various managers to review, discuss strategy around, and use in other planning activities.
At a high level, there is generally one large report that includes all the models we’ve discussed above (and more) incorporated at a summary level.
With IBM Planning Analytics / TM1, it is possible to automate the creation of this report including all standard graphs, charts, and tables. It can even pre-populate some of the generic wording such as:
“This product’s accounts have increased 20% year over year this quarter.”
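A minimal sketch of how that kind of commentary might be generated, assuming a simple template and invented figures:

```python
# Template-driven commentary: the wording mirrors the example sentence
# above; the product name and figures are invented.
def yoy_comment(product: str, current: int, prior_year: int) -> str:
    change = (current - prior_year) / prior_year
    direction = "increased" if change >= 0 else "decreased"
    return (f"{product} accounts have {direction} "
            f"{abs(change):.0%} year over year this quarter.")

print(yoy_comment("Trading", current=120_000, prior_year=100_000))
# -> "Trading accounts have increased 20% year over year this quarter."
```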
The automation of many of these tasks significantly improves the speed and accuracy of this reporting, freeing up time and resources to focus on actual interpretation – which is what drives decision-making, innovation, and strong performance in a modern world where the speed of information is critical.
What IBM Planning Analytics can offer you
As you can see, the complexities of financial services demand a fully capable and integrated technology stack that can bring together all the data that is required to run a modern organization. IBM Planning Analytics / TM1 is the tool that can deliver this because of its key foundational features:
- Simple Integration: Easily connect various data sources, including existing spreadsheet-based planning, budgeting, and forecasting applications, as well as ERP, CRM, and any APIs.
- Powerful IBM Engine: Real-time, multi-dimensional data analysis powered by a one-of-a-kind in-memory database and calculation engine built on TM1 technology.
- Automated and Predictive Forecasting: Optimize your planning and deliver more accurate forecasts with simple-to-use predictive and AI capabilities.
- Flexibility and Scale: IBM Planning Analytics / TM1 and Cubewise provide a flexible and scalable solution that can adapt to your business needs, accommodating expansion, changing requirements, and evolving market dynamics.
It’s a platform that can sit at the heart of your digital transformation as you concentrate your efforts on a single, unified system that enables data-driven decision-making at scale. If this is the sort of solution that your company needs, contact us here at Cubewise, and let’s start exploring what it could look like for your organization.
Message me directly to start a conversation! LinkedIn DM or email at [email protected]