Climate Attribution and the Need for Banking-Level Model Governance

As March 2024 drew to a close, headlines from the Met Office trumpeted yet another 'hottest on record' month. Yet for many Britons, the reality was a rather unremarkable, even chilly spring month. This disconnect between headlines and lived experience highlights a growing problem in climate communication: attribution overreach.

The media narrative too often strips away context, quoting selective statistics with no discussion of natural variability. Crying wolf by presenting every swing in the weather not as part of the UK's natural climate variability but exclusively as a catastrophe creates a dangerous credibility gap: a populace that will likely ignore the warnings when real danger comes.

While the Met Office prioritises investment in supercomputers and AI for climate modelling, basic data quality issues still need to be addressed. With only 202 verified temperature stations and questions about the validity of additional readings, the foundation of their analysis deserves as much attention as their forecasting capabilities. This fundamental gap between data collection and analysis exemplifies a broader problem in climate science: the rush to complex modelling before establishing robust data governance.

This problem of data quality and analysis extends beyond temperature records to more complex attribution claims, particularly around extreme weather events. The recent Imperial College analysis of Hurricane Milton makes the impressive claim that 45% of the economic damage can be attributed to climate change. The path to this precise-looking number reveals data handling and analysis shortcuts that would never pass muster in regulated industries like banking, where trillions of dollars move daily on the strength of rigorously governed models.

Imagine a complex process transforming raw hurricane data into attribution claims of the “never in history …” genre. Comparing today's measurement techniques with those of the past is more art than science. Historical wind speed measurements from the 1980s relied primarily on aircraft reconnaissance and limited satellite coverage. Today's sophisticated network of satellites, advanced radar systems, and continuous monitoring provides far more detailed and frequent measurements.
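
To see how much this matters, consider a toy simulation (all figures are illustrative, not drawn from any real hurricane record): the storm's true intensity distribution is identical in both eras, and only the observation frequency changes, yet the recorded peaks drift upward.

```python
# Toy simulation: sampling frequency alone can manufacture an apparent
# intensification trend. All numbers are illustrative, not real data.
import random

random.seed(42)

def mean_recorded_peak(samples_per_storm: int, n_storms: int = 500) -> float:
    """Average recorded peak wind when each storm is observed
    `samples_per_storm` times. True winds fluctuate around the same
    mean (100 units, sd 15) in every era - nothing actually changes."""
    peaks = []
    for _ in range(n_storms):
        snapshots = [random.gauss(100, 15) for _ in range(samples_per_storm)]
        peaks.append(max(snapshots))
    return sum(peaks) / n_storms

sparse = mean_recorded_peak(samples_per_storm=4)   # a few aircraft fixes per day
dense = mean_recorded_peak(samples_per_storm=48)   # near-continuous satellite coverage
print(f"mean recorded peak, sparse sampling:     {sparse:.1f}")
print(f"mean recorded peak, continuous sampling: {dense:.1f}")
print(f"apparent 'intensification':              {100 * (dense / sparse - 1):.1f}%")
```

More frequent snapshots of the same storm are simply more likely to catch its strongest moments, so part of any measured trend may reflect the observing system rather than the climate.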

When studies claim a 13% increase in hurricane wind speeds since 1980, they rarely acknowledge how improved detection and measurement capabilities might inflate this trend. Failing to get a handle on such measurement inconsistencies mirrors the banking sector's hard-learned lesson: poor-quality, inconsistent data fed into unvalidated models in the run-up to the 2008 financial crisis. The fallout from the near collapse of the financial system led to strict requirements for data lineage - the ability to trace any number back to its source, understand every transformation it underwent, and document every assumption made along the way.
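
What might such lineage look like in practice? Here is a minimal sketch in Python; the class, field names, and the 5% calibration figure are all invented for illustration and assume nothing about any particular institution's systems.

```python
# A minimal sketch of banking-style data lineage: every value carries
# its source and the full chain of transformations applied to it.
# All names and figures below are illustrative, not from any real system.
from dataclasses import dataclass, field


@dataclass
class LineageValue:
    value: float
    source: str                                 # where the raw number came from
    steps: list = field(default_factory=list)   # audit trail of transformations

    def transform(self, fn, description: str, assumption: str = "none"):
        """Apply fn, returning a new value that records what was done
        and what was assumed."""
        new = LineageValue(fn(self.value), self.source, list(self.steps))
        new.steps.append({"step": description, "assumption": assumption})
        return new


# Example: adjust a 1980s aircraft-reconnaissance wind speed reading.
raw = LineageValue(130.0, source="aircraft reconnaissance fix, 1985 (illustrative)")
adjusted = raw.transform(
    lambda v: v * 1.05,
    description="calibrate to modern satellite-era measurement baseline",
    assumption="5% low bias in pre-satellite readings (illustrative figure)",
)
print(adjusted.value, adjusted.steps)  # the number plus its full audit trail
```

The point is not the code but the discipline: every adjusted number arrives with its provenance and its assumptions attached, ready for audit.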

The Imperial study of Hurricane Milton builds upon a reasonably sound model choice. Still, its claims rely on layers of assumptions - about pre-industrial conditions, the calibration of damage functions, and the direction of causality between regional and global trends. Without rigorous documentation of these assumptions and their uncertainties, such precise-looking numbers risk misleading rather than informing policymakers. The gap between the caveats and the conclusions exemplifies why we need more robust data and model governance of climate attribution claims.

The financial services framework for model governance, introduced after banks found they could not attribute much of their trillions of dollars of losses, has its roots in the nuclear industry. That framework now has a sufficient track record to provide a proven template for climate forecasting and attribution.

Implementation could begin with a comprehensive audit of climate attribution models to ensure transparency in data sourcing, curation, processing, aggregation, and reporting. Every adjustment, whether for measurement technology evolution or sampling bias, should be explicitly documented and justified, just as banks must account for market volatility adjustments in their risk models.

From there, the focus should shift to the mathematical integrity of the models - their assumptions, implementation notes, and any adjustments - ideally allowing independent verification of the results and of their sensitivity to model inputs where best guesses prevail.
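
As a sketch of what that independent verification could look like, the snippet below runs a one-at-a-time sensitivity check on a hypothetical damage-attribution function. The function, its parameters, and the ±20% perturbation are all invented for illustration and do not come from the Imperial study.

```python
# One-at-a-time sensitivity check on a hypothetical attribution function.
# Everything here is illustrative, not the Imperial College methodology.

def damage_fraction(wind_uplift, damage_exponent, baseline_intensity):
    """Hypothetical: damages scale as a power of storm intensity, and the
    share attributed to climate change is the excess over the baseline."""
    boosted = (baseline_intensity * (1 + wind_uplift)) ** damage_exponent
    baseline = baseline_intensity ** damage_exponent
    return (boosted - baseline) / boosted

base_inputs = {"wind_uplift": 0.10, "damage_exponent": 3.0, "baseline_intensity": 1.0}
print("baseline attribution:", round(damage_fraction(**base_inputs), 3))

# Perturb each input by +/-20% and report the swing in the headline figure.
for name, value in base_inputs.items():
    lo = dict(base_inputs, **{name: value * 0.8})
    hi = dict(base_inputs, **{name: value * 1.2})
    swing = damage_fraction(**hi) - damage_fraction(**lo)
    print(f"{name}: swing of {swing:+.3f} for a +/-20% input change")
```

Even this crude check tells a reviewer which "best guess" inputs the headline figure actually hinges on - here the wind uplift and damage exponent move the result substantially, while the baseline intensity turns out not to matter at all.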

Finally, comprehensive documentation of model choice, assumptions, weaknesses, and limitations of use would ensure that conclusions align with the underlying uncertainty in the analysis. While significant, the cost of implementing such governance pales in comparison to the potential cost of misguided policy decisions based on overreaching attribution claims.

Climate measurement, prediction, and attribution are serious endeavours, and the climate industry needs to mature. Only by adopting similar standards can we ensure attribution studies serve their intended purpose: informing rather than overreaching, illuminating rather than oversimplifying the complex relationship between climate change and extreme weather events.

The stakes could not be higher - politicians take these attribution claims at face value when allocating trillions of dollars of taxpayers' money. Without proper governance, we risk the misallocation of resources and the erosion of public trust in climate science. The time has come to bring the same rigour to climate attribution that we demand from other consequential analyses that shape our economic and social policies.

Vanessa Balmbra

Managing Principal | Financial Services and Sustainable Finance @ BIP

3 months ago

Great piece on model governance. Coming from the flood risk world, I completely agree on the need for better data handling. When flood data is updated for lenders and insurers, even small changes need proper tracking and explanation. The same should apply to climate data.

Olivier MALPUECH

Portfolio and program manager, transformation manager, data, sustainability and risk governance

3 months ago

I am amazed at how much of my career in finance and risk management over the last 15 years can be summed up as a data quality issue. So much boils down to the painstaking but critical exercise of understanding whether the data is fit for purpose. The right scale, granularity, timeliness, accuracy, completeness... But that would require organisations to understand their data. And to articulate their requirements.
