Building a Trusted Climate Data Foundation: Reflections from our Conference at Stanford
Nick Hart, Ph.D.
When I was 15, I worked for the Boy Scouts in mid-Missouri, teaching the environmental science merit badge at summer camp. Using standardized materials and activities, I introduced young scouts to concepts like the ozone layer and the greenhouse effect. Years later, as I pursued my own graduate training in environmental science, I began to recognize a pattern: while our scientific understanding has advanced tremendously, many of the fundamental data challenges remain the same. How do we present complex information clearly? How do we build trust in the data? How do we translate knowledge into action?
Our team at the Data Foundation helped organize a conference at Stanford University yesterday, "Increasing Accessibility to Trusted Climate Performance Data," held in partnership with Stanford Law School and the Doerr School of Sustainability. These persistent questions were at the center of our discussions. As the leader of the Data Foundation and an environmental scientist by training, I see that despite our sophisticated models and technologies, the human elements of trust-building, standardization, and accessibility remain our most significant hurdles in making climate data truly actionable.
Starting with the Value Proposition
During a fireside chat with me in the afternoon, Dr. Julia Lane brilliantly articulated that effective data infrastructure begins with identifying what people actually need. In climate, the value is clear -- mayors need localized emissions data to guide city planning, investors need performance metrics to allocate capital toward effective solutions, and companies need measurement frameworks to validate their climate commitments.
What struck me throughout the discussions was how familiar these challenges are. Trained in public policy, economics, and environmental science, I've seen similar patterns across domains - the technical hurdles of standardization, the tension between proprietary interests and the public good, the challenge of translating complex data into actionable insights.
The issues facing our climate discourse aren't necessarily new or as unique as we sometimes make them out to be, but they do require focused attention and strategic investment.
Building Community Around Shared Goals
Dr. Arun Majumdar, Dean of Stanford's Doerr School of Sustainability, powerfully emphasized in his remarks, "There's no single institution around the world that can address this large problem of this magnitude, of this urgency and this complexity." He reminded us that "this is about coalitions" and "making one plus one greater than two" - a perfect encapsulation of why our collaborative approach is so essential.
Throughout the day, we heard from pioneers creating this foundation in three critical areas:
- Methane mitigation, where satellite and sensor technologies are revolutionizing detection
- Carbon dioxide removal, where new verification approaches are creating transparency in an emerging field
- Forestry, where rich datasets exist but need better integration and accessibility
What unites these efforts is community - bringing together stakeholders around common objectives. One concrete example is the partnership between the Data Foundation's Climate Data Collaborative and Crosswalk Labs, which we announced during NYC Climate Week last September. Through open.crosswalk.io, we're now providing free, accessible neighborhood-level emissions data for every U.S. census tract through 2023.
The Data Foundation-Crosswalk Labs collaboration exemplifies what's possible when we democratize climate data. As Jason Burnett demonstrated during the conference at Stanford, this tool enables local officials, community leaders, and citizens to access granular emissions information that was previously available only to well-resourced communities. By integrating multiple sources of activity-based data to generate verifiable emissions estimates at the neighborhood level, we're empowering over 10,000 communities to develop data-driven climate strategies, regardless of their resource levels.
This is exactly the kind of practical coalition-building that transforms how climate decisions are made - bringing sophisticated data analysis into the hands of those who need it most, and doing so through open, accessible platforms that foster transparency and trust.
Defining Our Theory of Change
Dr. Lane emphasized that successful data initiatives need a clear theory of change - defining inputs, activities, outputs, and outcomes. In climate, our inputs are increasingly sophisticated measurement tools; our activities include standardizing protocols and sharing frameworks; our outputs are trusted datasets and analytics that decision-makers can access; and our outcomes are measurable changes in indicators of climate progress (reduced emissions, increased carbon sequestration, etc.) that ultimately lead to our desired impacts of mitigating climate change and building resilience.
As she wisely advised, "Get runs on the board" before attempting to scale. The successful examples we heard yesterday - from pediatric cancer data commons to methane measurement networks - all began with modest, focused initiatives that demonstrated value before expanding.
Dean Majumdar reinforced this point with his powerful reminder about the BP oil spill, where public and open data proved crucial to understanding the true scale of the disaster. In fact, that’s our mission at the Data Foundation – prioritizing open data and evidence-informed decision-making.
Beyond Politics: A Matter of Health and Community
What resonated most deeply for me is that climate data can transcend politics - and it should. When we talk about methane leaks, forest health, or carbon removal, we're ultimately discussing the well-being of our communities and ecosystems.
If health and community are truly shared values in our society, then building robust climate data infrastructure isn't optional - it's imperative. This requires planning and strategic investments that align with what markets are already signaling they need. Government can support these efforts or shirk them, but ultimately, if the market identifies value and communities converge on a need, there is a path forward.
Looking Forward
Yesterday was just the beginning. There are numerous adjacent and related fields -- like lifecycle analysis, which has profound implications for current and future tariff policies -- where similar data infrastructure challenges await.
I'm deeply grateful to Stanford University for hosting this conversation in partnership with the Data Foundation, to David Hayes for his visionary leadership in bringing these communities together, and to Dr. Julia Lane for providing a clear framework to guide our collective work. I'm also thankful to the Data Foundation’s Climate Data Collaborative team – Ryan Alexander and Sonia Wang -- for their tireless efforts in making this event a success.
The path forward will focus on the practical work of building strategies and data systems that serve decision-makers at all levels. We must start small, demonstrate value, and grow methodically.
My hope - just as it was when I was 15, teaching young scouts all those years ago - is that when done right, this effort creates the foundation for a healthier, more resilient future for all of us - and those who follow.
NICK HART, PH.D. is President & CEO of the Data Foundation. Learn more about the Data Foundation’s Climate Data Collaborative at www.climatedatacollaborative.org.
#ClimateData #Sustainability #DataInfrastructure #ClimateAction