Beyond Predictions: NZ can save millions with a new Budget Process

In the week since the budget, another 500 jobs have gone from the public sector, bringing the running total to 6,000. That represents some serious savings. As I looked through the 2024/25 budget, I asked myself: what is being done to find cost savings in the budget process itself?

This is a process that funds initiatives ranging from one million to several hundred million dollars, and given that it was established in 1989, seven years before the first NZ government website, opportunities for savings might well exist.

The budget process is a meticulously planned cycle involving prioritisation, analysis, and lots of consultation. Each year, ministries and departments submit detailed budget bids to secure funding for their initiatives. Despite these efforts, predictions of what these investments will actually achieve often fall short, leading to inefficiency, wasted resources, and initiatives that fail to deliver the intended outcomes.

Here lies the issue. We base funding decisions on experts' forecasts, but it can take years to determine whether a funded initiative actually produced the intended outcome. By that point, the project team delivering the work has changed, as have the Minister, the government, and public awareness, not to mention the technology the initiative was built on.

The alternative is to build a very small version of the proposed solution/service/initiative to demonstrate value. In 2024, doing this is no longer a huge, expensive task. Instead, it can often be done for less than the cost of the original business case or budget bid.

I have seen and been involved in several examples that have taken this approach here in NZ and delivered great results. For example, MPI is investing $68 million to install cameras on up to 85% of commercial fishing vessels, an initiative that began with a $40k investment. It proved its value and then expanded iteratively.

The status quo process still works for things we know how to build, where the outcomes are reasonably well known. We can see the demand for a bridge in the data, and we know how to build one. We can be confident that if we follow a given plan, it will deliver a given outcome. When it comes to new technology projects, however, those known factors don't exist. Instead, we rely on people's judgement and the forecasts of ‘experts’. The problem with ‘experts’ is that their predictions are terrible when the factors are this uncertain. Simply put, if they haven't done it before, they don't know that it will work.

Big claim, you might think. Researcher Philip Tetlock conducted a comprehensive study over 20 years involving 284 experts who made 28,000 predictions about political and economic outcomes. Tetlock found that these experts' predictions were only slightly better than random chance. What about the data used as evidence in the bids, you ask? Consider the huge volumes of seemingly robust data collected ahead of the 2016 US Presidential and 2015 UK general elections, which still got the outcomes completely wrong. My hunch is that the decision-makers assessing the bids take the presented data as gospel. But consider the budget bid's author: what point are they trying to make, what data have they chosen to present, and what do they stand to gain if the bid is successful? These bids, by the way, are often written by or with Big 4 consultants, who, of course, will carry on with the project once it's funded.

On balance, the budget bid process might have served us well over the last 35 years, but it's no longer 1989. The process we have now doesn't leverage the opportunities that modern technology development and delivery models offer; it remains geared towards the slow, big, expensive, bespoke, and proprietary platforms the government is so used to and still seems to desire. If we change how we assess budget bids, we can easily save costs, deliver more value, and fail less often.

We need bids to show that the proposers have started small, gathered real-world data, and iteratively built and expanded on the interventions that worked, demonstrating a pathway to a nationally scalable solution/product/intervention. Decision-makers would then have significantly more certainty that their investments will actually produce the desired outcome than they do when relying on theoretical predictions. Wellington already has an active and innovative tech scene that is more than able to support this type of iterative development model.

But if we want to make this happen by design, we need to make a shift at the policy and process level. If agencies know that funding decisions are based on real-world testing and data with measurable outcomes, then that is what they will provide. The government created the current system and has the ability to change it.

Umar Sheraz

Futures Researcher ~ Learning through Gaming ~ Facilitator ~ Storyteller ~ Climate Change Enthusiast

9 months

Beautiful article... A word of caution about "small, gathered real-world data, and iteratively built and expanded based on successful interventions, showing a pathway to a nationally scalable solution/product/intervention". Nearly a decade ago, while working on e-health in Bangladesh, I came across a term called pilotitis. While the term is used in different ways, in our context it meant that a small-scale project succeeds and then fizzles out when it is scaled to a national level. Bill Gates is especially irritated by this phenomenon, as a lot of his successful pilot projects have failed at bigger, grander levels. Any thoughts on how you or other colleagues have addressed this?

David Cameron

Founder/CEO at LearnCoach

9 months

Great write-up Jonnie. Seems like a no-brainer

Julie Watson

Digital Practice Lead - Kaiarahi Hua Mahi

9 months

Thanks Jonnie - in my area of work it needs to be less about "what tech can I buy" and more about "what is the problem I'm trying to solve?" But starting with design thinking and experimentation doesn't fit with the business casing process.

Alan Hucks

Venture Development | Research Commercialisation | Climate Tech | Creative Technology | Digital Assets

9 months

Some good assumptions here to test. How would you go about doing these experiments on process?

Jasmin Wilkins

I help organisations identify and target business value from IT enabled business change

9 months

Thanks Jonnie - interesting, and this aligns with a conversation I had yesterday, about the need in the current environment to genuinely demonstrate value (and viability!) early and often. Having effective and honest feedback loops that inform future progress might also be a key element - show me what it could be, build and deliver what it is, and tell me how it is *really* being used and delivering value ....! And let's not forget to capture the unintended value that sometimes exceeds the planned value!
