Is evaluation stuck? Challenges for Evaluating Place-Based Approaches
Jess Dart, Founder and Chief Evaluator at Clear Horizon



Evaluation holds the promise of helping us learn about what is/is not working so we can adapt and achieve more impact. It also holds the promise of helping us tell the bigger story of place-based work for both accountability back to the community and for advocacy. And yet, this promise is often not realised.

Last year, at #ChangeFest, I facilitated a panel to consider the challenges of evaluating place-based approaches. The panel consisted of Kylie Burgess and Tanya Brooks-Cooper, who are MEL leads in place-based approaches in Tasmania; Prerna Mehrotra, a MEL practitioner within a Victorian government team charged with evaluating a multiple-site place-based initiative; and Danielle Campbell, who sits within a MEL team at La Trobe University, Australia. The panel discussed the challenges and bright spots in evaluation for place-based change and asked some hard questions – such as ‘Is evaluation of place-based approaches stuck?’ and ‘Is government-driven evaluation holding problems in place?’

When evaluation is strongly influenced by government needs and priorities, it can be experienced negatively by communities. It can reinforce the inequalities in power that we are trying to shift. It can undermine the model of community-led change. Additionally, there can be a heavy evidence burden on place-based initiatives to prove that what they are doing is working. While our panel explored ways to minimise the burden on communities while “feeding the beast” (meeting reporting requirements), we were more excited by the prospect and examples of community-led evaluation work – where evaluation is done for, by, and as a community. Danielle Campbell shared an inspiring example of First Nations funded and led evaluation by Warlpiri in Central Australia, based on a very different evaluation approach grounded in Warlpiri worldviews, language, and culture. But we all noted this approach was rare.

How might we embrace community-led evaluation that is nurturing and useful for communities whilst negotiating and delivering a minimum level of evidence required by government and funders at the same time? What does it take to do a two-world evaluation?


Population-level indicators can be a distraction because they take years to shift. Final welfare outcomes for people are, without doubt, an important north star for place-based work. However, an over-emphasis on population-level indicators, which do not shift noticeably from year to year, can result in a loss of momentum. Linked to this is the need to capture and communicate evidence-based stories that show the impact and progress of place-based work in the shorter term.

How might we hold population indicators lightly while also collecting and using data that helps us navigate and show progress in the short term?


The panel discussed how complex MEL frameworks can be overwhelming and too challenging to implement. An example was provided by Kylie Burgess, who explained how a consultant had developed a MEL plan, but people didn’t fully understand it and were not using it to guide the MEL work. For a time, she put the plan to one side. Instead, she started where the community was, helping them get clearer on the theories of change at the local level and dropping the jargon. Later, she was able to link back to the MEL plan and use it to support the MEL needs of the various initiatives.

How might we hold our evaluation frameworks more lightly or grow them more slowly so that we can start where the community is and support rather than impede community efforts?


The fourth challenge was widening the focus of evaluations from outcomes to also evaluate principles and processes. Prerna Mehrotra shared the challenge of working in partnership across diverse sites, where each community had its own unique goals. She offered a possible solution: supplementing outcomes evaluation with process evaluation. The process evaluation involved evaluating the work, and how all partners (including the government) showed up against a set of practice principles. The panel went on to propose that the work of place-based approaches is really about developing the conditions for change, and felt it would be more congruent to evaluate them on this basis.

How might we re-think accountability, widening the focus beyond outcomes to be more principles- and process-focused?


A fifth challenge was raised by an audience member around the ability of communities to access the data they need to diagnose and track services. If we can’t share the data, this can exacerbate the issue of over-consultation: multiple services asking the same questions over and over again. This also raised questions about overburdening community groups who may not be equipped to work with messy data sets.

How might we help more communities gain access to service data at the local level, while at the same time building the capability of community-based groups to analyse it?


A final question for you: which of these challenges resonates with you the most, and what have we missed? We particularly want to hear from communities, practitioners and funders working in/with place-based approaches.

Melinda Craike

Associate Director, Research Engagement and Impact and Professor of Physical Activity and Health

10 months

Thank you for this piece. The key messages resonate with my own experience, and it is great to see them so clearly articulated. I totally agree that there is too much focus on population-level data, which is influenced by many, many factors and takes years to shift. Using service-level data to identify trends, learn and improve makes sense when evaluating place-based approaches. However, there seems to be little focus on a coordinated approach to collecting such data. Building the capacity to analyse data at the local level is essential, but the capacity to use the data for improvement is equally important.

Froukje (Frankie) Jongsma

Principal Consultant Social Impact

1 year

Centre for Human Security and Social Change – I thought you may be interested in sharing, given Danielle is featured.

Pauline Burt MBE, FRSA

Creative sector strategist, systems thinker and change-maker focusing on sustainable practice. Helping leaders to adapt how they work for better business, environmental and social impacts.

1 year

Thank you for sharing these reflections and prompts. They all resonate with me. It seems that what’s valued (by governments/funders) gets measured, but so often they don’t consider the impact of process and principles – the ‘how we work together’ questions that provide opportunities for all partners to contribute to and shape a project/place/community, creating the conditions for change. I’d love to see more integration of behavioural science, and early questions, engaged in together by all partners, about what people want and what would need to happen to facilitate that.

Laura Barnes

Process is the Purpose

1 year

Deep gratitude, Jess, for being with us again this year. Can’t wait to see what comes out of CF24!

Anna Powell

CEO of Collaboration for Impact

1 year

Thank you for walking the talk of the ChangeFest principles, Jess! It was a cracker of a session at CF23 and I’m sure it will be again at CF24.
