4 Reasons Why Nobody Reads (or Uses) Your Evaluation Report: Here's How To Fix It

Finally. After many months of hard work and thousands of dollars spent, you have an evaluation report that confirms how great your programme is, full of findings that will be of value to so many people.

Being all excited, you send the report to everyone: the funding agency, all your partners, staff and other relevant stakeholders. You post it on social media and share it on your LinkedIn page and other online networking fora and platforms. You may even hold a nice press conference and have an official launch.

Then you sit back and wait for everyone to tell you how great the report is. Persons will use the findings to improve their operations, policies will be influenced by the evaluation findings, and the donor and the public will continue to support your great programme because the independent evaluation confirmed what you knew all along: that your work makes a real difference.

A week goes by. Nothing. Then several weeks and months roll by, and still there is little or no reaction to the report. You figure that people are busy; they will read it eventually. Right?

Wrong! If it takes that long to receive adequate feedback, the truth is that your evaluation report was dead before the first word was even written. Very few persons read it, and even fewer will ever use the findings.

The following list discusses the reasons why your evaluation report may have landed in the figurative graveyard. 

1. Failure to engage stakeholders early on, which...

spells doom for your report because you never garnered enough interest or commitment. You waited until after the evaluation was conducted to approach the relevant stakeholders. This is too late.

You missed the boat, my friend

As soon as an evaluation is conceived, all the concerned parties should be involved as much as possible at each stage. For example, during the planning phase, stakeholders can help determine the intended use of the evaluation findings, the scope of work and the methodology. During the later stages of the evaluation, stakeholders can review interim findings and contribute to the recommendations.

Research has shown that persons are more inclined to support initiatives that they participated in. Naturally, someone who played a part in developing the final report will want to at least scan it. Humans have a natural curiosity to know how the story ends. So engage persons from the start to increase the likelihood that they stick around for your grand finale: the evaluation report.

Another pitfall is that...

2. The evaluation report was written like an encyclopaedia...

filled with technical jargon, high-flown language and complex terms. Unless your evaluation report is being submitted to an academic journal for publication, simplicity is best.

You know it is bad when even persons from ancient Egypt can't decipher your hieroglyphic report!

Be concise, use clear language and stick to terms that potential readers will easily understand. You may have to balance satisfying the donor's reporting requirements with addressing the needs of the other stakeholders, which leads to the next reason your report got little traction.

3. Failure to identify your target audience...

which means your evaluation report was never pitched to the right readers.

Not knowing your target audience may have deadly consequences

An evaluation rarely serves the needs of just one stakeholder. The donor, the implementing agency, beneficiaries, policy-makers, non-profit organisations, the government, the general public and others all stand to derive value from an evaluation report. Nevertheless, most evaluation reports are written with just one stakeholder in mind. You guessed it: the donor.

As such, these traditional evaluation reports are usually long, lifeless, lacklustre documents that satisfy the donor's reporting requirements. They tick all the right boxes, but hold little appeal for other stakeholders who have different interests.

For example, the programme staff may be interested in the findings that relate to the operational aspects of the programme, while policy-makers are keen to hear more about its impact and effectiveness.

In other words, before the first word of the document is written, you should have already determined who the intended users of the evaluation are and how they will use the findings.

Once this is established, action-oriented reports can be written to serve each of these target groups. An action-oriented report "is intentionally shorter than a traditional formal report and is focused, simple, and geared toward a particular audience" (Hendricks, 1994). Action reports can take different formats: written, verbal or electronic.

Table: Different formats for reporting on evaluation findings. Reproduced from 'Evaluation Reporting: A Guide to Help Ensure Use of Evaluation Findings', Centers for Disease Control and Prevention, 2013.

The essence of action reports is that they home in on the specific areas of the evaluation that suit the interests of a particular target audience.

For example, the general public is probably more interested in how many children benefited from the programme than in the fact that the programme was audited twice. As such, the action-oriented report would highlight these outputs and leave out the financial detail on operational expenditure.

4. Failure to have a dissemination and marketing strategy

So you avoided the pitfalls mentioned so far. You have engaged the relevant stakeholders and have their commitment. Your document is written in simple, clear and easy-to-understand language. Plus, you have several shorter action-oriented reports to suit the different stakeholders. Then why is the response to the report so lukewarm?

If a tree falls in a forest and no one hears, does it make a sound?

The sad truth is that doing all of the above is not enough to get your evaluation report read (or acted upon, which is even harder). You have to get the message out through a channel, and on a frequency, that your target audience uses and understands. Perhaps you share the document on LinkedIn, but not everybody is on this professional networking site. Or you exclude social media altogether because it is not your thing, completely ignoring the fact that a segment of your target group uses social media as their preferred means of communication.

In other words, you have to make a sound and get your message heard by others. You cannot take a haphazard approach to dissemination. If your evaluation report is to be read and used, you will need a structured approach: one with a concrete dissemination plan that addresses the following questions:

  • Who is the target audience?
  • What medium will you use to disseminate findings—hard copy print, electronic, presentations, briefings?
  • When is the best time to disseminate the report? Perhaps to coincide with a special event?
  • How, where, and when will findings be used?
  • Who is responsible for dissemination?
  • What resources are available to accomplish the work?
  • What are the follow-up activities after release?
  • How will follow-up activities be monitored?

You should also consider using different reporting formats for your different target audiences. Do you really expect the average community member or the very busy board member to read a 200-page traditional comprehensive report? An action-oriented report that addresses the issues that matter to these two stakeholder groups will be more effective. For example, give the board member a dashboard report that highlights the main figures he or she needs to inform decisions at board meetings.

Table reproduced from 'Evaluation Reporting: A Guide to Help Ensure Use of Evaluation Findings', Centers for Disease Control and Prevention, 2013.

I hope the above tips help to resurrect your evaluation reports. Do share any additional tips and your experiences in the comments section below.

See my other article for examples from my own experience of how an organisation got persons to go beyond reading the report to actually using it.

If you found my article useful, please remember to 'Like', share on social media and/or hit the 'Follow' button to never miss an article.

 Publications consulted for this article:

Centers for Disease Control and Prevention. Evaluation Reporting: A Guide to Help Ensure Use of Evaluation Findings. Atlanta, GA: US Dept of Health and Human Services; 2013.

Hendricks M. Making a Splash: Reporting Evaluation Results Effectively. San Francisco, CA: Jossey-Bass; 1994.

Patton MQ. Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications; 2008.

About The Author:

Ann-Murray Brown is a Monitoring, Evaluation and Learning (MEL) expert who provides consultancy services for various organisations. Do you need support with your M&E needs? Get in touch with her via www.annmurraybrown.com

To view all her articles, go to https://www.dhirubhai.net/today/post/author/posts#published?trk=mp-reader-h

