Program Evaluation Reporting
Evaluation is generally implemented within a complex, dynamic environment of politics, stakeholder expectations, budgets, timelines, competing priorities, and agendas. Writing or receiving useful and compelling evaluation reports is always a challenge, since reporting of evaluation findings is channeled through these same complexities.
Drawing on my experience writing project evaluation reports, and on my reading of several reputable sources that outline the process for creating program evaluation reports, I offer in this post some key insights for good program evaluation reporting.
Why do we need a program evaluation report?
An evaluation report is a written document that describes how you monitored and evaluated your program and answers the “What,” the “When,” the “How,” and the “Why It Matters” questions. The ability to demonstrate that a program has made a difference can be crucial to its sustainability. A written report fosters transparency and promotes use of the results. Use of evaluation results must be planned, directed, and intentional (Patton, 2008).
An evaluation report is needed to relay information from the evaluation to program staff, stakeholders, and funders to support program improvement and decision making. The evaluation report is only one communication method for conveying evaluation results. It can be used to facilitate support for continued or enhanced program funding, create awareness of and demonstrate success (or lessons learned from program failures), and promote sustainability. Torres, Preskill, and Piontek (2005, p. 13) contend that there are three reasons for communicating and reporting evaluation results: (1) build awareness and/or support and provide a basis for asking questions; (2) facilitate growth and improvement; and (3) demonstrate results and be accountable.
Several elements are needed to ensure that an evaluation report fulfills its goals: (1) collaboratively develop the report with a stakeholder workgroup; (2) write the report clearly and succinctly with its intended audience in mind; (3) interpret the data in a meaningful way; and (4) include recommendations for program improvement.
The following elements are worth considering when writing program evaluation reports.
1. Intended use and users. A discussion about the intended use and users fosters transparency about the purpose(s) of the evaluation and identifies who will have access to evaluation results. It is important to build a market for evaluation results from the beginning, with a solid evaluation plan and collaboration with the evaluation steering committee (CDC, 2011). In the evaluation report, remind your audience what the stated intended use is and who the intended users are.
2. Program description. A program description clarifies the program’s purpose, stage of development, activities, and implementation context. It presents the theory of change driving the program, in addition to a narrative description.
3. Evaluation focus. This element documents how the evaluation focus was narrowed and presents the rationale and the criteria for how the evaluation questions were prioritized. Evaluations, however, are always limited by the number of questions that can be asked and answered realistically, the methods that can be employed, the feasibility of data collection, and the resources available. The scope and depth of any program evaluation is dependent on program and stakeholder priorities, available resources including financial resources, staff and contractor skills and availability, and amount of time committed to the evaluation.
4. Data sources and methods. Methods and data sources used in the evaluation should be fully described in the evaluation report. Any approach has strengths and limitations; these should be described clearly in the report along with quality assurance (QA) methods used in the implementation of the evaluation.
5. Credible evidence. The credibility of evaluation information depends in part on the credibility of the evaluator(s), which can affect how results and conclusions are received by stakeholders and decision makers and, ultimately, how the evaluation information is used. Patton (2002) included credibility of the evaluator as one of three elements that determine the credibility of data. Consider taking the following actions to facilitate the acceptance of the evaluator(s), and thus of the evaluation:
a) Address credibility of the evaluator(s) with the evaluation management team early in the evaluation process.
b) Be clear and transparent in both the evaluation plan and the final evaluation report.
c) Present periodic interim evaluation findings throughout the evaluation to facilitate ownership and buy-in of the evaluation and promote collaborative interpretation of final evaluation results.
d) Provide information about the training, expertise, and potential sources of biases of the evaluator(s) in the data section or appendices of the final evaluation report.
6. Results, conclusions, and interpretation. Often, the interpretation section is missing from an evaluation report, breaking a valuable bridge between results and use. Justification of conclusions includes analyzing the data collected, as well as interpreting and drawing conclusions from the data. This step is needed to turn the data collected into meaningful, useful, and accessible information. Engage the evaluation team in this step to ensure the meaningfulness, credibility, and acceptance of evaluation conclusions and recommendations. Meet with stakeholders and discuss preliminary findings to help guide the interpretation phase. Stakeholders often have novel insights or perspectives that the evaluation staff may not have, leading to more thoughtful and meaningful conclusions.
7. Use, dissemination, and sharing. Ensuring use of evaluation results, sharing lessons learned, and communicating and disseminating results begins with the planning phase and the development of the evaluation plan (CDC, 2011). It is often assumed that this step will take care of itself once the report is published; however, evaluation use is more likely when it is planned for and built into the six steps of the evaluation plan. Planning for use is directly tied to the identified purposes of the evaluation and to program and stakeholder priorities. Remember that stakeholders will not suddenly become interested in your product just because you produced a report; you must sufficiently prepare the market for the product and for use of the evaluation results (Patton, 2008). A clearly written and comprehensive evaluation report can help ensure use. An executive summary can also be a useful tool for summarizing the evaluation and results for audiences that need a quick overview.
8. Plan for communication and dissemination. The evaluation results may not reach the intended audience with the intended impact just because they are published. An intentional communication and dissemination approach should be included in the evaluation plan and report. To achieve this outcome, evaluation results must be translated into practical applications, and the information systematically distributed through a variety of audience-specific strategies.
9. Adhering to evaluation standards. The evaluation standards enhance the quality of the evaluation by guarding against potential mistakes or errors in practice. The standards are grouped around four important attributes: Utility: serve the information needs of intended users; Feasibility: be realistic, prudent, diplomatic, and frugal; Propriety: behave legally, ethically, and with due regard for the welfare of those involved and those affected; Accuracy: be comprehensive and grounded in the data (Sanders & The Joint Committee on Standards for Educational Evaluation, 1994). The evaluation report should address the application and practice of these standards throughout the evaluation. This increases the transparency of evaluation efforts and promotes the quality and credibility of the evaluation’s implementation. It is important to remember that these standards apply to all steps and phases of the evaluation.
Many thanks for reading. I look forward to your comments and contributions!