The Challenges and Constraints of Evaluations
Centre for Effective Services
Research | Evaluation | Implementation | Programme management | Connecting evidence with policy across Ireland and NI
Welcome to the fourth blog in our series Demystifying Evaluation, from Karl McGrath and Dearbhaile Slane.
Here we look at some of the common challenges and constraints that can arise during the evaluation process, and some potential strategies you can use to overcome or mitigate them. Understanding an evaluation's challenges and constraints is crucial to its design and implementation.
1. Budget Constraints
Evaluations do not have unlimited funding, and ‘budget constraints’ are about keeping the costs of the evaluation within its budget.
Conducting robust evaluations that produce trustworthy findings can be resource-intensive.
Sometimes budget constraints and robustness come into conflict and trade-offs must be made. The budget might be too small to apply the ‘best’ evaluation designs and methods, but the evaluator still needs to produce findings that are meaningful and trustworthy.
Bamberger and Mabry suggest the following strategies to reduce the cost of an evaluation:
CES Case study
In 2023, CES completed an evaluation of the Community Healthcare Network (CHN) Learning Sites on behalf of the HSE. There were 9 learning sites around the country, and the evaluation collected data from each learning site at 3 different timepoints.
In the first two timepoints we facilitated a series of focus groups in each learning site, amongst other data collection methods.
The third timepoint had a lower budget than the previous two timepoints, which meant we had to find ways to reduce costs. We had gathered so much rich qualitative data in the first two timepoints that one way we managed costs was by reducing the number of focus groups in the third round of data collection from 9 to 5 learning sites. This simplified our evaluation design, reduced our sample size and reduced the cost of data collection and analysis, while still allowing us to work with a diverse sample from different parts of the country.
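The arithmetic behind this kind of trade-off can be sketched very simply. The figures below are entirely hypothetical placeholders, not the actual costs from the CHN evaluation; the point is only to show how reducing the number of sites scales the data collection and analysis cost:

```python
# Hypothetical cost model for trading off sample size against budget.
# None of these figures come from the CHN evaluation; they are
# placeholders to illustrate the shape of the calculation.

def data_collection_cost(n_sites, cost_per_focus_group, analysis_cost_per_site):
    """Total cost of one round of focus-group data collection and analysis."""
    return n_sites * (cost_per_focus_group + analysis_cost_per_site)

# One round across all 9 learning sites vs a reduced round of 5 sites,
# assuming (hypothetically) €2,000 per focus group and €1,500 per site
# for analysis.
full_round = data_collection_cost(9, 2000, 1500)     # €31,500
reduced_round = data_collection_cost(5, 2000, 1500)  # €17,500

saving = full_round - reduced_round  # €14,000 freed up in the budget
print(full_round, reduced_round, saving)
```

Because cost scales roughly linearly with the number of sites, even a modest reduction in sample size can release a substantial share of the budget, provided the remaining sample stays diverse enough to support trustworthy findings.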
2. Time Constraints
Evaluations will have a definite timeframe, and the timing of an evaluation may not always be ideal for answering the questions of interest. ‘Time constraints’ are about conducting the evaluation within its agreed timeframe, or when its timing is not ideal.
In CES, there are at least three common time constraints we come across:
Some of the strategies suggested by Bamberger and Mabry to manage time constraints are similar to those for managing budget constraints, and include:
To this, we would also add, based on our experience:
CES Case study
In a recent literature review for Tusla, we were asked to conduct a systematic review, with a first draft to be completed within 6 months. Systematic reviews are very structured and robust literature reviews. However, this means they take a long time to complete, usually between 12 and 24 months.
To accommodate the 6-month timeline, we conducted a ‘rapid’ review. This is similar to a systematic review, but the methods are streamlined to allow it to be completed more quickly. Our approach was to simplify the design of the review by involving fewer people at each stage. This also meant we could be more time-efficient in our data collection and analysis methods.
3. Data Constraints
Evaluations often have to ‘make do’ with gaps or limitations in the data available to them. ‘Data constraints’ are about conducting an evaluation when critical information needed to address the evaluation questions is missing, difficult to collect or of poor quality.
Examples CES regularly encounters include:
Some of the strategies suggested by Bamberger and Mabry to reduce data constraints include:
To this, we would also add, based on our experience:
It is important to be transparent when describing your findings and conclusions. Data constraints will almost always weaken the certainty of your findings, even if the strategies to reduce them are used. It can be easy, however, to fall into the trap of being overly confident in a set of findings even when the data constraints do not support that confidence. As evaluators, it’s important to be clear about the limitations of the evaluation and express an appropriate level of caution about the certainty of your findings.
CES Case study
CES is currently completing an evaluation of the Better Start Quality Development Service (QDS) on behalf of the Department of Children, Equality, Disability, Integration and Youth (DCEDIY).
One of the challenges for the evaluation is the difficulty of gathering baseline data via surveys from a large sample of services accessing the QDS. This was unrealistic within the timeframe of the evaluation, and it was an important limitation because it is difficult to generalise findings from a very small sample.
To supplement the survey, the evaluation used a mixed-methods approach, collecting data from a variety of sources (e.g. documents, interviews, observations) and triangulating the results across the different methods to strengthen the conclusions we could draw.
4. Political and Organisational Constraints
Evaluations do not take place in a vacuum. Political and organisational constraints are almost always present, in different ways and to different extents. Bamberger and Mabry use the term to refer “not only to pressures from government agencies and politicians but also to include the requirements of funding or regulatory agencies, pressures from stakeholders, and differences of opinion within an evaluation team regarding evaluation approaches or methods”.
These are not necessarily bad things, though they can sometimes have a negative influence. They can also be seen as a positive indicator of the importance of an evaluation to its stakeholders.
Some political and organisational constraints include:
Managing political and organisational constraints effectively requires good teamwork and communication. It requires the evaluator to demonstrate effective stakeholder management and at times, it will require a firm approach to protect the independence of an evaluation.
At the beginning of the evaluation, it can also help to clearly define evaluation boundaries and collaboratively conduct stakeholder analyses. Defining evaluation boundaries with the evaluation commissioners can help to clarify where the evaluation must retain absolute independence and where it can accommodate stakeholder input and preferences.
A collaborative stakeholder analysis can also help to identify key stakeholders who might be unknown to the evaluator, and to understand their potential interests and influence on the evaluation.
5. Ethical Considerations and Constraints
Ethics are an essential part of our considerations and practice in every evaluation we do at CES. Ethics are about how we should, and do, conduct an evaluation.
As experienced researchers, all our evaluations follow certain ethical principles:
These are the minimum ethical standards we embed in every evaluation. When deciding which evaluation types, designs and methods are most appropriate for a particular evaluation, we have to be sure that the methods we use will adhere to these principles.
When working on sensitive topics or with vulnerable populations, there may be additional guidelines to follow. For example, the ethical standards required when working with children and young people may be slightly higher than those for adults. Another example is evaluations of palliative care. Palliative care is an especially sensitive topic with a high risk of harm to patients and their caregivers. Specific recommendations on how to conduct ethical palliative care evaluations were published in 2013 by the MORECare expert group.
6. Knowledge and Skills Constraints
At CES, we are fortunate to have a team with a wide range of evaluation knowledge and skills. However, there is such an abundance of evaluation types, designs, methods and approaches that no evaluator can be skilled in everything. So a perhaps obvious question for evaluators to ask themselves when designing an evaluation is: ‘what kind of evaluation do we actually have the knowledge and skills to conduct?’