Five tips to get the most from your in-house evaluation

Morris Hargreaves McIntyre
Jan 19, 2021

Lorna Dennison, senior consultant, writes:

Rail travellers sit on a bench beside a ‘We’re Here Because We’re Here’ participant dressed as a WW1 soldier.
Ghost soldier from ‘We’re Here Because We’re Here’: a modern memorial to commemorate the centenary of the Battle of the Somme, performed on 1 July 2016 throughout the UK, which MHM evaluated.

This year, cultural organisations have been forced to do things differently: to rethink their relationships with their audiences and challenge their assumptions about what matters to them.

In response, we’ve seen an explosion of creativity and experimentation from performing arts organisations, museums, galleries and others. As a sector, we’ve rapidly evolved from dumping collections online to creating bespoke cultural offers that embrace the restrictions forced upon us.

But how do you tell whether your hard work is paying off — and why your creative experiments worked (or didn’t)? How can arts organisations learn, build and improve?

Evaluation is sometimes viewed as a box-ticking exercise but, if approached in the right way, it can be rocket fuel for future improvement.

At Morris Hargreaves McIntyre (MHM) we’ve seen a big increase in evaluation briefs, but not every idea needs, or has the budget for, a full-scale third-party evaluation project.

If your organisation is planning to evaluate your creative projects in-house, here are MHM’s top tips to get the most out of your efforts:

1. Start thinking about evaluation early

If possible, build it in from the start of the project and consider when you’ll need the findings. Evaluation doesn’t just happen at the end — it should run throughout a project so you can learn, adjust and improve as you go.

That’s not to say it’s impossible to evaluate after the fact — it happens often (and we’ve certainly done it a few times when that was the only option). But it is always easier, better — and usually cheaper — if you’ve planned it from the beginning.

2. Start by asking — what are you trying to achieve?

Evaluation is about measuring and understanding both successes and challenges. So the starting point has to be your vision: what is the point of it all, and what will success look like? From there, keep drilling down until you reach something you can measure. What useful, actionable information will really shape what you do in future?

Try to look beyond merely recording what happened and instead consider how you can apply the evaluation findings. Did you prove your hypothesis? How can you improve your practice?

But, also, remember to make space for unintended outcomes: allow an opportunity for people to talk about experiences that you might not have anticipated.

3. Take an audience-focused approach

Once you know what you’re trying to find out, the next question is: who do you need to consult?

You must understand your audience’s needs when planning your methods. Make sure there is a diverse range of voices in the early stages of design, provide alternatives for access needs, and don’t assume digital is the preference (or even an option) for everyone.

In some cases, there is also a duty to co-create — ensuring evaluation is done with your stakeholders, not to them. For example, in a programme that involves young people as invested participants, they should have a voice in how the evaluation is delivered.

Top tip: balance the ask of the evaluation with how the respondents have been involved. It’s fair to expect participants in a year-long programme to give you 45 minutes of their time to feed back. For one-off event audiences, the ask should be much less.

4. Mix your methods to cover a range of outcomes efficiently

There is no single best-practice method for evaluation. The right tools will depend on many things, including what you need to know and who you need to ask.

But there are many valuable guides and frameworks (we use a strategy tree and method matrix; theory of change is another common approach). Do some research to find the one that best fits your project.

Mixing qualitative and quantitative methods is almost always vital (even if it’s just including some open questions in a survey) to ensure you have both robust and rich data.

Think about what options you already have for data collection and piggy-back where you can: adding a couple of questions to an existing survey, sign-up form or discussion will save you effort and duplication.

5. Push for objectivity, challenge your assumptions

It’s natural to be very invested in any project you believe in and have worked hard to deliver. That can make objective evaluation tough.

The ideal is full independence — an external evaluator (like us) who will consider the big picture from an unbiased perspective and draw clear, insight-based conclusions. If that’s not possible, keep notes of your own thoughts and assumptions throughout: write them down early so they don’t influence you later without you realising.

Appoint someone more removed from the project as a critical friend to ask the tough questions and challenge your potential bias. Look at your data with a critical eye — take care not to infer what isn’t there (particularly with small sample sizes).

Good luck!

Morris Hargreaves McIntyre is an international strategy and insight consultancy. We work with charities, cultural and heritage organisations of all sizes.

Join our mailing list to hear about free webinars, case studies and sector insights.

www.mhminsight.com

