Collecting feedback is the easy bit. Using it to make a difference is harder
It’s all very well collecting customer feedback but unless it can be put to good use, it’s something of a pointless exercise.
Compared to other industries, most training companies collect huge quantities of feedback. For them, the challenge has always been not so much how to collect feedback, but what to do with it all! The key thing is to start with the end in mind. If you're serious about continuous improvement, then you need to start by deciding how you're going to measure your success. This could be a simple average star rating for your courses but a more useful measure is Net Promoter Score (NPS).
Net Promoter Score (NPS)
If you're unfamiliar with NPS, it's an established method of analysing responses to a question that's often found on feedback forms: "On a scale of 0-10, how likely are you to recommend us?". The way it works is that scores of six or less are treated as negative; these are your detractors. Scores of 9 or 10 are your fans, and scores of 7 or 8 are neutral. NPS is calculated by taking your overall percentage of fans and subtracting your overall percentage of detractors. Neutral scores are ignored. So your NPS can vary between +100 and -100, and the rule of thumb is that you want it to be positive, meaning that you have more fans than detractors. For more about what NPS you should be looking to achieve, see here.
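The calculation above is simple enough to sketch in a few lines of code. This is a minimal illustration, assuming you have the raw 0-10 responses to hand; the function name and sample scores are our own.

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 responses.

    Fans score 9 or 10, detractors 0-6; neutral scores (7-8) are ignored.
    Returns a value between -100 and +100.
    """
    if not scores:
        raise ValueError("no responses to score")
    fans = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    # Percentage of fans minus percentage of detractors
    return 100 * (fans - detractors) / len(scores)

# Illustrative sample: 5 fans, 3 neutrals, 2 detractors out of 10 responses
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 5, 6]))  # → 30.0
```

Note that neutral responses still count in the denominator, so a course that attracts lots of 7s and 8s will pull the score towards zero even with no detractors at all.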
Set targets and report against them
Whatever metric you choose to use, if you want people to care about quality, then you need to make a lot of noise about it within your company. And do so on a regular basis. There's nothing worse than a quality initiative that fizzles out. If that happens, there's a real danger that people take a cynical view that it was just a passing fad and that quality actually drops as a result. Keeping a quality initiative going can be hard work which is why it's best not to be too ambitious to start with - you can always expand it later.
Lead and Lag indicators
Although NPS is a good way to report on the quality of the service you're providing, if it's low or starts to fall, it tells you nothing about what the underlying problem is, let alone what to do about it. And because whatever has caused the drop has already happened, it's too late to do anything about it. For this reason, NPS is referred to as a Lag indicator.
By contrast, Lead indicators are about measuring underlying behaviours exhibited by your team that you believe will lead to a good outcome. For example, you might take the view that if your instructors were to spend time sitting in on each other's courses, they would be able to swap tips and learn from each other, and that the result would be more satisfied customers. This would be a Lead indicator. The way to test the theory would be to introduce the new behaviour, measure how much time was being spent in this way, and then see whether this led to a higher NPS in the following weeks and months. If it made a difference, then you could encourage more of the same; and if it made no difference, you could consider what else might make a difference and track it in the same way.
Share your results
Having set targets and measured results against them, you now need to share your results. There's evidence to support the notion that even if you do nothing else, simply publishing your performance actually causes positive behavioural changes. But regardless of that, being transparent about performance and sharing it widely whether good or bad, is something to be encouraged. It allows you to learn from mistakes, celebrate success and get everyone aligned on initiatives to improve even more.
Make it matter
The ultimate way to make feedback matter is to introduce a carrot and stick approach to get people to behave in the way that you think will be most beneficial for the business. One Coursecheck customer goes as far as paying every member of staff - not just the trainers - a bonus based on the NPS for the month. And guess what? Everyone is extremely interested in what that score is, and surprise, surprise, it's invariably very good indeed.