No more embarrassing customer service emails!
It’s often this aim that makes companies set up Quality Monitoring or Quality Assessment for their customer service teams.
Depending on company size, you might have a manager reading literally everything before it goes out to the customer. Or you might have sophisticated speech analytics that pre-select a number of sent messages for scoring by a quality team.
No matter the setup though, many companies struggle to determine the right criteria for assessing contact quality. As a result, some advisors feel ‘policed’ by the quality team — and complain that their actions are judged based on a random set of rules.
1. SPAG error ratio
While perfection in Spelling, Punctuation and Grammar isn’t an end in itself, it supports your team’s image of competence and credibility. And according to the BBC, spelling mistakes cost the UK millions in lost sales.
But how best to measure it? We’ve seen many quality forms that fail anyone with more than three typos. That’s not fair to dyslexic people or those who write long letters — or anyone (like Sabine) who struggles with the ergonomics of most keyboards. What’s more, many team leaders and quality coaches tell us they don’t even feel equipped to judge SPAG properly.
That’s not surprising: there was a time when grammar lessons were unpopular with curriculum makers. Some people may not have enjoyed the lessons they did get. Others learned rules that no longer fit the language used in 21st-century newspapers, books and online writing.
Take inspiration from school
With that in mind, it may seem counterintuitive for us to suggest we look to school essays for scoring SPAG. But today’s language teaching takes a scientific approach, and that’s an excellent starting point:
- Usually, it’s OK to use American, Irish, South African spelling and grammar, etc. — as long as it’s consistent.
- If an error doesn’t affect the message, it counts for half an error point. Otherwise, it’s one point per expression.
- Those error points are multiplied by 100 and divided by the number of words to account for longer and shorter pieces of writing. Targets are set around an error ratio of up to 5%. More than that, and it’s a fail (coaching may already kick in at a lower ratio).
- Anything that sounds perfectly normal to a native speaker is deemed correct, no matter if a grammar book from the 1960s would insist it isn’t ‘the Queen’s English.’
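The scoring rules above boil down to simple arithmetic. Here’s a minimal sketch in Python — the function name and the split into “minor” (doesn’t affect the message, half a point) and “major” (one point) errors are illustrative labels, not an official rubric:

```python
def spag_error_ratio(minor_errors: int, major_errors: int, word_count: int) -> float:
    """Return the SPAG error ratio as a percentage.

    minor_errors: errors that don't affect the message (0.5 points each)
    major_errors: errors that change or obscure the message (1 point each)
    word_count:   length of the piece, to normalise long vs. short writing
    """
    points = 0.5 * minor_errors + 1.0 * major_errors
    return points * 100 / word_count

# A 200-word email with 4 minor slips and 2 meaning-affecting errors:
ratio = spag_error_ratio(minor_errors=4, major_errors=2, word_count=200)
print(f"{ratio:.1f}%")      # 2.0% — comfortably under a 5% fail threshold
```

Normalising by word count is the point: three typos in a 60-word chat message weigh far more than three in a 400-word letter, which is exactly why the flat “fail on three typos” forms feel unfair.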
This makes SPAG much less scary — and brings it in line with current linguistics as taught at university and in school text books.
Whether someone’s able to get their ideas across is usually seen as a subjective judgement.
Yet we all do it: during the hiring process, when we vote in an election, or when we choose to read something. Scoring customer support writing isn’t about our personal feelings and ideas though, so QA needs to be based on proven ways to get your customers to do, think, feel or buy something.
Explain those criteria to your advisors and practise them in training — they’re all learnable:
- A fluid, natural style without “customer service lingo” such as ‘valued customer’, ‘do not hesitate to contact us’, ‘should you require further information’, or ‘I apologise for the inconvenience caused’.
- The language fits your brand’s personality. This usually means no swearing or terms of abuse, no cheap colloquialisms, and not too many ‘empty’ words like cool, mega, nice, etc. Depending on your brand, you might ask advisors to use formal language, Black English, slang, dialect words, contractions or the occasional ‘sloppy’ expression such as ‘coz for because. Advisors need to know about those expectations before they’re scored in this way.
- Neutral language for serious queries. Reserve your kookiest language for the lightest of topics. Sometimes this is called ‘dialling your volume up and down’ or ‘flexing your tone of voice’. It helps the customer to feel you take their matter as seriously as they do.
- The message always needs to be crystal clear.
Content should take centre stage. It’s the most important factor in customer happiness and sustainable business practice:
- Is this email, letter, call, chat, etc. in line with the Data Protection Act and relevant industry regulations, such as PCI Compliance?
- Are all the facts correct? If not — could the advisor have known any better?
- Did the advisor keep their promises?
- Did they respond adequately to all points the customer made?
- Is this response likely to solve the issue without the customer needing to get in touch again? Has the advisor pre-empted follow-up questions the customer is likely to have later on?
- Most importantly, does this response have the customer’s needs at the heart of the message?
After all, it doesn’t matter how beautifully the letter is written if it’s full of “fake news”. And most of the service communication that’s gone viral had either amazing content (such as the Sainsbury’s Giraffe Bread example) or embarrassingly bad content (like this Ocean Marketing example from 2011).