Whatever happened to training?
Even though it's impossible to account for every single situation, an easily accessible training manual, paired with a designated person to bring questions to, is essential to keeping things consistently on track, even for some senior positions. And yet it's getting trendier to insist that time spent training newcomers isn't worthwhile.
Short-sighted business owners, uncomfortable with the idea of making slightly less money during a training period, or perhaps resentful of the idea that the effort could go to someone who decides not to stay with the company until death or layoffs do them part, rail against training time as a waste of resources.
And now it looks like even jobs taken over by AI have been tarred with that same brush. It's not unusual for customers to be the ones catching things like incorrect phone-number prompts because no one bothered to make sure updates got integrated, or to beta test anything. Typically it goes ignored for…ever. But sometimes we have to pay for our cyber-employee mistakes, as Air Canada recently found out.
Ashley Belanger of Ars Technica reported on one Mr. Jake Moffatt's dealings with the airline over reduced fares for bereavement flights. The grieving Mr. Moffatt asked the AI chatbot assistant on Air Canada's website how best to make use of the standard reduced ticket price for flights related to a death in the family.
Thus spake the bot:
If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.
Unfortunately, as we all should know by now, generative AI can 'hallucinate' information. That is, it can generate totally false information, because it compiles answers based on predictive text rather than any amassment of actual knowledge.
The actual policy states that refunds for bereavement flights will not be honored after the flight is booked.
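If you want to see that mechanism in miniature, here's a deliberately tiny sketch: a toy bigram model in Python, invented for this post and not anyone's real chatbot. Trained on just two made-up policy sentences, it will happily produce a grammatical sentence that says the opposite of its own training data, because nothing in pure next-word prediction ever checks for truth.

```python
import random
from collections import Counter, defaultdict

# Toy bigram "language model" -- a hypothetical illustration, not any
# real chatbot. It learns only which word tends to follow which word.
corpus = (
    "you may request a refund before your flight . "
    "you may not request a refund after your flight ."
).split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start: str, max_words: int = 20) -> str:
    """Extend `start` one statistically likely word at a time."""
    words = [start]
    while words[-1] != "." and len(words) < max_words:
        options = follows[words[-1]]
        if not options:
            break
        # Pick a plausible next word. No step here ever asks whether
        # the finished sentence is actually true.
        words.append(random.choices(list(options), weights=list(options.values()))[0])
    return " ".join(words)

print(generate("you"))
# About half the time this prints a fluent falsehood like
# "you may request a refund after your flight ." -- the grammar holds,
# the policy is inverted, because prediction rewards likely, not true.
```

Scale that up from two sentences to the whole internet and you get answers that sound exactly like policy, whether or not they are policy.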
Under normal circumstances, Mr. Moffatt would have been able to show a screenshot to someone who would run the need for more chatbot script refinement up the chain, and also grant him the refund as a grieving customer who followed the wrong info through no fault of his own.
Air Canada chose to take a left, though.
Rather than doing the right thing, so obvious even a non-management pleb like me could figure it out, Air Canada chose to fight.
As Belanger put it: “According to Air Canada, Moffatt never should have trusted the chatbot, and the airline should not be liable for the chatbot’s misleading information because Air Canada essentially argued that ‘the chatbot is a separate legal entity that is responsible for its own actions,’ a court order said.”
Fortunately, the small claims tribunal found that argument just as guanoriffic and ridiculous as we all would (though the tribunal member chose to say “remarkable” instead), and Mr. Moffatt got his refund, as well as some expenses and fees back.
“Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives—including a chatbot,” and it “does not explain why customers should have to double-check information found in one part of its website on another part of its website,” wrote Tribunal Member Christopher Rivers.
His case aside, though, you have to be concerned about the mentality behind throwing customers under the bus when the AI that businesses are clamoring for gets less-than-stringent training.
Do you feel confident looking your customers and a court of law in the face and saying, ‘We can’t be held responsible for laying off human beings to save money, then replacing them with something subpar’?
I’ve made my own feelings on cybersourcing clear, but even if you’re a fan of what generative AI can do for you, hopefully you can see the need to own up to your technical difficulties and to train the input from the get-go.
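For the technically inclined: owning the output can start as simply as refusing to send an unvetted claim. Below is a hypothetical sketch of that pattern. The policy text, the phrase list, and the crude keyword check are all inventions for illustration, not Air Canada's system or any vendor's real API; an actual deployment would want something far sturdier, plus a human in the loop.

```python
# Hypothetical guardrail sketch: vet the bot's draft reply against the
# published policy before it reaches a customer. The policy text, the
# phrase list, and the keyword check are all invented for illustration.

POLICY = "Bereavement fares cannot be claimed retroactively after travel."

# Phrases that contradict the policy above. A real system would need
# something much sturdier than keyword matching.
CONTRADICTIONS = ("already travelled", "after you travel", "retroactive refund")

def vet_reply(draft: str) -> str:
    """Return the draft only if it doesn't contradict POLICY."""
    if any(phrase in draft.lower() for phrase in CONTRADICTIONS):
        # Don't send an unverified claim; hand off to a person instead.
        return ("I want to make sure you get accurate fare information. "
                "Let me connect you with an agent.")
    return draft

# A draft in the same vein as the answer Air Canada's bot gave:
draft = ("If you have already travelled, submit your ticket for a "
         "reduced bereavement rate within 90 days.")
print(vet_reply(draft))  # prints the human handoff, not the false promise
```

The point isn't the keyword matching; it's that somebody decided, in advance, what the bot is allowed to promise, and built in a path to a human when it strays.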
P.S. About that training…you do need a human to do it. And that person probably won’t be the person who wrote and recorded all your SOPs to start with.
So get used to implementing better training and the human touch in customer service, whether your employees are flesh and blood or 0s and 1s. Because when there’s a mistake, ‘the dogbot hallucinated my homework’ won’t cut it.