This is something that I suspect we’ll see more of in the coming months and years. CTV News is reporting that a chatbot used by Air Canada handed out incorrect information to a man regarding bereavement rates:
Jake Moffatt was booking a flight to Toronto and asked the bot about the airline’s bereavement rates – reduced fares provided in the event someone needs to travel due to the death of an immediate family member.
Moffatt said he was told that these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued, and submitted a screenshot of his conversation with the bot as evidence supporting this claim.
He submitted his request, accompanied by his grandmother’s death certificate, in November of 2022 – less than a week after he purchased his ticket. But his application was denied and the tribunal decision said emails submitted as evidence showed that Moffatt’s attempts to receive a partial refund continued for another two-and-a-half months.
The airline refused the refund because it said its policy was that bereavement fare could not, in fact, be claimed retroactively.
In February of 2023, Moffatt sent the airline a screenshot of his conversation with the chatbot and received a response in which Air Canada “admitted the chatbot had provided ‘misleading words.'”
But Moffatt was still unable to get a partial refund, prompting him to file the claim with the tribunal.
Air Canada, for its part, said that the company could not be held responsible for what the chatbot said because the chatbot is a separate entity from Air Canada.
Yeah. They really said that. Here’s how that went down:
Air Canada, for its part, argued that it could not be held liable for information provided by the bot.
“In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website,” [tribunal member Christopher C.] Rivers wrote.
“It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
The airline also argued that the chatbot’s response to Moffatt’s inquiry included a link to a section of its website that outlined the company’s policy and said that requests for a discounted fare are not allowed after someone has travelled.
Rivers rejected this argument as well.
Air Canada has been ordered to pay $650.88 in damages. In addition, the airline was ordered to pay $36.14 in pre-judgment interest and $125 in fees.
Now Air Canada’s argument is at best laughable, and at worst a desperate attempt to cover up the fact that its chatbot wasn’t properly set up to deliver accurate information 100% of the time. And while the story doesn’t say this, I suspect that the reason Moffatt went the chatbot route is that it is nearly impossible to get an actual human being on the phone at Air Canada. At least, that’s been my experience over the last few years when I’ve needed to call them. Perhaps Air Canada should invest not in chatbots, but in actual human beings who are properly trained and properly equipped to help customers accurately and quickly? Just a thought.
Air Canada Tried To Dodge Responsibility For Its Chatbot Handing Out Incorrect Information… And Failed
Posted in Commentary with tags Air Canada on February 15, 2024 by itnerd