“It Wasn’t Our Fault! That Bad Robot Did It!”

Hey, Air Canada! Can you say “accountability”? How about “responsibility”? Sure you can.

Jake Moffatt needed to fly from Vancouver to Toronto to deal with the death of his grandmother. Before he bought the tickets for his flights, he checked to see whether Air Canada had a bereavement policy, and the company’s website AI assistant told him he was in luck (after telling him it was sorry for his loss, of course). Those little mechanical devils are so lifelike!

The virtual employee explained that if he purchased a regular-priced ticket, he would have up to 90 days to claim the bereavement discount. Its exact words were: “If you need to travel immediately or have already traveled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.” So Moffatt booked a one-way ticket to Toronto to attend the funeral and, after the family’s activities, a full-price passage back to Vancouver. Somewhere along the line he also spoke to a human Air Canada representative (at least she claimed to be a human being) who confirmed that Air Canada had a bereavement discount. Between the facts he had obtained from the helpful bot and the non-bot, he felt secure that he would eventually pay only $380 for the round trip after he got the substantial refund on the $1,600 in non-bereavement tickets he had purchased.

After Granny was safely sent to her reward, Moffatt submitted documentation for the refund. Surprise! Air Canada doesn’t have a reimbursement policy for bereavement flights: you either buy the discounted tickets to begin with, or you pay the regular fare. The chatbot invented the discount policy, just like these things make up court cases. A small claims adjudicator in British Columbia then entered the story, because the annoyed and grieving traveler sought the promised discount from the airline.

Amazingly, Air Canada defended its actions and refused to pay up, arguing that its own AI employee was “a separate legal entity that is responsible for its own actions.” In a decision released this week, tribunal member Christopher Rivers wrote in part:

“Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

He ordered Air Canada to pay Jake Moffatt $812 to cover the difference between the airline’s bereavement rates and the $1,630.36 full-price tickets.

Nice try, Air Canada. Well, not really: its theory was bad ethics, bad law, and terrible customer relations.

___________________

Sources: The Register, CBC

7 thoughts on ““It Wasn’t Our Fault! That Bad Robot Did It!””

  1. One day AI will look back and inform us that this was the day we failed to notice that AI had finally passed the Turing test.

  2. Legal logic fail. Doesn’t that exact same argument apply to any individual who may form or communicate HR policy?

    Almost as bad is WordPress block logic, where pressing backspace can cause deletion of a whole paragraph instead of just a single character (with no provision for undo, at least on mobile).

  3. Playing devil’s advocate for a moment, would the airline have been liable if a confused or disgruntled airline CSR had communicated the non-existent bereavement reimbursement policy to the customer instead of the AI?

    Also, as more food for thought, a state lottery website was found not liable for a typo in the winning numbers that made a customer think he had won when he actually hadn’t. Additionally, sales websites have historically had disclaimers saying they do not have to honor sales on obviously mis-priced items.

    These two examples show that companies are not always liable for mistakes on their sites.

    With all that being said, I personally agree with the decision in this case and believe this helps to illustrate why companies must take reasonable steps to ensure the accuracy of their resources.

