The chatbot that got an airline sued

Article
By Dentons Partner Campbell Featherstone and Senior Associate Gunes Haksever
5 Mar 2024
4 min read

Can companies blame chatbots for their misrepresentations to consumers? Can chatbots be liable for what they say? A Canadian civil resolution tribunal says no.

The British Columbia Civil Resolution Tribunal decided against Air Canada after its chatbot misled a consumer on bereavement fares, despite the airline’s rather optimistic argument that the chatbot was “responsible for its own actions”.

The customer was awarded damages equivalent to the difference between what he paid for the flights and the discounted bereavement fare.

The case shows that, at least in Canada, the duty to ensure that representations made to consumers are accurate can extend to representations made by chatbots on behalf of a business, and that these representations are no less important than those made on any other page of a business’ website.

Given how widely chatbots are now used, businesses should make sure their chatbots are trained on accurate, up-to-date information and should stress-test them to confirm that the answers given accurately reflect their policies and offerings.

The consumer, Mr Moffatt, consulted the chatbot on Air Canada’s website while booking a flight to Toronto after the death of his grandmother. The chatbot suggested that bereavement fares could be claimed retrospectively, as follows:

“Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family. If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”

Relying on this information, Mr Moffatt took a screenshot of the chatbot’s response and booked two flights.

After two-and-a-half months of failed attempts to get a partial refund, Mr Moffatt emailed the airline with the screenshot from the chatbot. In response, an airline representative stated that the chatbot had provided “misleading words” and pointed to the chatbot’s link to Air Canada’s bereavement policy, which states: “Please be aware that our bereavement policy does not allow refunds for travel that has already happened.”

Who is responsible?

The crux of the case was whether the link to the airline’s “Bereavement Travel Policy” on its website should take priority over the chatbot’s response, and whether the airline was responsible for the representations made by its chatbot as if they were statements on its website.

Air Canada argued that it “cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot.” In effect, the airline suggested that the chatbot was a separate legal entity responsible for its own actions.

This argument was rejected by the Tribunal: “While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

Given their commercial relationship as service provider and consumer, the Tribunal found that Air Canada owed Mr Moffatt a duty of care, and that Air Canada did not take reasonable care to ensure its chatbot was accurate. On this reasoning, the Tribunal rejected the airline’s argument that Mr Moffatt could have found the correct information on another part of its website, to which the chatbot had provided a link. The Tribunal member accepted Mr Moffatt’s claim that he had relied on the chatbot to provide accurate information and found that this reliance was reasonable in the circumstances.

The Tribunal noted that “[t]here was no reason that a webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot”, and that the airline failed to explain why customers should have to double-check information found in one part of its website against information on another part of its website.

In the end, Mr Moffatt was awarded CA$650.88 in damages for negligent misrepresentation. In addition, the airline was ordered to pay CA$36.14 in pre-judgment interest and CA$125 in Tribunal fees. 

Takeaways

This case re-affirms common sense: an organisation (albeit, in this instance, a Canadian one) can be held liable for the representations (and misrepresentations) its chatbots make. It shows that a business’ duty of care extends to its chatbots’ representations, and further highlights that, while AI is beneficial, it carries risks.

Businesses should consider the following key takeaways if they are using, or planning to use, chatbots within their customer experience journeys: 

  • A duty of care can extend to training chatbots and ensuring they are accurate and not misleading. Because of their commercial relationship, Air Canada owed Mr Moffatt a duty of care; it made the representation negligently, and Mr Moffatt relied on it.
  • Businesses can be held liable for their chatbots’ misrepresentations in the same way as a misrepresentation on their landing page. It does not matter where a representation is made – the Tribunal was indifferent as to whether the information came from a static page or the chatbot.
  • The AI “hallucination” phenomenon can present considerable risks depending on how heavily external-facing AI tools are relied on. Large language models, such as generative AI chatbots, are trained on data and learn to make predictions by finding patterns in that data. If the training data is biased, incomplete or of poor quality, the model may learn incorrect patterns and therefore make incorrect predictions, or “hallucinate”. It is therefore important that businesses ensure their chatbots are regularly updated and tested so that the information they provide is accurate and consistent with company policies.
  • Chatbot developers are unlikely to guarantee that their chatbots’ responses will be 100% accurate, and will likely exclude liability for the answers given by chatbots in their terms of use. Businesses should check the contractual terms under which they obtain their chatbot and ensure they are comfortable with the remedies available to them if the chatbot gives inaccurate answers or otherwise goes rogue.