The increased sophistication of AI systems has enabled an entirely new way of not accepting responsibility for one's actions. One can claim to have been the victim of a malicious AI attack that mimicked one's likeness in video or voice, and proving otherwise would be hard.
But another kind of excuse is evidenced in this case.
Canada’s largest airline has been ordered to pay compensation after its chatbot gave a customer inaccurate information, misleading him into buying a full-price ticket.
Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”.
…In 2022, Jake Moffatt contacted Air Canada to determine which documents were needed to qualify for a bereavement fare, and if refunds could be granted retroactively.
According to Moffatt's screenshot of a conversation with the chatbot, the British Columbia resident was told he could apply for the refund "within 90 days of the date your ticket was issued" by completing an online form.
Moffatt then booked tickets to and from Toronto to attend the funeral of a family member. But when he applied for a refund, Air Canada said bereavement rates did not apply to completed travel and pointed to the bereavement section of the company’s website.