OK, this is pretty funny, but it also shows why you need to read the article, not just the headline. The policy actually exists; the chatbot simply found a shortcut through the menu tree, and in so doing enabled the person to request the refund after the flight instead of before, as Air Canada originally intended. The company's position, clearly stated in the article, is that it "cannot be held liable for information provided by one of its agents, servants, or representatives—including a chatbot." That is a denial of responsibility on a broad scale, and it has nothing in particular to do with AI. Overstating this as a case against AI does the case against AI no favours, though the episode does stand as a warning for organizations intending to communicate policies and procedures to the public.