top | item 39455363

quartz | 2 years ago

> "Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives—including a chatbot," Rivers wrote. "It does not explain why it believes that is the case" or "why the webpage titled 'Bereavement travel' was inherently more trustworthy than its chatbot."

This is very reasonable. AI or not, companies can't expect consumers to know which parts of their digital experience are accurate and which aren't.

BugsJustFindMe|2 years ago

Forget about digital experiences for a moment. Forget entirely about chatbots.

> Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives

That includes EMPLOYEES. So they tried to argue that their employees can lie to your face to get you to buy a ticket under false pretense and then refuse to honor the promised terms? That's absolutely fucked.

RajT88|2 years ago

I have had this happen in a sense.

I once booked a flight to meet my then-fiancee in Florida on vacation. Work travel came up unexpectedly, and I booked my work travel from ORD > SFO > TPA.

Before I made that booking, I called the airline specifically to ask whether skipping the ORD > TPA leg of my personal travel was going to cause me problems. The agent confirmed, twice, that it would not. This was a lie.

Buried in the booking terms is language meant to discourage gaming the system by skipping certain legs of a booking. If you skip a leg, the whole itinerary is invalidated. It's not super clear; I had to read it a few times, but I guess it kind of said that.

Anyway, my itinerary was invalidated when I skipped the first flight. I got lucky that someone canceled at the last minute, so I could buy my own seat back on the now-full flight for 4x the original ticket price (which was not refunded!).

I followed up to try to get to the bottom of it, but they insisted they had no record of my prior call and just fell back on "It's in the terms, and I do not know why you were told wrong information." A very painful lesson to learn.

I now try to make a habit of recording phone conversations with agents, where it's legal in the place I'm physically located at the time.

mschuster91|2 years ago

> They tried to argue that their employees can lie to your face to get you to buy a ticket under false pretense and then refuse to honor the promised terms. That's fucked.

Pretty standard behavior for big companies. Airlines and telcos are the utter worst: you get agent A on the phone on Monday, who promises X will be done by Wednesday. On Thursday you call again and get agent B, who says he sees nothing from you, not even a call log, but of course he apologizes and it will be done by Friday. (Experienced telco customers will know that the drama unfolds that way for months, until you're fed up and involve lawyers.)

sokoloff|2 years ago

Fortunately, in this case, the consumer had enough proof of what happened and the court rightly told Air Canada to get fucked with that argument.

ysavir|2 years ago

I can sort of see it. On the one hand, it's reasonable to hold them accountable when an employee gives you the wrong discount. But if an employee, on their last day at work, decides to offer the next person calling all of the seats on a single flight for just $10, I think we'd all agree that it would be unreasonable to expect the airline to honor that offer.

It's the degree of misinformation that's relevant.

wing-_-nuts|2 years ago

I once ordered a gift for my father for Christmas. The order page indicated that it would arrive on time. When it didn't arrive, I requested a refund. They pointed to their FAQ page, which said that orders during the holidays would incur extra processing time, and refused the refund.

I wrote back that unless they issued a refund, I would issue a chargeback. You don't get to present the customer with one thing and then do otherwise because of something you say on a page the customer never read when ordering.

They eventually caved, but man, the nerve.

dataflow|2 years ago

This actually sounds like an interesting case to me because the details make a huge legal difference in my mind. (But IANAL, maybe I'm entirely off base here.)

E.g., did they tell you the shipping date after you placed the order, or before? If it was afterward, then it couldn't have been part of the contract... you agreed to it without knowing when it would ship. If they told you before, then was it before they knew your shipping address, or after? If it was beforehand, then again, it should've been clear that they couldn't guarantee a date without knowing the address. But if it was after they got the address and before you placed the order, that makes for a strong case, since the date was specific to your order and part of what you agreed to before placing it.

bluGill|2 years ago

I expect employees to know the correct answers and give them to me. When an employee says something that contradicts the company's policy pages, I take that as a change to company policy - they represent the company.

If the company doesn't agree with that, then it needs to show that the employee was trained on company policy and was disciplined for failing to follow it (on a first offense maybe just a warning, but this needs to be a clear step on the path to firing the employee). Even then, they should stand by their employee if what was said was reasonable (a $1 million refund may be unreasonable, but refunding the purchase price is reasonable).

wredue|2 years ago

This is especially true since, when it comes to refund policies, businesses make it exceedingly difficult to sort through the information.

It’s not the consumer’s fault that the AI hallucinated a result (as they are known to do with high frequency).

manicennui|2 years ago

This line of argument is crazy and infuriating. "Air Canada essentially argued, 'the chatbot is a separate legal entity that is responsible for its own actions,' a court order said." Do they expect people to sue the chatbot? Are they also implying that people have to sue individual agents if they cause a problem?

grotorea|2 years ago

> 27. Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.

https://www.canlii.org/en/bc/bccrt/doc/2024/2024bccrt149/202...

Real legal comedy. Since this was in small claims court, maybe it was an amateur on Air Canada's side?

lenerdenator|2 years ago

If they could reasonably expect to be able to hire people who would agree to accept all liability incurred during their work for the company, they absolutely would.

Same with chatbots. Even better, because once it's "trained", you don't have to pay it.

There have been a few instances in the last few years of expecting digital entities to shoulder the entirety of legal liability; DAOs are another example of this in the crypto space.