quartz | 2 years ago
This is very reasonable-- AI or not, companies can't expect consumers to know which parts of their digital experience are accurate and which aren't.
BugsJustFindMe|2 years ago
> Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives
That includes EMPLOYEES. So they tried to argue that their employees can lie to your face to get you to buy a ticket under false pretenses, and then refuse to honor the promised terms? That's absolutely fucked.
RajT88|2 years ago
I once booked a flight to meet my then-fiancee in Florida on vacation. Work travel came up unexpectedly, and I booked my work travel from ORD > SFO > TPA.
Before I made that booking, I called the airline specifically to ask whether skipping the ORD > TPA leg of my personal travel was going to cause me problems. The agent confirmed, twice, that it would not. This was a lie.
Buried in the booking terms is language meant to discourage gaming the system by booking travel and then skipping certain legs: if you skip a leg of your booking, the whole thing is invalidated. It's not suuuuper clear, and I had to read it a few times, but I guess it kinda said that.
Anyways - my itinerary was invalidated by skipping the first flight. I got lucky: someone canceled at the last minute, so I could buy my own seat back on the now-full flight, for 4x the original ticket price I paid (which was not refunded!).
I followed up to try to get to the bottom of it, but they insisted they had no record of my earlier call and just fell back on "It's in the terms, and I do not know why you were told wrong information." Very painful lesson to learn.
I now try to make a habit of recording phone conversations with agents, where it's legal in the place I'm physically located at the time.
mschuster91|2 years ago
Pretty standard behavior for big companies. Airlines and telcos are the utter worst... you get agent A on the phone on Monday, who promises X will be done by Wednesday. Thursday, you call again and get agent B, who says he doesn't see anything from you, not even a call log, but of course he apologizes and it will be done by Friday. (Experienced customers of telcos will know that the drama unfolds that way for months... until you're fed up and involve lawyers.)
ysavir|2 years ago
It's the degree of misinformation that's relevant.
wing-_-nuts|2 years ago
I wrote back that unless they issued a refund, I would issue a chargeback. You don't get to present the customer with one thing and then do otherwise because you said so on a page the customer never read when ordering.
They eventually caved, but man, the nerve.
dataflow|2 years ago
E.g., did they tell you the shipping date after you placed the order, or before? If it was afterward, then it can't have invalidated the contract... you agreed to it without knowing when it would ship. If they told you before, then was it before they knew your shipping address, or after? If it was beforehand, then again, it should've been clear that they wouldn't be able to guarantee it without knowing the address. If it was after they got the address but before you placed the order, then that makes for a strong case, since it was specific to your order and what you agreed to before placing it.
bluGill|2 years ago
If the company doesn't agree to that, then they need to show that the employee was trained on company policy and was disciplined for failing to follow it (on a first offense maybe just a warning, but it needs to be a clear step on the path to firing the employee). Even then, they should stand by their employee if the thing said was reasonable: refunding you $1 million may be unreasonable, but refunding the purchase price is reasonable.
wredue|2 years ago
It’s not the consumer’s fault that the AI hallucinated a result (as they are known to do with high frequency).
grotorea|2 years ago
Real legal comedy. Since this was in small claims court, maybe it was an amateur on Air Canada's side?
lenerdenator|2 years ago
Same with chatbots. Even better, because once it's "trained", you don't have to pay it.
There have been a few instances in the last few years of expecting digital entities to shoulder the entirety of legal liability; DAOs are another example of this in the crypto space.