item 40502170


ahzhou|1 year ago

While it may not happen for you, “too lazy to look it up” is the vast majority of CS requests.

My understanding from talking to a couple of CS execs is that these have been a slam dunk in terms of ROI because CS agents don’t need to handle type C requests. I expect we’ll only see more as time goes on.

awofford|1 year ago

My guess is the ROI is provided by people giving up before they actually get help from a human.

jacobr1|1 year ago

I've analyzed support ticket requests before, and that doesn't seem to be the case, at least for the two times I've done it: 1) IT support tickets for a local school, and 2) tickets for a B2B SaaS app. In both cases the majority of tickets were for things that seemed obvious to me: if the user had just bothered to spend ten seconds looking, they would have figured it out. But they didn't. Some training helped on the IT side, and some UX improvements helped in the SaaS app, but the bar is _sooo_ much lower than many expect.

bgirard|1 year ago

I wonder that too. If you only measure one part of the funnel (e.g. CS costs) and not the total funnel (e.g. losses due to poor CS quality, like a customer dropping the project), then it's easy to conclude that making CS more painful is a win.

freedomben|1 year ago

I don't doubt you, but if that's the case why not make it easy to get to a human? I'm fine explaining my problem to a robot, but if (when) they don't understand what I'm saying, hand me off to a human! For example, it's maddening to call the pharmacy and go through something like this:

    Pharmacy Robot: Hello, thanks for calling <pharmacy>. What can I do for you? You can say anything like, "Check pharmacy hours" or "order a refill".

    Me: Hi, I have a refill for <specific medication with rules around it> that is due next week but I'll be traveling out of the country to <other country> for a couple of weeks. I need to know what my options are.

    Pharmacy Robot: Ok, you want a refill. Please enter the prescription number now.

    Me: No, if we try to refill it, the automated system will just reject it. I need to talk to a h...<cut off by robot>

    Pharmacy Robot: Sorry, I didn't get that number. Using your phone's keypad, enter the number of your prescription refill.

    Me: Jesus Christ, do I have to hang up and go through this whole thing ag... <cut off by robot>

    Pharmacy Robot: Sorry, I didn't get that number. Using...<cut off by human hanging up>

That's just the most recent one I had. There are often better examples of madness...

mike_hearn|1 year ago

Because unless the chatbot is both better than a human in every way, and everyone knows that, the first thing people will do is push the button to reach the human. Why wouldn't they? They're calling in the first place because they don't want to make an effort to use the available tools to answer their question. They want a human.

ahzhou|1 year ago

To be fair, LLM-based chatbots are much better about this, because you don't need to discover the magic incantation to talk to a human. It's a trade-off, though, because that same flexibility introduces the possibility of hallucination.

malfist|1 year ago

They're especially slam dunks when they don't provide you with a way to get out of the automated, useless system. Looking at you, Amazon.

thaumasiotes|1 year ago

You can get out of the automated useless system. They don't make it easy.

But I once managed to get through to an actual agent with this question:

1. I want to buy a kindle version of this book [amazon link, for the paper version of the book].

2. On the page for the book, there is a link for the kindle edition: [link].

3. That link goes to a page for what appears to be an entirely different book. (Under the same name; this was an edition of the Arabian Nights.)

4. However, I have independently found this page: [link], which appears to be for the kindle version of the book I'm interested in.

5. Given that I want to buy the kindle version of the book linked up in step (1), which one should I purchase?

The agent directed me to buy the book that purported to be the book I wanted, instead of the book that Amazon believed was the book I wanted but which claimed to be something different. I would have assumed that anyway. But a couple days later I checked on the book and the "kindle version" link for the paper version had been corrected.

Unfortunately, while they did correct the issue on the one book that I took the time to point out to them, it's still rampant all over their website.

codeduck|1 year ago

shouting "speak to a fucking human" repeatedly seems to work, though i may be suffering from confirmation bias

RheingoldRiver|1 year ago

Type `agent` repeatedly into the chatbot and it will let you request a callback.

ackfoobar|1 year ago

"Getting through chat bots to get to a human" is the new "getting through tech support to get to an engineer".

https://xkcd.com/806/

ASalazarMX|1 year ago

I think the path is now Chat bot -> Help Desk -> Engineer.