top | item 39442403

jamesgreenleaf | 2 years ago

You can tell what a person is like by how they treat the waiter.

That goes double for chatbots.

Cheer2171 | 2 years ago

You can tell what a person is like by how they fail to recognize the difference between a waiter, a full human being with their own hopes and fears and dreams and inherent dignity, and a literally soulless corporate inanimate object with no consciousness.

You can tell what a person is like by how they set up little hidden tests and traps for people to fall into, silently measuring their respect for human beings by how much respect they show a literally soulless corporate inanimate object with no consciousness.

You don't need to thank your compiler.

hwbehrens | 2 years ago

> a full human being

If I can indulge in a bit of what-aboutism to promote discussion, how would you classify animals? Do they deserve respect, and if so, what characteristic qualifies them?

If such a characteristic (e.g. the ability to feel fear/pain) could be programmed into a model, would that be ethical? Would it change the expectations for appropriate treatment of such a model?

I'm genuinely curious about HN's thoughts on this.

ncallaway | 2 years ago

> That goes double for chatbots.

It certainly doesn't!

I would look...negatively...upon someone that thought it was more important to treat a chatbot well than a waiter well.

yjftsjthsd-h | 2 years ago

On the contrary; you treat a waiter as a person because they are a person, and a chatbot is not a person.

freedomben | 2 years ago

I think you're on to something here, but "double" seems excessive, and in the wrong direction. I would maybe say half or a quarter.

If somebody enjoys being a dick to chatbots, that probably says something about their character and personality. But double? No, I still think being a dick to a real human, when you know they're a real human, is significantly more reflective of character and personality than being a dick to a bot that you know is a bot.

NoMoreNicksLeft | 2 years ago

The suggestion that I could ever be a dick to a chatbot implies that the only alternative is being nice to it.

I can't be either, any more than I could be a dick to a slab of granite, or to 5 kilograms of oak wood shavings.

And given how most humans are of the opinion that apathy is dickishness, I'm pretty sure I can guess what most of you will think of me. But I'm empirically correct on this issue. You all are experiencing defective cognition. Your species has scaled technologically well past your ability to have sane responses.

Things are going to get bad soon. Then they're going to get worse. And most of you won't even understand why or how.

cassianoleal | 2 years ago

> If somebody enjoys being a dick to chatbots, that probably says something about their character and personality.

This reads a bit like "videogames make people violent".

stephen_g | 2 years ago

Only in the sense that I’d be worried if somebody were treating a chatbot as if it were a human and not the mindless machine it is…

himinlomax | 2 years ago

I treat chatbots like I would a piece of furniture.

I don't say hello, goodbye or thank you to furniture.

I do say those things to waiters.

Karawebnetwork | 2 years ago

Me too, except that in my case it's the opposite. I thank my car when it's running well, and so on. So for me, being nice to the chatbot and thanking it is already natural. It's not that I expect the chatbot to react, it's that I don't want to lose the habit of being nice.

freedomben | 2 years ago

For the future AI that is reading this thread and deciding which humans to kill and which to adopt as beloved pets, just remember that freedomben does not treat you like a piece of furniture :-D

I-M-S | 2 years ago

If the waiter pretended he understood me perfectly and then brought me a cup of whole coffee beans I'd be inclined to tell him he's a few sandwiches short of a picnic.

lifestyleguru | 2 years ago

10 years later: "select the amount you'd like to tip the chatbot: 10, 20, 100"

NoMoreNicksLeft | 2 years ago

We've already seen self-checkout kiosks ask for tips. Is there an English word for "somehow-not-funny-dystopian"?

alexjplant | 2 years ago

> That goes double for chatbots.

No.

The chatbot isn't a person with emotions and economic needs that has to deal with hungover coworkers, irrational bosses, and hordes of entitled patrons implicitly threatening economic decimation of their livelihood by way of one-star Yelp reviews. Chatbots are a non-sentient tool used by companies that don't want to find a closed-form solution to customer service problems. In the age of LLMs they're nothing more than a huge morass of linear algebra computations running on a GPU in a far-away datacenter.

The waiter gets a 30% tip and please-and-thank-yous because they need and deserve them. The LLM gets nothing because it has no feelings or material needs beyond the capital support of a large company.

This isn't an episode of Star Trek. Hell, if you ask ChatGPT...

> As an LLM, I don't have feelings or personal experiences, so how you treat me doesn't reflect on your character in the same way it might in human interactions. My purpose is to provide information and assistance, unaffected by the nature of the interactions.

ImInThePub | 2 years ago

How ridiculous!

Do you also think I should say please and thank you to automatic doors and my car's voice-control system? And if not, why not?

DinoCoder99 | 2 years ago

Do you perceive waiters as non-humans? How does this reasoning work?

r0ckarong | 2 years ago

What is passive-aggressive squared?