sebasv_ | 29 days ago

I feel like your comment is in itself a great analogy for the "beware of using LLMs in human communication" argument. LLMs are in the end statistical models that regress to the mean, so by design they flatten out our communication, much like a reductionist summary does. I care about the nuance we lose when communicating through "LLM filters", but others apparently don't.

That makes for a tough discussion, unfortunately. I see a lot of value lost by having LLMs in email clients, and I don't observe the benefit; LLMs are a net time sink because I have to rewrite their output myself anyway. Proponents seem to not see any value loss, and they do observe an efficiency gain.

I am curious to see how the free market will value LLM communication. Will the lower quality, higher quantity be a net positive for job seekers sending applications or sales teams nurturing leads? The way I see it, either we end up in a world where e.g. job matching is almost completely automated, or we find an effective enough AI spam filter and are effectively back to square one. I hope it will be the latter, because agents negotiating job positions is bound to create more inequality, with all jobs getting filled by the applicants who hire the most expensive agents.

Either way, so much compute and human capital will go to waste.

fragmede | 29 days ago

> Proponents seem to not see any value loss, and they do observe an efficiency gain.

You get to start by dumping your raw unfiltered emotions into the text box and have the AI clean it up for you.

If you're in customer support and have to deal with dumbasses all day long who are too stupid to read the fucking instructions, I imagine being able to type that out, and then have the AI remove the profanity and not insult the customers, would be rather cathartic. Then substitute "read the manual" for something actually complicated to explain.

NateEag | 29 days ago

> You get to start by dumping your raw unfiltered emotions into the text box and have the AI clean it up for you.

Anyone semi-literate can write down what they're feeling.

It's sometimes called "journaling".

Thinking through what they've written, why they've written it, and whether they should do anything about it is often called "processing emotions."

The AI can't do that for you. The only way it could would be by taking over your brain, and then you wouldn't be you anymore.

I think using AI to skip these activities would be very bad for the people doing it.

It took me decades to realize there was value in doing them, and my life changed drastically for the better once I did.