top | item 46392423


tomlue | 2 months ago

I think what all these kinds of comments miss is that AI can help people express their own ideas.

I used AI to write a thank-you to a non-English-speaking relative.

A person struggling with dementia can use AI to help remember the words they lost.

These kinds of messages read to me like people with superiority complexes. We get that you don't need AI to help you write a letter. For the rest of us, it can improve our writing, be a creative partner, help us express our own ideas, and obviously it has loads of other applications.

I know it is scary and upsetting in some ways, and I agree just telling an AI 'write my thank you letter for me' is pretty shitty. But it can also enable beautiful things that were never before possible. People are capable of seeing which is which.


WD-42|2 months ago

I’d much rather read a letter from you full of errors than some smooth average-of-all-writers prose. To be human is to struggle. I see no reason to read anything from anyone if they didn’t actually write it.

tomlue|2 months ago

If I spend hours writing and rewriting a paragraph into something I love while using AI to iterate, did I write that paragraph?

edit: Also, I think maybe you don't appreciate the people who struggle to write well. They are not proud of the mistakes in their writing.

Capricorn2481|2 months ago

> These kinds of messages read to me like people with superiority complexes. We get that you don't need AI to help you write a letter. For the rest of us, it allows us to improve our writing, can be a creative partner, can help us express our own ideas

The writing is the ideas. You cannot be full of yourself enough to think you can write a two-second prompt and get back "your idea" in a more fleshed-out form. Your idea was to have someone/something else do it for you.

There are contexts where that's fine, and you list some of them, but they are not as broad as you imply.

buu700|2 months ago

As the saying goes, "If I'd had more time, I would have written a shorter letter". Of course AI can be used to lazily stretch a short prompt into a long output, but I don't see any implication of that in the parent comment.

If someone isn't a good writer, or isn't a native speaker, using AI to compress a poorly written wall of text may well produce a better result while remaining substantially the prompter's own ideas. For those with certain disabilities or conditions, having AI distill a verbal stream of consciousness into a textual output could even be the only practical way for them to "write" at all.

We should all be more understanding, and not assume that only people with certain cognitive and/or physical capabilities can have something valuable to say. If AI can help someone articulate a fresh perspective or disseminate knowledge that would otherwise have been lost and forgotten, I'm all for it.

tomlue|2 months ago

This feels like the essential divide to me. I see this often with junior developers.

You can use AI to write a lot of your code, and as a side effect you might start losing your ability to code. You can also use it to learn new languages, concepts, programming patterns, etc and become a much better developer faster than ever before.

Personally, I'm extremely jealous of how easy it is to learn today with LLMs. So much of the effort I spent learning things could be done much faster now.

If I'm honest, many of those hours reading through textbooks, blog posts, and technical papers, and iterating a million times on broken code with trivial errors, were really wasted time, time I wouldn't need to lose if I were starting over today.

This is pretty far off from the original thread though. I appreciate your less abrasive response.

minimaxir|2 months ago

That is not what is happening here. There is no human in the loop, it's just automated spam.

tomlue|2 months ago

Good point. My response was to the comment, not the OP.

nkrisc|2 months ago

Well your examples are things that were possible before LLMs.

tomlue|2 months ago

This is disingenuous

amvrrysmrthaker|2 months ago

What beautiful things? It just comes across as immoral and lazy to me. How beautiful.

qnleigh|2 months ago

> People are capable of seeing which is which.

I would hazard a guess that this is the crux of the argument. Copying something I wrote in a child comment:

> When someone writes with an AI, it is very difficult to tell what text and ideas are originally theirs. Typically it comes across as them trying to pass off the LLM writing as their own, which feels misleading and disingenuous.

> I agree just telling an AI 'write my thank you letter for me' is pretty shitty

Glad we agree on this. But on the reader's end, how do you tell the difference? And I don't mean this as a rhetorical question. Do you use the LLM in ways that e.g. retains your voice or makes clear which aspects of the writing are originally your own? If so, how?

trinsic2|2 months ago

I hear you, and I think AI has some good uses, especially assisting with challenges like you mentioned. I think what's happening is that these companies are developing this stuff without transparency on how it's being used, there is zero accountability, and they are forcing some of this tech into our lives without giving us a choice.

So I'm sorry, but much of it is being abused, and the parts of it being abused need to stop.

tomlue|2 months ago

I agree about the abuse, and the OP is probably a good example of that. Do you have any ideas on how to curtail abuse?

Ideas I often hear usually assume it is easy to discern AI content from human, which is wrong, especially at scale. Either that, or they involve some form of extreme censorship.

Microtransactions might work by making it expensive to run bots while costing human users very little. I'm not sure this is practical either, though, and it has plenty of downsides as well.
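The economics behind the microtransaction idea can be sketched in a few lines. This is purely illustrative: the fee and posting rates are assumptions, not data from any real platform.

```python
FEE_PER_POST = 0.01  # hypothetical fee, in dollars


def monthly_cost(posts_per_day: float, fee: float = FEE_PER_POST) -> float:
    """Cost of posting at a given daily rate for 30 days."""
    return posts_per_day * fee * 30


# A chatty human posting ~5 times a day pays pocket change;
# a spam operation posting ~50k times a day pays a real bill.
human = monthly_cost(5)
bot_farm = monthly_cost(50_000)

print(f"human:    ${human:.2f}/month")
print(f"bot farm: ${bot_farm:,.2f}/month")
```

The asymmetry is the whole point: the same flat fee is negligible at human posting rates but scales linearly with spam volume.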

simonask|2 months ago

I’m sorry, but this really gets to me. Your writing is not improved. It is no longer your writing.

You can achieve these things, but this is a way to not do the work, by copying from people who did do the work, giving them zero credit.

(As an aside, exposing people with dementia to a hallucinating robot is cruelty on an unfathomable level.)

cm2012|2 months ago

Do you feel the same about spellcheck?

tomlue|2 months ago

> I’m sorry, but this really gets to me. Your writing is not improved. It is no longer your writing.

Photographers use cameras. Does that mean it isn't their art? Painters use paintbrushes. It might not be the same thing as writing with pen and paper by candlelight, but I would argue that we can produce much higher-quality writing than ever before by collaborating with AI.

> As an aside, exposing people with dementia to a hallucinating robot is cruelty on an unfathomable level.

This is not fair. There is certainly a lot of danger there. I don't know what it's like to have dementia, but I have seen mentally ill people become incredibly isolated. Rather than pretending we can make this go away by saying "well, people should care more", maybe we can accept that a new technology might reduce that pain somewhat. I don't know that today's AI is there, but I think RLHF could develop LLMs that might help reassure and protect sick people.

I know we're using some emotional arguments here and it can get heated, but it is weird to me that so many on Hacker News default to these strongly negative positions on new technology. I saw the same thing with cryptocurrency. Your arguments read as designed to inflame rather than as thoughtful.