(no title)
tomlue | 2 months ago
I used AI to write a thank you to a non-english speaking relative.
A person struggling with dementia can use AI to help recall the words they've lost.
These kinds of messages read to me like they come from people with superiority complexes. We get that you don't need AI to help you write a letter. For the rest of us, it allows us to improve our writing, can be a creative partner, can help us express our own ideas, and obviously has loads of other applications.
I know it is scary and upsetting in some ways, and I agree just telling an AI 'write my thank you letter for me' is pretty shitty. But it can also enable beautiful things that were never before possible. People are capable of seeing which is which.
WD-42|2 months ago
tomlue|2 months ago
edit: Also, I think maybe you don't appreciate the people who struggle to write well. They are not proud of the mistakes in their writing.
Capricorn2481|2 months ago
The writing is the ideas. You cannot be full of yourself enough to think you can write a two-second prompt and get back "your idea" in a more fleshed-out form. Your idea was to have someone/something else do it for you.
There are contexts where that's fine, and you list some of them, but they are not as broad as you imply.
buu700|2 months ago
If someone isn't a good writer, or isn't a native speaker, using AI to compress a poorly written wall of text may well produce a better result while remaining substantially the prompter's own ideas. For those with certain disabilities or conditions, having AI distill a verbal stream of consciousness into a textual output could even be the only practical way for them to "write" at all.
We should all be more understanding, and not assume that only people with certain cognitive and/or physical capabilities can have something valuable to say. If AI can help someone articulate a fresh perspective or disseminate knowledge that would otherwise have been lost and forgotten, I'm all for it.
tomlue|2 months ago
You can use AI to write a lot of your code, and as a side effect you might start losing your ability to code. You can also use it to learn new languages, concepts, programming patterns, etc and become a much better developer faster than ever before.
Personally, I'm extremely jealous of how easy it is to learn today with LLMs. So much of the effort I spent learning things could be done much faster now.
If I'm honest, many of those hours reading through textbooks, blog posts, and technical papers, and iterating a million times on broken code with trivial errors, were really wasted time — time I wouldn't need to lose if I were starting over today.
This is pretty far off from the original thread though. I appreciate your less abrasive response.
minimaxir|2 months ago
tomlue|2 months ago
nkrisc|2 months ago
tomlue|2 months ago
amvrrysmrthaker|2 months ago
qnleigh|2 months ago
I would hazard a guess that this is the crux of the argument. Copying something I wrote in a child comment:
> When someone writes with an AI, it is very difficult to tell what text and ideas are originally theirs. Typically it comes across as them trying to pass off the LLM writing as their own, which feels misleading and disingenuous.
> I agree just telling an AI 'write my thank you letter for me' is pretty shitty
Glad we agree on this. But on the reader's end, how do you tell the difference? And I don't mean this as a rhetorical question. Do you use the LLM in ways that e.g. retains your voice or makes clear which aspects of the writing are originally your own? If so, how?
trinsic2|2 months ago
So I'm sorry, but much of it is being abused, and the parts being abused need to stop.
tomlue|2 months ago
Ideas I often hear usually assume it is easy to discern AI content from human content, which is wrong, especially at scale. Either that, or they involve some form of extreme censorship.
Microtransactions might work by making it expensive to run bots while costing human users very little. I'm not sure this is practical either, though, and it has plenty of downsides as well.
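A rough back-of-the-envelope sketch of why a per-post microtransaction might deter bots (all numbers here are made up for illustration, not a real proposal):

```python
# Hypothetical model: a flat per-post fee that is negligible for a
# typical human poster but expensive for a high-volume bot operator.
FEE_PER_POST = 0.01  # dollars; assumed value for illustration

human_posts_per_day = 5       # assumed typical human activity
bot_posts_per_day = 50_000    # assumed spam-bot activity

# Monthly cost for each (30-day month)
human_monthly_cost = human_posts_per_day * 30 * FEE_PER_POST
bot_monthly_cost = bot_posts_per_day * 30 * FEE_PER_POST

print(f"human: ${human_monthly_cost:.2f}/month")   # $1.50
print(f"bot:   ${bot_monthly_cost:,.2f}/month")    # $15,000.00
```

The same asymmetry is the idea behind proof-of-work anti-spam schemes: the cost scales with volume, so it prices out abuse while staying near-free for ordinary use.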
simonask|2 months ago
You can achieve these things, but this is a way to not do the work, by copying from people who did do the work, giving them zero credit.
(As an aside, exposing people with dementia to a hallucinating robot is cruelty on an unfathomable level.)
cm2012|2 months ago
tomlue|2 months ago
Photographers use cameras. Does that mean it isn't their art? Painters use paintbrushes. It might not be the same thing as writing with pen and paper by candlelight, but I would argue that we can produce much higher-quality writing than ever before by collaborating with AI.
> As an aside, exposing people with dementia to a hallucinating robot is cruelty on an unfathomable level.
This is not fair. There is certainly a lot of danger there. I don't know what it's like to have dementia, but I have seen mentally ill people become incredibly isolated. Rather than pretending we can make this go away by saying "well, people should care more," maybe we can accept that a new technology might reduce that pain somewhat. I don't know that today's AI is there, but I think RLHF could produce LLMs that help reassure and protect sick people.
I know we're using some emotional arguments here and it can get heated, but it is weird to me that so many on Hacker News default to these strongly negative positions on new technology. I saw the same thing with cryptocurrency. Your arguments read as designed to inflame rather than as thoughtful.