rvba|1 day ago
I think you should care more about bad actors potentially brigading HN than about asking me random questions about whether this article was written by AI.
For me the article is front-page worthy (in fact it had quite a lot of upvotes), as it brought an interesting point of view and sparked an interesting discussion.
I don't care if it was written by a human or perhaps rewritten by some tool. Substance over form!
To me it looks like the AI proponents were unhappy with this article, so they mass-flagged it. Why don't they like it? Because the article takes a heavy anti-AI stance.
tomhow|1 day ago
The point is that the topic doesn't matter, and whether you or I like the content doesn't matter. The HN community and moderation team have come to a consensus that only original human-authored writing has a place here.
There are only 30 places on the front page, and thousands of submissions each day competing for one of them. It's reasonable for the audience to expect that a post that has made it to the front page has had enough effort invested in it to deserve that place.
Something else I’ve noticed (just today, in part due to this subthread): people are far less inclined to feel negatively towards an LLM-generated article or comment if they agree with it. We need to consciously resist being influenced in this way.