item 1852560

Tell HN: Uptick in mean-spirited comments

155 points| mjfern | 15 years ago

I've been visiting Hacker News for just under two years, and recently I've noticed a significant uptick in mean-spirited comments. This echoes a recent comment by PG, "There are more dumb and/or mean comments than there used to be..."

As you are commenting, or up-voting and down-voting comments, please be conscious of how comments might affect others. This is a fantastic community of incredibly bright people. We should be able to engage in thoughtful debate and discussion, while remaining friendly and polite.

Thank you everyone!

278 comments

[+] pg|15 years ago|reply
Incidentally, if anyone has any suggestions for technical changes/features/tricks that would help fix the problem, I'm all ears. Fending off the decline everyone thinks is inevitable for forums is the main thing I work on. It has been since practically the beginning.
[+] maxklein|15 years ago|reply
I believe that the less anonymous a forum like this is, the more polite it will be. You have said you want people to speak to each other here as they would in the real world: well, I believe the more strongly you tie their real world identity to their account here, the more polite they would be.

My suggestion, similar to that experiment in the past and similar to Twitter: add a "verified user" badge to the user's profile. However, becoming verified requires a simple step: add a link to your Facebook, LinkedIn or Xing profile. Any already-verified user can then verify you.

Non-verified users then have a very minor voting hit, for example a comment karma cap at 50 or so.

This will encourage people to associate their real life accounts with their hacker news accounts, and will result in conversation that more accurately reflects what one would say in real life.

The fundamental dynamics of the site would not change, since the advantages of being verified are not that major.

[+] chaosmachine|15 years ago|reply
Downvoted comments are bad for the community. They are literally negative. Remove downvotes. Anything that truly deserves downvotes probably deserves flagging/deading, too.

Downvotes change how you think about reading comments. Instead of "was this helpful?" you're thinking "do I love this or hate this?" It's polarizing, and as the number of users increases, it's easy for a comment to hit -4 in a couple of minutes.

People feel bad when they get downvoted (often because they feel the downvotes are unjust), and some of them turn to trolling or flaming as a defense mechanism. This spreads across threads and infects the whole community. Emotions are contagious.

Also, add the ability to undo your upvote.

[+] davidw|15 years ago|reply
Since you asked... I've always thought you should be a bit more rigorous about eliminating the sorts of "rah rah!" topics that get people fired up. That's not a technical solution, but it's not really a technical problem either.

What I mean by "rah rah" things includes political articles that make people think "right on!" Paul B's article on Proposition 19 falls into that category for me: I strongly agree with him.

But when I step back and look, I think that I don't want to read everyone's "rah rah" articles, because they tend towards flame wars and, if opposing views are trampled to the point where those people go elsewhere, groupthink. I also think that, since they are topics anyone can have an opinion on, they are much more likely to draw people only marginally interested in the core topics of "hacker news". Those people in turn will submit more articles about politics. At a certain point a few days ago, between Prop 19, John Carmack's libertarianish views, and something else I don't recall, the top three spots on HN were occupied by articles that were essentially political in nature.

I would much prefer that articles like that one, much as I agree with it, and think Paul B is an awesome guy and a great contributor here, were ruthlessly pruned from this site.

[+] GHFigs|15 years ago|reply
Replace the total karma score by the username in the upper right of the page with the average that's already visible on the profile page.

An average is a better indicator to the user of how well they are engaging with the site. It only goes up if the user consistently posts worthwhile comments, which rewards being selective about when you post and thoughtful about what you post. Fewer and better is encouraged. Consistently good is encouraged. Lousy comments (like one-on-one arguments and "Me too!") are discouraged even if nobody votes them down.

This bridges the gap between two existing forms of feedback: total karma (which nobody really cares about, even though it's on every page) and the voting on individual comments (which is the closest thing we have to an indicator of comment quality), and gives the user continuous feedback on how much value they are adding to the site. It makes it much easier to know if your commenting is getting better or getting worse. That is, it is much easier to tell that my average is up or down from the last time I looked than it is to detect the fluctuation in the rate at which my karma accumulates.

Even if lousy comments get voted up--as happens when, for instance, someone first-posts an opinionated quip on a controversial submission and people swarm to vote in agreement--the impact on the user's average karma (which better reflects their value to the community) is much lower than the immediate spike in their total karma (the feedback they currently see). You don't have to stop people from writing such comments--just give them a smaller cookie when they do.

Obviously you can't make anybody alter their behavior over a number, but I think most here would if given a number that they could trust as representative of the shared goal of keeping this place from sucking.

tl;dr average karma is better feedback than total karma
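The difference between the two signals can be sketched like this (a hypothetical illustration; the helper names and numbers are invented):

```python
def total_karma(comment_scores):
    # the number currently shown next to the username
    return sum(comment_scores)

def average_karma(comment_scores):
    # the proposed replacement: karma per comment
    if not comment_scores:
        return 0.0
    return sum(comment_scores) / len(comment_scores)

steady = [4, 4, 4, 4]        # consistently worthwhile comments
with_filler = steady + [0]   # plus one "Me too!" that nobody votes on

# total:   16 -> 16   (the filler costs nothing visible)
# average: 4.0 -> 3.2 (the filler shows up immediately)
```

The filler comment leaves the total untouched but visibly drags down the average, which is exactly the "smaller cookie" effect described above.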

[+] demallien|15 years ago|reply
I think the problem comes from the fact that we don't have a tool for expressing that we disagree with the tone of what someone is saying, but we agree with the position of the comment. This means that someone can write a mean spirited comment that happens to support a popular position in this community, and they won't get downvoted. Other people, seeing this, will take it as permission to be rude in their own posts.

I would suggest having a "tone" button that downvotes for tone. The level of fading applied to a comment would be based on whichever of the two scores is lower, tone or voting - so a post with 10 upvotes but -4 tone would be invisible. To avoid abuse of the "tone" feature, it should be possible to recover the user names of those that flag a post for tone.

[+] staunch|15 years ago|reply
Try to prevent people from feeding the trolls.

When people reply to a -4 comment it makes it harder to ignore that comment. It effectively pushes it up on the page. So disable replying to -4 comments. It's probably also necessary to hide any existing replies.

Exhibit A: http://news.ycombinator.com/item?id=1852305

[+] spuz|15 years ago|reply
How about you add a little note just above the reply button. Something like:

  Does this comment add value/insight/healthy debate to this topic?

Very low tech, I know, but sometimes people need a little reminder to think about the tone of their message and how it will be received by the community before they hit that reply button.
[+] ajb|15 years ago|reply
Okay, here's a trick. The goal is to detect people who can't discriminate bad stuff from good, can't restrain themselves from replying to a troll (or can't identify when they are doing so).

The mechanism is, counter-intuitively, to allow people to rate their own comments. However, this rating is not to be directly used to affect how the comment is displayed, since that would obviously be easy to game. Instead, it would be used to identify people who can't restrain themselves from engaging in a low quality discussion; ie, replying to a troll or similar.

Obviously we can't expect trolls to rate their own post as trollish, but we can expect at least some people to have the self-knowledge required to admit when they are posting while annoyed or similar.

So, the system assumes that a comment is only as good quality as its parent, or at least, not much better. Obviously, one can make a good contribution in reply to a bad one, but it's not necessarily a problem to discourage this - since the reader of such a contribution has to read the bad contribution to get to the good one.

Internally, each user has their own scale; their replies, and the replies to their comments, relate their scale to the scales of other users, using the above assumption. A user would then be shown only comments to which they are likely to reply; this means trolls see all the trolls, but quality users see only other quality comments.

There are a couple of potential problems. One is that it would bias people against replying to deeply nested comments; and ultimately, towards making submissions instead of replies. One rather drastic way to deal with this would be to remove the submission mechanism, and only have one conversation tree; the frontpage items would then be the roots of current discussions inferred from the structure of the tree.

The second problem is that while this mechanism would help current users filter out trolls, it would do nothing to help new users find the best discussions. There is a conflict between preventing the system being gamed, and allowing new users to find the best discussions; because the former would prefer the ratings to be secret, but the latter would allow information about them to be public. It is not obvious that there is a workable trade-off.

[+] ramanujan|15 years ago|reply
You have the full HN dataset and voting logs. What if you used it to train a simple classifier that would predict the likely karma of a comment as a function of a number of features, including username, username responded to, bag-of-words vector of the comment, time of day, total number of comments in thread, "controversy" of the comment (as reflected by a mix of up & down votes) and similar factors?

The goal could be something like Google's Beer Goggles, but for comments: "it looks like you're writing a flame, are you sure you want to do that?"

Another thought: there's a big difference between a comment at -1 with 10 upvotes and 12 downvotes vs. one that has occasioned 2 downvotes. Might be interesting to see the effect of showing both the number of up & down votes rather than just the sum. The idea is that genuinely trollish comments at -1 will be differentiated from minority points of view.
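The first idea could be sketched as below. scikit-learn, the toy training data, and the 0.5 threshold are all assumptions for illustration; a real version would use the full feature set described above (username, thread size, time of day, vote logs), not just bag-of-words.

```python
# A minimal sketch of the proposed "are you sure you want to post
# that?" flame classifier, trained on invented toy data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_comments = [
    "you are an idiot and this is garbage",
    "what a stupid, worthless post",
    "interesting point, though here is a counterexample",
    "thanks, the benchmark numbers clarify this a lot",
]
is_flame = [1, 1, 0, 0]

clf = make_pipeline(CountVectorizer(), LogisticRegression())
clf.fit(train_comments, is_flame)

def flame_warning(draft, threshold=0.5):
    """Return True if the draft should trigger the warning prompt."""
    return clf.predict_proba([draft])[0][1] > threshold
```

With the real dataset the labels would come from observed karma rather than hand-labeling, and the predicted score could be shown to the commenter before posting.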

[+] Sukotto|15 years ago|reply
Griefing: If the system does not already, I think it should flag for review users who consistently downvote a particular user.

Cliques: Likewise if the same few people are consistently the first to upvote stories or comments of a particular user.

Multis/Sockpuppets: I think votes for a user by another user from "too close" an IP address should be automatically discounted. I'm not sure how to define "too close" (and I suspect HN already does this).

Anonymous posting: I like the way MetaFilter handles anonymous posting instead of people making throwaway accounts. The downside of this is that 1) Posters have to really trust the site admins; 2) This becomes more work for the admins

Disallow noobs from posting articles: What % of the [DEAD] articles were posted by users with very low karma? What % hit the front page?

(Yes, those last two are not directly related to the OP's comment but I feel they're ideas worth throwing out there)

[+] rjurney|15 years ago|reply
Flagging an entire thread by clicking 'troll' would help. The up/down voting tends to promote trolling, as the first controversial cliche is the top comment, having spawned a deep threaded conversation with lots of emotions and voting. A different symbol is needed. Something that makes people take pause and say, 'Is this a good subtopic of this link, or is it the exact debate anyone would have predicted?'

Put troll threads/comments at the bottom of the page, or hide them. Or use spam ratings to amplify users' votes who up vote non spam comments.

Short of that, add the ability to minimize entire threads.

[+] paolomaffei|15 years ago|reply
It has often been suggested that people should "pay" karma to post and maybe even to comment.

Users should be encouraged to comment LESS, and only when they have something useful to say.

That's what I'm doing anyway: I really don't care about karma and speak only when I have something worth saying. It would be cool if everyone acted like this, since we have no shortage of users.

[+] edw519|15 years ago|reply
You're almost there with your "reply time delay". Just take it all the way...

Limit the number of times one person can post in the same thread to once (or maybe twice), either from the top node or a sub-node. Example:

PersonA says, "blah blah blah..."

PersonB says, "You're an idiot because blah blah blah..."

PersonA is locked out from replying to PersonB.
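The lockout rule can be sketched as follows (the data shapes are invented; this is just the per-thread counting logic):

```python
def may_reply(thread_comments, user, limit=1):
    """thread_comments: list of (author, text) pairs for the whole
    thread, from the top node down.  A user who has already posted
    `limit` times in this thread is locked out."""
    posted = sum(1 for author, _ in thread_comments if author == user)
    return posted < limit

thread = [("PersonA", "blah blah blah..."),
          ("PersonB", "You're an idiot because blah blah blah...")]

# PersonA has used their one post, so the reply link is disabled for
# them; anyone else can still step in on their behalf.
```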

This will do 7 things:

1. It will eliminate argument strings between 2 people.

2. It will remove the incentive to flame because your flame can't turn into an argument.

3. It will encourage people to think through what they have to say, anticipate counter-arguments, and address them in their original comment (or with an update in the 2 hour edit window).

4. It will foster a greater sense of community by encouraging people to stick up for each other. (PersonA made a good comment, PersonB trolled, PersonA can't reply, so I will.) PersonB will get the message much better when confronted by the community instead of PersonA. (Of course, if no one sticks up for PersonA, then maybe they really were wrong.)

5. It will encourage 2 people to take their legitimate discussion (of little interest to others) off-line.

6. It will reduce the length and indentation of threads making them easier to read.

7. It will encourage everyone to slow down and think a little. Those who don't will stick out (not in a good way). Trolls will get more satisfaction somewhere else.

(This is already working by accident. I have gotten so busy that I only visit hn in short bursts. Often, when someone challenges me, I don't reply because I have left and returned much later. But others have replied for me. The resulting thread is invariably better than if I had replied myself.)

[+] 8ren|15 years ago|reply
Two sets of scores and arrows: one for agree/disagree, the other for high/low quality. Separating the concepts would emphasize the issues as distinct, and also make them easier to vote on. The high "quality" score would act like the current "flag" button, but with a finer control. Although a much more complex UI model, it's maybe worth trying.
[+] JesseAldridge|15 years ago|reply
Possible solution 1: Smaller communities; Dunbar's number. I know it's been suggested before, but here it is again... When everybody knows everybody, people are nicer. Just create one unique, isolated instance of the site for every X number of users. Distribute current users randomly (or whatever) among the instances. Assign new users to the least populous instance. If an instance gets over max_number of users, split it in two.

Don't bother trying to do fancy stuff like connecting votes between communities; if something is really good people from each community will submit it separately. Also, it might be nice to be able to move between communities... but that sounds like something for version 2.

You could have these smaller communities run in parallel with the main site if this seems like too big of a shift.

Possible solution 2: Not sure if this really addresses the issue squarely -- but what if, when ranking stories and/or comments, you factor in the number of votes in the intersection between the current user and the submitter/commenter. Something like:

  def votes(username):
      # the ids of the last n items upvoted by username
      ...

  item_score *= len(set(votes(user_x)) & set(votes(user_y))) / (n / 2)

Just use the original algorithm if the user hasn't yet cast n votes.

One man's garbage is another man's gold, as they say.

You could do something similar by checking downvotes as well. If two people are polar opposites (one person upvotes stuff the other downvotes), then comments by one could be greyed out or hidden for the other.

If there were simply a box I could check to make my votes public, then maybe somebody could write a Chrome or Firefox extension to pull this data and handle the re-ranking.

Neither of these two ideas would be perfect, but I think they would improve the user experience significantly. If nothing else they would be interesting experiments.

[+] alphamerik|15 years ago|reply
Have you considered charging for write access? Allow the site to be publicly readable, but charge people a yearly fee to submit stories and posts. They would be much less likely to act abusively if they were paying and their access could be revoked. Several membership levels and tiers of forums could be provided, including private and invite-only.
[+] jdietrich|15 years ago|reply
I haven't figured out anything specific, but I have a couple of thoughts about the karma system.

It occurs to me that the main failure of karma-type systems in general is valuing quantity over quality. This is perfectly rational for a commercial site that seeks to maximise engagement and page views, but it seems to actively harm both comment quality and the community dynamic.

It seems that the prime concern we have for HN is maintaining a high S/N ratio; Why should the karma system not make that desire more explicit? I may be completely wrong here, but I think most of us would like to see a community where people think carefully before they post. If we want to discourage flaming and poor-quality comments, I think we need to think harder about behavioural prods to discourage them.

I propose removing the display of a user's total karma on their user page and replacing it with their average karma. I would like to see voting privileges made contingent on average karma, and would be interested to hear possibilities for punishing users with a high proportion of low- or negative-karma comments.

It is of course dangerous to use heavy-handed technical changes because of the myriad possible unintended consequences, but we all know that internet communities almost inevitably fail. As a community, we should be considering our options now while things are generally OK, perhaps even designing a 'nuclear option' for the possibility that HN experiences some sort of Eternal September scenario.

[+] justlearning|15 years ago|reply
is it too late for me to chip in?

if a user is downvoted >n times, delete the comment with an added penalty factor. this would keep the content clean forever.

Despite the downvoted comment being hidden, it almost stands out as elitist (and nowadays the chain of explanations growing under it usually goes off-topic).

Could we restrict people to one account per IP? I sometimes suspect people of using multiple ids to upvote their comments/posts. There are some posts way out of HN's league that get upvotes within minutes from the new page (even the spammer ones).

Would it be hard to have new users moderated? One way is for new users to write a few lines about themselves; a moderator would check that they are human (not spammers) and genuine.

Show a warning or part of content from http://ycombinator.com/newsguidelines.html upon submission of a post or comment - until they attain their downvote karma rights.

One-liners like 'i like it too' should be deleted. Perhaps there should be a minimum word count for a reply, or a warning asking the user to verify that he really wants to post.

Link searchyc.com at the top of the page beside 'submit' - new users don't know about this awesome search site. While there is a search link at the bottom of the page, I wouldn't use it more than once now that I know it's googlified. SearchYC works, with lots of criteria! imho, this would reduce many 'django vs rails'-type questions that have been discussed in the past.

[+] noodle|15 years ago|reply
i'd like to see two things:

1) a metric based on account age, submission karma, and comment karma. less weight on submission karma, since that is easier to manipulate. make this metric the barrier of entry for features instead of basing it purely on karma. it makes manipulation harder, and it encourages actual community participation/development.

2) add another meta-moderation tool similar to the "flag" option. i mostly use flagging for spam or offensive posts. i'd like to see something else (unlocked by #1) that allows you to mark something that is off topic, non-contributing, or otherwise adding to "the decline". accounts accumulate these like points, and the more points an account has, the more features get revoked from it as punishment (which leads up to a temp ban). points decay with time, and have a hard cap on how many can be obtained in a single day or by a single comment, but a long history of points leads to more points being obtained per flag.
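The metric in point 1 might look something like this (the weights are invented; the source only says submission karma should count for less and account age should count at all):

```python
def trust_score(account_age_days, comment_karma, submission_karma,
                submission_weight=0.25):
    """Combined metric: age and comment karma count fully, submission
    karma is discounted because it is easier to manipulate."""
    return (account_age_days / 30.0     # months of participation
            + comment_karma
            + submission_weight * submission_karma)

def feature_unlocked(score, threshold):
    return score >= threshold

# A karma-farmer with high submission karma but little else scores
# lower than a long-time commenter:
farmer = trust_score(30, 10, 400)     # 1 + 10 + 100 = 111.0
regular = trust_score(720, 150, 40)   # 24 + 150 + 10 = 184.0
```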

edit: just to add some thoughts -- the only way to prevent people from being anonymous internet idiots and adding noise to the signal is either with a carrot or a stick. there's not much we can offer as a carrot that other sites can't, or that they can't get by non-participation (just reading the site). that is, unless you want to implement a private karma-barrier section of the site (unless you already have one and i'm not in it... :( but then i wouldn't know about that carrot and wouldn't change my actions to reflect wanting to obtain that reward).

the only real alternative is the stick, hence the revoking of features.

[+] metamemetics|15 years ago|reply
keep track of submission karma and comment karma separately. People should not be able to downvote in comments unless they have earned the karma threshold required to do so from comments.

Ex. [of what to prevent]: someone submits the latest TechCrunch article first and gets upvotes from everyone else trying to submit it. They can then downvote in comments without first being subjected to a socialization period of earning upvotes from thoughtful comments.

[+] DeusExMachina|15 years ago|reply
Educate people on flagging inappropriate comments, like offensive ones, instead of downvoting. Downvoting can be used for disagreement, but nasty comments deserve to be flagged and then eliminated.

Then, put a heavy fine on it: for example, if a user has a comment eliminated due to flagging, he loses a big amount of karma (20%?) and he cannot comment/submit for one day.

[+] cageface|15 years ago|reply
I'd like to see the karma system dispensed with entirely. I don't think it encourages thoughtful and productive discussions.
[+] ihodes|15 years ago|reply
I want to measure the worth and contribution of a comment; here's how I tried.

    (defn v-rat
      [uv dv]
      (/ (expt uv 3/2)          ; expt from clojure.math.numeric-tower
         (+ uv dv)))

    (defn nest-weight
      [comment]
      (/ 1 (+ (nest-level comment)
              (/ 3 (nest-level comment)))))

    (defn weight
      [ak uv dv user comments]
      (+ (/ ak 2)
         (v-rat uv dv)
         (reduce + (map (fn [c] (* (nest-weight c)
                                   (v-rat (upvotes c) (downvotes c))))
                        (remove (fn [c] (= user (user c)))
                                comments)))))
Where ak is the average karma of the user (only on comments, if possible), uv and dv are upvotes and downvotes respectively, and comments is a list of comments in reply to the comment we're weighing.

=====

Abstractly, I'm trying to consider the comment based on both its individual value as well as how much good discussion it generated. I also take into account, plainly, the average past worth of their comments.

This could certainly use some tweaking, but I like the values I've plugged in so far. It promotes discussion as well as thoughtful contribution, and gives a nod towards users who have posted high-quality comments in the past.

I'd be interested in tweaking it some more if this seems at all a step in the right direction. I think I could use a better heuristic for replies, as well as a more sophisticated v-rat. I also wonder if it would be possible to see which users downvote and upvote, and give those votes more power (technically possible, that is), and also take into account the users who are replying, and to what effect. This would require a lot more testing, and maybe more CPU cycles than PG would like.

I'm not sure taking time into account is a great idea past the first ~10 minutes of submission. For those first ~10 minutes, perhaps weigh the comment solely using the avg karma of the user + some constant. After those first x minutes, rank by the algorithm.

[+] sharpn|15 years ago|reply
I have a suggestion: after first creating an account, users could be redirected to the 'guidelines' page. That might help frame the tone of subsequent comments.
[+] fizx|15 years ago|reply
Can we hide negative comments, ala reddit, slashdot, everywhere else? I hate yielding real estate to trolls.
[+] spinlock|15 years ago|reply
If I could fix this, I'd be launching the next billion-dollar start-up around the technology. Just about every site goes through this type of growing pain: in the beginning it's obscure and spreads by word of mouth, so that all of the people who visit the site share a common ethos; then it grows and starts to attract other people who do not share the spirit that started the site; eventually, the "new" users outnumber the old users and begin to set the standard for the site's personality.

You can try to raise awareness and ask people not to be evil on your site, but that's almost as effective as asking the tide not to come in. There are many bitter people in the world, and they outnumber the well-adjusted people who would rather spend their time building up a project instead of tearing down someone else's. These trolls crave attention - the negative attention they get by aggravating people on-line - so it's really not effective to point out the problem and ask them to stop, or to ask the community to gang up on them; that's what they want.

We need a solution that frustrates these people. I've seen some sites where the comments a troll posts look like they are live on the site to the troll but they are hidden from everyone else. I think this is a very effective technique because it denies the troll the attention he is seeking and it saves the rest of the site from his attacks.

But, perhaps the best response is to remember Margaret Thatcher's advice about people trying to make you angry (i.e. no one can make you angry without your permission). If you see something that's not in the spirit of hacker news, just ignore it. Don't respond; don't show it to other people; just let it die a bitter and lonely death.

[+] Kaizyn|15 years ago|reply
Why not do something quite simple? Have it set up so that if a user is over a certain karma threshold, they can flag another user's post as mean-spirited or such. If a certain number of users also flag the comment, then the poster does not get any of the karma points the comment may have earned. Alternately, the comment flagging could be coupled with a Slashdot-style moderation and meta-moderation system.
[+] WiseWeasel|15 years ago|reply
Downvotes could be made precious by limiting the number one could make each day, ensuring that people will use them more carefully.

It could then be conceivable to give the downvotes more weight, perhaps putting users in a karma hole, with their posts starting out at 0 points, appearing at the bottom of the comments page, if they have accrued too many downvotes. They would then have to dig themselves out by posting more valuable comments.
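A sketch of rationed downvotes, with the per-day limit and data shapes invented for illustration:

```python
class DownvoteBudget:
    """Each user gets a fixed number of downvotes per day, making
    them precious enough to spend carefully."""

    def __init__(self, per_day=5):
        self.per_day = per_day
        self.used = {}  # (user, date) -> downvotes spent that day

    def try_downvote(self, user, date):
        """Record a downvote if the user still has budget on `date`;
        return whether the vote was accepted."""
        key = (user, date)
        if self.used.get(key, 0) >= self.per_day:
            return False
        self.used[key] = self.used.get(key, 0) + 1
        return True
```

The karma-hole penalty would then hang off the same accounting: a user whose comments accrue too many of these (now scarcer) downvotes starts subsequent posts at 0 points.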

[+] fragmede|15 years ago|reply
Force users to give a reason when voting.

This would allow two things. First, users can downvote for specific reasons - mean-spirited/dumb/insulting/off-topic. If a comment is technically correct but unnecessarily mean, I can record that distinction.

Secondly, it would provide feedback to the commenter. If my 'me too!' comments are being downvoted, it'd be great to have some reason why.

[+] mahmud|15 years ago|reply
FWIW, I discovered that people take disagreements personally and will stalk you across threads if you're not careful with their egos.

For some reason I am incapable of remembering names and can only identify about 10 - 15 people here other than the ones I have met or know from somewhere else. So the whole thing feels to me like detached, anonymous discussion. I was surprised to be 1) remembered, and 2) "punished" in another, unrelated context.

There are also times when I withhold contributing to a thread because of its strong fanboy/flamer possibilities. Certain companies have .. enthusiastic supporters/haters.

[+] 8ren|15 years ago|reply
pg's quoted comment http://news.ycombinator.com/item?id=1833255

I've noticed that if you reply to a mean-spirited comment with a straightforward and dispassionate counter-argument, completely ignoring its rudeness and tone, you get massive upvotes. Transcending meanness also feels really clean; it is neutralized.

By framing the meanness, the site's collegiality is not undermined, but demonstrated.

[+] ihodes|15 years ago|reply
Not trying to be passive-aggressive here, at all; I agree with you. The reason I didn't post anything was that HN has gotten pretty damn meta recently too. I'm not sure posts like this will help much; probably something needs to come from the top, or the community needs to focus on enforcing the behavioral norm by down-voting (even if they normally wouldn't put in the effort) comments that are of an assholish nature.

As an aside: http://twitter.com/#!/ihodes/status/29059931658

[+] tlrobinson|15 years ago|reply
I've been on HN for a few years, and I'm usually extremely skeptical of people claiming a decline in the quality of HN, but I agree, and furthermore I have noticed an increase in acceptance of clever but empty Reddit-esque one-liners. I don't mind them, but I think it's unfortunately a very slippery slope...
[+] jrockway|15 years ago|reply
No social news site is going to be perfect. If it's people that are writing comments, some comments will be good and some will be bad. Read them and decide for yourself. If a comment hurts your feelings, get over it. If you like a comment, upvote it, and help other people find the best comments.

Really, I kind of get tired of reading stuff like this, because there is always some subtext to it. "Mean-spirited" usually means "someone disagreed with me". If people disagree with you, you can either get over it or learn to argue better. Passive-aggressive "oh noes everyone is so mean" is not going to change what people think. Good writing will.

[+] daeken|15 years ago|reply
I upvoted this because I agree with the concept, but I don't believe such things are necessary. General rule of life: assholes will be assholes. This isn't going to make anyone more polite. Most of us are already polite, and quite content to simply downvote people who aren't.
[+] _delirium|15 years ago|reply
This is only for a subset of comments, but one common case that's sometimes sort of accidental: a commenter disagrees with a submitted article strongly, but doesn't realize that the article is written by the submitter or by another HN user. If they had, they probably would've phrased their disagreement differently.

Of course it's nice to be polite when disagreeing with any article, but I notice myself making more of an effort to be polite when disagreeing with an HN user's own submission, compared to disagreeing with, say, a New York Times article, because the NYT reporter isn't present or going to read my comment. HN's greater volume of user-written content (compared to, say, reddit) makes this something not everyone expects to encounter; it's like ranting about how Daikatana sucks and then realizing that John Romero was standing right next to you, and then you feel bad.

(This doesn't explain mean-spirited comments that are responding to other comments, though.)

[+] sp4rki|15 years ago|reply
Sometimes mean-spirited comments can be full of usable ideas, or at least they can fuel a debate between contrasting opinions. Remember that what looks like a mean-spirited comment to you might be taken a completely different way by someone else.

In any case, I'd say that more important than mean comments is the "downvote everything I dislike" habit, applied regardless of whether a comment contributes to the discussion in any way. It stalls actual conversation between differing opinions.

[+] pointillistic|15 years ago|reply
Correct, the "mean-spirited comments" are part and parcel of every open online community. This is the flip side of anonymity and what Jaron Lanier calls "the mob switch" (google it). In other words, the human tendency to gang up/gang down, up-vote/down-vote in batches, regardless of content.

I also noticed that polite but critical comments are often down-voted here if they don't share the "main theme" of a post. Again, this is part and parcel of every online community. The alternative is the one-liners and up-likes you get on Facebook, the same Facebook that coincidentally is responsible for the internet-wide decline of commenting culture.

[+] FiddlerClamp|15 years ago|reply
I have become a lot more circumspect since I've been on the receiving end of things. I used to slam books I hated on Amazon, but after being published myself, I realized how crushing something like that can be.

Just because I think www.linttrapsforsale.com is a dumb idea doesn't mean that there wasn't a team behind it that thought they had a great vision, emitted blood and sweat in pursuit of it, and hung their hopes and dreams on it.

Short of replying to things that are actively malicious (bigoted, etc.), I try to rein in my online 'road rage' and let other people take things to task. There are so many of them out there willing to do it, in any event...

[+] mvalente|15 years ago|reply
Reasons: "The difference between the two societies is that in the society which performs poorly:

a) the stupid members of the society are allowed by the other members to become more active and take more actions; b) there is a change in the composition of the non-stupid section with a relative decline of populations of areas I, H1 and B1 and a proportionate increase of populations H2 and B2."

http://www.searchlores.org/realicra/basiclawsofhumanstupidit...

Solutions: "1.) If you were going to build a piece of social software to support large and long-lived groups, what would you design for? The first thing you would design for is handles the user can invest in. 2.) Second, you have to design a way for there to be members in good standing. Have to design some way in which good works get recognized. The minimal way is, posts appear with identity. You can do more sophisticated things like having formal karma or "member since." 3.) Three, you need barriers to participation. This is one of the things that killed Usenet. You have to have some cost to either join or participate, if not at the lowest level, then at higher levels. There needs to be some kind of segmentation of capabilities. 4.) And, finally, you have to find a way to spare the group from scale."

http://www.shirky.com/writings/group_enemy.html

[+] stukhomsimdrone|15 years ago|reply
Considering the perspective of a logged-in user: simply give them the opportunity to deselect others by username, and then record their deselections with their account. This wouldn't require unique pages to be served, except for a deselected_users array to be picked up and processed on the client side. The same lightweight approach can be used to give shape to a user's own view with categories like 'preferred users', 'outright scum', etc. The same principles apply to personal topic preferences and their viewed ranking.
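The client-side filtering described above could be sketched roughly as follows. This is a minimal sketch, not HN's actual code: the `deselected_users` name comes from the comment, while the comment-object shape (`username`, `visible`) and the function name are hypothetical.

```javascript
// Hide comments from deselected users purely on the client side.
// The server only has to ship one extra array (deselected_users);
// the rendered page is identical for every user.
function applyDeselections(comments, deselectedUsers) {
  const hidden = new Set(deselectedUsers); // O(1) membership checks
  return comments.map(c => ({ ...c, visible: !hidden.has(c.username) }));
}
```

The rendering layer would then simply skip (or collapse) any comment whose `visible` flag is false, so no per-user page generation is needed.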
[+] csomar|15 years ago|reply
As someone who was a "bad" user (under another username), posting rather useless, short, short-sighted comments and submissions, I think the best way is to "educate people".

This is not only an HN culture thing, it's culture in general. Limit the freedom of newbies until they are more mature. When you are committed to the community, you become more responsible and work toward building a name and friendships. When you are new, you pretty much don't care.

[+] gsivil|15 years ago|reply
I have noticed that a negative comment (against the main point of the post) sometimes tends to be initially down-voted out of a superficial prejudice that it must also be mean-spirited.

It seems that the community naturally upvotes that comment to support diversity of opinions.

Of course, such support tends to bring some of the provocative comments up the list, in my view.

[+] Mz|15 years ago|reply
Incidentally, if anyone has any suggestions for technical changes/features/tricks that would help fix the problem, I'm all ears. Fending off the decline everyone thinks is inevitable for forums is the main thing I work on. It has been since practically the beginning.

My observation has been that hackers tend to look for technical solutions for moderating a forum but forum moderation is a people problem, not a technical problem. Deal with people first, tweak technical stuff second.

My experience has been that an assumption of ignorance is generally more accurate and more productive than an assumption of guilt. The antidote to ignorance is easy access to information. A prominent (top bar) link explaining how the forum works (both technically and socially) and allowing for "dumb" questions would likely do more to promote polite, thoughtful interaction than a million assumption-of-guilt based tweaks to the voting system.

I am not really a hacker, though I do know a little (x)html and css and I am not new to online forums (and have been a moderator in a forum). I found the extremely minimalist design of HN hard to fathom. I did not understand why I sometimes couldn't find the "reply" button and on several occasions I replied anyway, via another means, and then sometimes edited it once the reply button appeared. In spite of the fact that the "logo" being linked to the "home" or "front" page is standard practice, it took me months to figure out that was the link to the 'top stories' everyone talked about so much. I just didn't view the words 'Hacker News' as being part of the menu and I just didn't make the connection.

With so many more people from so many different walks of life and even countries now using the forum, there are probably quite a lot of people who, like me, just don't initially have the info they need to effectively participate. Frustrated people who don't know how to make something work often sound gruffer/have poor 'tone'. Enlightening them and assisting them will likely do a lot more for their tone than finding creative, subtle ways to secretly spank them for it (a la "the beatings will continue until morale improves").

Good luck with this.

Edit: I probably put this in the wrong place. I am not having a good day. I have added pg's remarks (at the top), which is basically what I am replying to. I hope that makes more sense. :-/

[+] codefisher|15 years ago|reply
How about people with low karma (like myself) not being allowed to comment on articles that appear on the front page? Then only people really interested in following HN, and in trying to build some kind of reputation in the community, would be able to get much exposure. Trolls, I would think, are much more interested in putting out comments that a large number of people will see.