openmajestic|1 year ago
The natural evolution of this technology is to insert it into human communication channels and automatically transform raw thoughts into something better for the other end of the channel. "Better" is open to interpretation and going to be very interesting in this new world, but there are so many options.
Why not a version of Hacker News that doesn't just have commenting guidelines, but actually enforces them automatically? Or a Chrome extension that takes HN comments and transforms them all to be kinder (or whatever you want) when you open a thread? Or a text input box that automatically rewrites (or proposes a rewrite of) your comments if they don't meet the standards you have for yourself?
atonalfreerider|1 year ago
I was about to start working on something like this. I would like to try browsing the internet for a day where every comment I read is rewritten after passing through a sentiment filter. If someone says something mean, I would pass the comment through an LLM with the prompt: "rewrite this comment as if you were a therapist who is reframing the commenter's statement from the perspective that they are expressing personal pain, and are projecting it through their mean comment"
I find that, 19 times out of 20, really mean comments come from a place of personal insecurity. So if someone says: "this chrome extension is a dumb idea, anti-free speech, blah blah blah", I would read: "Commenter wrote something mean. They might be upset about their own perceived insignificance in the world, and are projecting this pain through their comment <click here to reveal original text>"
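A minimal sketch of the pipeline described above, assuming a crude keyword-based "meanness" gate in front of the reframing step. `rewrite_with_llm` is a hypothetical stand-in for a real LLM API call, and the keyword list is purely illustrative:

```python
# Toy "sentiment filter": flag comments containing hostile keywords.
MEAN_MARKERS = {"dumb", "stupid", "idiot", "garbage", "worthless"}

REFRAME_PROMPT = (
    "Rewrite this comment as if you were a therapist who is reframing the "
    "commenter's statement from the perspective that they are expressing "
    "personal pain, and are projecting it through their mean comment."
)

def looks_mean(comment: str) -> bool:
    """Crude gate: strip punctuation and check for hostile keywords."""
    words = {w.strip(".,!?\"'").lower() for w in comment.split()}
    return bool(words & MEAN_MARKERS)

def rewrite_with_llm(prompt: str, comment: str) -> str:
    """Placeholder for a real LLM call; returns a canned reframing here."""
    return ("Commenter wrote something mean. They may be projecting personal "
            "pain through their comment. <click here to reveal original text>")

def filter_comment(comment: str) -> str:
    """Reframe mean comments; pass everything else through untouched."""
    if looks_mean(comment):
        return rewrite_with_llm(REFRAME_PROMPT, comment)
    return comment
```

In a real extension the gate would itself be a sentiment model or an LLM classification call, but the shape stays the same: classify, then conditionally rewrite, keeping the original text one click away.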
openmajestic|1 year ago
A couple of things are really interesting about this idea. First, it's so easy for the end user to customize a prompt that you don't need to get it right; you just need to give people the scaffolding, and then they can color their internet bubble whatever color they want.
Second, I think that making all comments just a couple of percent more empathetic could be really impactful. It's the sort of systemic nudge that can ripple very far.
joe_the_user|1 year ago
Customer support is one place where I don't want to just "send information". I want to be able to "exert leverage". I want my communication to be able to impel the other actor to take action.
The thing with HN comments is that the guidelines are flexible; even things that violate the guidelines are a kind of communication and play into the dynamics of the site. The "feelings" of HN have impact (for good and for ill, but important either way).
openmajestic|1 year ago
For HN comments, I think you're right. But I think there is still lots of potential there, from tooling that helps dang use his time more effectively, to tooling you can switch on when you're in a bad mood that lets you explore your curiosity while filtering out or transforming the subset of comments you don't have the emotional capacity to deal with well.
The cool thing is that this tech can easily be layered on top of the actual forum (does vertical integration buy you anything? it's certainly crazy expensive), so the user stays in control of which filters and auto-moderation they embrace. Plus, text makes it easy to always drill deeper and see the original as needed.
vunderba|1 year ago
Even an individual extension or system that automatically transforms data into your desired content risks creating an artificially imposed echo chamber.
openmajestic|1 year ago
I think that ship has sailed? Agreed that the ramifications of auto-transforming communication are huge, but I think I'm more optimistic. The internet is a cesspool; improving things seems pretty likely now that empathetic software is no longer the stuff of dreams.
skybrian|1 year ago
I think proposing a rewrite and letting people decide (and make final edits) could work well, though.
pixl97|1 year ago
[strike]Democracy[/strike]Moderation is non-negotiable.
digging|1 year ago
Typically, yes, it is.