
Social Trust Score

4 points | bruhmanfiflo | 1 year ago

So I have this idea for what's called a Social Trust Score. The idea is that it's a system built around identity verification ("Is this person who they say they are?") and trustworthiness ("Can I trust this person?"), using real interactions and connections.

A few examples of use cases for this system are:

Dating: a user can verify their date and know that they are meeting a fairly trustworthy person, based on a combined score of personal connections + social connections + professional connections + some type of time factor.

Goods exchange (online or in person): this person scams vs. this person is extremely trustworthy.

Roommates: while in NY, I was paired with three roommates I'd never met before and placed on one lease. It would have been nice to know whether "this person is an exceptional roommate" vs. "this person seemed nice at first but then wrecked my life, my credit, then my dog."

The list can go on.

I can already hear the "ewwwww" in the room, but here's how the system could be great.

-The system is built on the basis of fostering good behavior. Not just "you did something bad" but "you did something wrong, here's how you can fix it" — basically providing actionable ways to improve one's behavior and self.

-Every trust score is tied to verified meetings or transactions that require both parties to confirm the interaction happened.

-The system will have built-in "gaming" safeguards. Users can't use the system to harm another user's Trust Score, businesses can't harm users, and vice versa.

-Network-based connections, categorized and weighted by context (family, personal, social, professional [+ time factor]) based on actual connections and meetings. No: "I just met this person, but they can vouch for me." Yes: "They're a classmate; I don't know them well, but I've seen how they interact with others."
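To make the scoring idea concrete, here is a minimal sketch of a context-weighted score with a time factor. The category weights, the one-year half-life, and the 0..1 rating scale are all illustrative assumptions, not part of the proposal as stated.

```python
# Hypothetical category weights -- the names come from the post, the
# values are assumptions for illustration.
CATEGORY_WEIGHTS = {
    "family": 0.15,
    "personal": 0.25,
    "social": 0.25,
    "professional": 0.35,
}

def time_decay(days_since, half_life_days=365):
    """Older interactions count for less; the one-year half-life is an assumption."""
    return 0.5 ** (days_since / half_life_days)

def trust_score(interactions):
    """interactions: list of (category, rating in 0..1, days_since) tuples,
    each assumed to be a mutually confirmed meeting or transaction.
    Returns a weighted average in 0..1, or 0.0 with no data."""
    weighted_sum = 0.0
    weight_total = 0.0
    for category, rating, days_since in interactions:
        w = CATEGORY_WEIGHTS.get(category, 0.0) * time_decay(days_since)
        weighted_sum += w * rating
        weight_total += w
    return weighted_sum / weight_total if weight_total else 0.0
```

A weighted average like this means a recent professional interaction moves the score more than a years-old social one, which matches the "[+ time factor]" idea above.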

These are some of my thoughts; just trying to hear others' thoughts on the subject and the ideas.

37 comments


codingdave|1 year ago

I'm assuming you've seen that episode of Black Mirror? https://en.wikipedia.org/wiki/Nosedive_(Black_Mirror)

Because while it is fiction, it also seems really accurate to how such a system would end up going.

fiftyacorn|1 year ago

It's similar to the Orville episode "Majority Rule".

bruhmanfiflo|1 year ago

This seems to be most people's take on it, not to mention China actually did the thing and made the episode real life. Le sigh.

johnneville|1 year ago

There was a startup called Peeple in 2015 which wanted to do this, but people reacted quite negatively and they shut it down. There are certainly already siloed versions of this, such as your banking credit score, your Uber user rating, or your Hacker News "karma", etc., and I think people react more positively to these sorts of ratings compared with an overall rating that applies to all aspects of your behavior.

context: https://en.wikipedia.org/wiki/Peeple_(app)

https://en.wikipedia.org/wiki/Social_Credit_System

bruhmanfiflo|1 year ago

I've noticed this. People tend to be fine with, and even invite, these scoring systems in isolation, but as a whole people cringe. If it's good to know whether an Uber driver or DoorDasher is good at their job, why is it bad to know if someone is a good/bad actor in other parts of their lives?

Terr_|1 year ago

It's not much of an exaggeration to say that humans have been working on making (and breaking) "trust score" systems for thousands of years already, along with a complex web of different kinds of trust.

So while I agree it's an important problem, I am skeptical of any solution which does not refer-to or recognize all the prior art. :p

bruhmanfiflo|1 year ago

Makes sense why you would be, especially when looking at "prior art" you can see where the focus was: basic ratings. In order for something like this to work, it needs to be fairly robust and thought through, especially considering that we know how big business tends to grab hold of things like this for gainful usage.

rvrs|1 year ago

I believe people (who are not deemed guilty of murder or something and imprisoned for life) have the right to start over, and so people have the right to lie. Having a lifelong number assigned to you is evil. This isn't solving any problems. I'm sorry you had bad roommates, but stop.

bruhmanfiflo|1 year ago

The second chance here is the ability to get feedback on a bad action and improve on it. There is always room for, and chances to, improve oneself. I would never create a system that placed a negative on someone's entire life. And even those deemed guilty or imprisoned should have a shot at second chances.

gadders|1 year ago

Kind of ironic that you are pitching this with an ID you created 16 hours ago. Clearly there are some interactions you don't want tied back to your real life identity.

bruhmanfiflo|1 year ago

"tied back to your real life identity" my name and google work together my friend...

RiverCrochet|1 year ago

Voluntary identity verification and reputation systems work well for things like Uber or Airbnb because users want to buy and sell services, so there's a straightforward economic justification for the trust.

How do you get people to use it for things that aren't (supposed to be) straightforward economic activity like dates, without being accused of facilitating prostitution?

I know Uber or Airbnb can be misused, but you need at least a car or room to participate on those platforms.

bruhmanfiflo|1 year ago

This system wouldn't enable Tinder- or cuddle-style interactions, only a ratings system. I guess it's possible that people could start using it for prostitution and adopt coded language to rate those types of interactions; something to consider. I wasn't even aware people used Airbnb for things like that; Uber, sure. Edge cases are definitely worth considering.

scarface_74|1 year ago

Have you read the various AirBnb related subreddits?

verdverm|1 year ago

> The system will have built in "gaming" safeguards. Users can't use the system to harm another user's Trust score, businesses can't harm users vice versa and etc.

You say this as if it is possible, yet I am unaware of any solution to the problem. This is core to your proposal; can you enlighten us on your vision for this piece?

bruhmanfiflo|1 year ago

A few ideas for safeguards: first, tying someone's account to their actual identity, meaning one account per person. This could prevent anonymous ratings, harassment, and swarming behavior.

Tying ratings to actual meetings or verified transactions, preventing multiple reviews of the same interaction from one verified account.

Negative feedback can't be hidden or deleted, only improved on in future transactions, unless deemed harmful by moderators or the community.

Anti-swarming: "hey friends, leave a bad review for this business I hate" wouldn't work, as each of them would need a validated transaction. But this could also have an additional layer of "these socially, professionally, etc. connected accounts have all rated this business badly within (some amount of timing factor); this might need to be flagged."

Context protocol: if someone has only ever left negative reviews, their "credibility score" becomes lower, which in turn lowers the weight of their trust ratings.

If a business suddenly gets a flood of positive reviews, this could be flagged by the system as suspicious behavior for review.

Think less "Yelp-style drive-by ratings" and more "I trust/distrust this business/person and here's why"

jethronethro|1 year ago

How is this materially different from the Social Credit System implemented by the Chinese government?

bruhmanfiflo|1 year ago

Foundationally it's similar. Where it differs: the SCS is government controlled and focuses on surveillance and control through denying access to travel, jobs, and services. A Trust Score system would be a public, self-governing system and simply a tool people can choose to use to make more informed decisions about who they interact with, based purely on real past interactions.

hindsightbias|1 year ago

I would find this useful as long as there is a way to filter out all 5 stars.

nejsjsjsbsb|1 year ago

If you do well then Meta buys you and sells my data.

bruhmanfiflo|1 year ago

If I do well no one buys me

bruhmanfiflo|1 year ago

I found this video interview of the founder of the Peeple app. What stands out is how badly she wants it to be a "positive" app, pretty much hiding and doing away with any type of negative ratings or negativity. But if everybody is so great, then what's even the point? How do you point to someone's bad behavior to help them improve it? It's possible the person might not even know they're doing anything wrong in the first place. Others might be too scared to say something, or just don't want to in person. There's also the issue of safety: I tell someone they've wronged me, they feel wronged by me telling them, and now my face is punched... tangent video below.

https://youtu.be/03DJkpzP-7Y?si=BYMtINM9VN1BXY7f

JojoFatsani|1 year ago

They have this in the PRC already