(no title)
TokyoKid | 6 years ago
I feel the solution is to aggressively enforce rules against toxic behaviour in order to keep the broader user base from leaving. The users who remain are the ones who interact online the same way they would offline: in a more human manner.
The problem is that platforms often follow the models of the popular platforms before them. Toxic validation gangs, "edge lords" who depend on anonymity, are a measurable target market because previous platforms provided figures on how much engagement they bring in. What I think isn't accounted for is _how much engagement they drive away_, and how much more engagement could be gained by removing them than by attracting them.
Reading your question, it's slightly unclear whether you want more in-person interaction, or whether you see this as a means to create better digital interaction (or perhaps to bridge the gap both ways).
In-person or otherwise, platforms should stop seeing themselves as just turn-key operations: a bunch of code that automates everything. Just as a physical venue (a hotel, bar, restaurant, etc.) has rules and security, a good social network should take on more responsibility for ensuring comfort and safety.
Arranging meetings between online strangers is risky, and those aware of the risks won't show up; people with malicious intentions will. Consider Pokemon Go: it places attractions not in the wilderness but at churches, restaurants and parks. (Even this system is imperfect. I've had unsupervised pre-teens hang around and chat with me as we caught Pokemon in a park. Predators could easily do the same.)
So I guess the short answer is: you should do more than create an app; you should ensure safety. I've seen innumerable meet-up app ideas from newcomer developers, students and entrepreneurs. The ones that succeed, like Meetup.com, "Tweet Ups" and Pokemon Go, do so because they at least ensure public places and group settings (rather than one-on-one meetings).