not_math|3 years ago
There are a lot of articles and videos showing that watching just one video can flood your suggestions with related videos, even if you are not interested. Machine learning and deep learning are not perfect, and sometimes a company's goal is unclear and may not align with yours.
Sure, your experience on Reddit, Facebook, Instagram, or TikTok will vary based on the people you follow; that's the goal of a hyper-personalized feed. But you still get trends, a social effect of the network.
For example, on YouTube you need "clickbait" thumbnails. So even Tom Scott, whose content is educational and entertaining, needs to follow YouTube's "trend" to get views.
But I see these comments every time someone complains about the weird content in their feed: "Oh, me, I only see nice stuff, stop watching weird stuff." I think we can have a deeper conversation than that.
falcolas|3 years ago
> TikTok livestreams are a bizarre experience that doesn't get fully conveyed by this article.
> It's a wild show that you don't ask for, it gets thrust on you every once in a while when scrolling.
However, the reality is that what you see on TikTok really is a direct reflection of you. It's not an accusation, just an acknowledgement of the truth.
TikTok's algorithms are scarily effective (dramatically better at tailoring than all of your other examples), and thus what you see is indeed a direct reflection of what you watch. No single video (ads excepted), or even trend, appears globally on TikTok.
mattnewton|3 years ago
It's a direct reflection on what tiktok's algorithms think they know about you so far, modulo what they think they know about the contents of the videos they are showing. They have a good recommendation engine, sure, but it works on average over large populations through the limited funnel of video interactions, and their video understanding and inventory is similarly limited. This is even before considering that they clearly add some kind of extra exploratory weight to new content they don't have a lot of data about from you.
Their machine learning is basically fancy statistics on watch data, not an oracle peering into your soul; it's entirely possible that it gets some people's preferences very wrong while still being profitable across larger population segments.
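To make "fancy statistics on watch data" concrete, here is a toy sketch of the generic shape such a recommender takes (this is not TikTok's actual system; the embeddings, impression counts, and exploration bonus are all invented for illustration). It scores videos by a dot product of learned user and video vectors, plus a bonus for videos the system has little data on, mirroring the "extra exploratory weight" for new content mentioned above:

```python
import math
import random

random.seed(0)

# Hypothetical learned embeddings: one vector per user, one per video.
# In a real system these would be fit to watch-time data at huge scale.
DIM = 8
user_vec = [random.gauss(0, 1) for _ in range(DIM)]
video_vecs = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(5)]
impressions = [500, 900, 40, 3, 1200]  # times each video has been shown

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Predicted affinity plus an exploration bonus for videos with little
# data -- a UCB-style term, purely illustrative.
scores = [dot(user_vec, v) + 1.0 / math.sqrt(n + 1)
          for v, n in zip(video_vecs, impressions)]
next_video = scores.index(max(scores))
```

Note that the score only ever sees interaction counts and watch signals; nothing in it "knows" the viewer, which is why it can be wrong about individuals and still work on average.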
aikendrum|3 years ago
Shouldn't have to point this out, but I do not search for, like, or watch anything remotely similar on TikTok.
johnmaguire|3 years ago
I really do not think this is true, and I believe that is what the parent is getting at. Yes, the videos shown are a reflection of _past videos_ shown to you and your reactions to them.
That does not mean they are a _direct_ reflection of you, or an acknowledgement of the truth. The first videos shown to you, or a stray off-topic video, whether you like them or not, can have a strong biasing effect.
yieldcrv|3 years ago
Simply reading the comments on a video you disagree with, just to see whether others are appalled or brainwashed, is something the algorithm will interpret as deep interest in that kind of content. I have to go find the "Not interested" button. It's sad, because all those other people are really stuck in that rabbit hole.
KineticLensman|3 years ago
You can control the feed by tapping on a clip you don't like and selecting 'Not interested'. Less successfully by immediately swiping to the next clip. In this way I have got rid of live streams, cats and girls doing dance or PoV trends.
But if you do in fact have a quick peek at the live streams, cats, and girls doing dance or PoV trends, TikTok will keep showing them.
throwaway290|3 years ago
If this were the whole truth, we could just as easily excuse facial recognition algorithms failing non-white people by simply saying that non-white people are more difficult to identify.
The algorithm is a human-made thing and subject to conscious intents or subconscious biases of its makers.
derefr|3 years ago
Nah, it's not about intent, but it is about profiling. They're saying that, e.g., gullible-seeming people will be algorithmically matched with videos trying to con them out of something, while non-gullible people won't be. People who watch more videos by creators with religious values will eventually be recommended religious content, while people who don't, won't. Etc.
Think about it less like users being matched with things they'll appreciate; and more like creators being matched with the audience most receptive to their message.
> There are a lot of articles and videos showing that watching just one video can flood your suggestions with related videos, even if you are not interested.
This isn't a failure of ML. They've got the algorithm doing exactly what they want it to do. It just isn't serving you.
TikTok is a two-sided market, where the supply is "engaged eyeballs" and the demand is from advertisers with ads to show them (where a regular video producer is just an advertiser who provides enough retention value to the platform with their "ads" that they get paid rather than paying per impression.)
TikTok's algorithm isn't trying to match you with the videos you'll most like; rather, it's trying to optimize the amount of money ByteDance extracts out of its advertisers by optimizing for three things:
1. keeping the eyeballs engaged, by showing them videos which are predicted to increase the particular user's session duration in the app;
2. showing the "engaged eyeballs" the most profitable ads, under the proviso that any given advertiser can filter for eyeballs with specific demographics/interests;
3. (here's the clever bit) — nudging the eyeballs toward videos that will allow them to plausibly say that a given user has a given high-CPM interest, and thus now show them the high-CPM ads.
The third factor is what produces the "one video completely changes your recommendation feed" effect.
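As a back-of-the-envelope sketch of how those three objectives might combine into one ranking score (every field name, weight, and CPM figure below is invented; this is obviously not ByteDance's actual code):

```python
# Hypothetical CPM values for "valuable" interest labels (made up).
HIGH_CPM_INTERESTS = {"crypto": 40.0, "insurance": 35.0, "fitness": 12.0}

def feed_score(video, user, weights=(1.0, 1.0, 0.5)):
    """Toy ranking score combining the three objectives above.

    `video` and `user` are plain dicts; every field and weight is
    invented for illustration.
    """
    w_engage, w_ad, w_nudge = weights

    # 1. keep the eyeballs engaged: predicted contribution to this
    #    user's session duration
    engagement = video["predicted_watch_seconds"] * user["session_multiplier"]

    # 2. show profitable ads: expected revenue if this slot is an ad
    ad_value = video.get("cpm_if_ad", 0.0)

    # 3. the nudge: does watching this let the platform tag the user
    #    with a high-CPM interest they don't carry yet?
    new_tags = set(video["interest_tags"]) - set(user["known_interests"])
    nudge = sum(HIGH_CPM_INTERESTS.get(tag, 0.0) for tag in new_tags)

    return w_engage * engagement + w_ad * ad_value + w_nudge * nudge

user = {"session_multiplier": 1.2, "known_interests": ["fitness"]}
video = {"predicted_watch_seconds": 30.0, "cpm_if_ad": 0.0,
         "interest_tags": ["crypto", "fitness"]}
score = feed_score(video, user)
```

One clicked video carrying a new high-CPM tag makes the nudge term jump, which is exactly the "feed completely changes after one video" behavior.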
A very close analogy would be dating (another imbalanced-demand two-sided market, where demand is a passive judgement while supply is an active offer).
Picture person A walking into a nightclub, looking for a date, but not actively talking to anyone. They sit there, and wait for other people to come up and talk to them. The people that come to person A might be somewhat random at first; but, as the people in the club notice a pattern in who's doing best talking person A up, the supply-side will self-select — they're profiling person A, and "recommending" themselves based on said profiling.
But then, at some point, imagine person A quietly mentioning to one of these strangers "I think I might like [niche interest]." And this news spreading throughout the club.
Now, if there's anyone who likes [niche interest] in the club — suddenly, they think they have a chance. And if having [niche interest] is rare, maybe there are a bunch of unsatisfied single people with [niche interest] who've been desperately waiting for someone like person A to show up. So now there's suddenly a stampede of people, all with [niche interest], trying to get person A's attention. Willing to pay money to get person A's attention, even. So much that the club manager (who happens to be easily bribed) is willing to cordon off the area around person A and set up a queue of all these interested people, so that the "rabble" who aren't so intensely interested (and so aren't willing to pay a bribe), won't even get a word in edgewise any more.
That's TikTok. You're person A. The advertisers are the desperate people in the club. And a single clicked video can be the whisper of acknowledgement of a niche interest they were hoping for.