Google can and should customize search results based on location, and it's not just about "local articles" as the article suggests. If enough people from the same general location click on certain results more frequently, then those results should rank higher for others who search from that same general location.
We use Google because the results are useful, not because they are "unbiased". Ranking implies some sort of "bias" and is what makes search results generally useful. We don't want a search engine that does nothing clever and just spits back unranked results. Otherwise, we would be inundated with results containing credit card scams, porn, Bitcoin scams, Viagra ads, etc, when we search for... pretty much anything.
In privacy (incognito and not logged in) mode, all of the above still applies. What would NOT apply is something like: You are a vegetarian and suddenly all of your restaurant searches rank vegetarian restaurants higher in results while in privacy mode. Unless, of course, for some reason people in your general location happen to mostly eat vegetarian.
In any case, if people don't like it, they can stop using Google and go use some other search engine; there is absolutely nothing holding them back. More often than not, I think people will switch back to Google because they find the results more useful, even in privacy mode.
> I think people will switch back to Google because they find the results more useful, even in privacy mode.
I now use duckduckgo as default search engine and my experience is mixed.
The problem with google is that sometimes you search for something new and then you see the bubble very clearly, which applies not only to search but also to youtube (maybe even more).
The problem with duckduckgo is that when you are searching for something specific, or something you saw months ago and don't remember well, google's index and tracking can be useful.
This is all well and good for information retrieval, but people are making decisions based on information from search results.
In your vegetarian example, what if 51% of people in an area were vegetarian, and the general population was making decisions off these "localized" search results? We would likely expect this to pull the minority toward the tastes of the majority.
This might be fine for something like vegetarianism, but what about other topics? Should your search results be more racist because you live around a lot of racists? And that's the best case.
I have tangentially worked with groups that specifically utilize this to provide public opinion sway and consumer capture for their clients.
A contrived example to clarify further: Let's say that there is a link, foo.gov/taxes_in_retirement. Locations with a high concentration of retirees might click on the link more frequently compared to other locations. In privacy mode, from this location, a search for "taxes" might rank that link higher in results based on this activity (even though the search didn't contain the word "retirement"). This shows how a link might be ranked differently depending on search location even if the link itself is not inherently location-specific (as opposed to, say, a local-news link).
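That mechanism could be sketched roughly like this (everything here is invented for illustration: the region names, click counts, base scores, and the 0.8 blend weight are hypothetical, not how Google actually ranks):

```python
from collections import defaultdict

# Hypothetical click log: (region, url) -> click count.
clicks = defaultdict(int)
clicks[("retiree_area", "foo.gov/taxes_in_retirement")] = 900
clicks[("retiree_area", "irs.gov/taxes")] = 400
clicks[("other_area", "foo.gov/taxes_in_retirement")] = 50
clicks[("other_area", "irs.gov/taxes")] = 600

def rank(urls, region, base_scores, blend=0.8):
    """Order results by base relevance plus a regional click boost."""
    total = sum(clicks[(region, u)] for u in urls) or 1
    def score(u):
        regional_share = clicks[(region, u)] / total  # fraction of local clicks
        return base_scores[u] + blend * regional_share
    return sorted(urls, key=score, reverse=True)

urls = ["irs.gov/taxes", "foo.gov/taxes_in_retirement"]
base = {"foo.gov/taxes_in_retirement": 0.4, "irs.gov/taxes": 0.6}

# The retiree-heavy location surfaces the retirement link first for "taxes"...
print(rank(urls, "retiree_area", base))
# ...while elsewhere the generic page stays on top.
print(rank(urls, "other_area", base))
```

The link carries no location information of its own; only the regional click share moves it up or down.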
Also, search engines can play with inconsistent ranking of results to see how click-throughs might be affected. For example, if moving a link from first to third in the result list has no effect (people continue clicking on the same link even though it's now third instead of first), then it's a pretty strong signal that the link should continue to be ranked first in future results. This experimentation of search results is even more important the more uncommon a search is because there is less confidence in the current ranking until there is more activity to base the ranking on.
Just as stores shift around product placement (front of the store, back of the store, etc), a search engine is free to shift around search results. Keep in mind that product producers might pay for better in-store product placement too, just as customers pay search engines for ad placement in search results.
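A toy sketch of the swap experiment described above (the hash-based bucketing scheme and the 0.9 tolerance are made up for illustration):

```python
import hashlib

def in_experiment(user_id, fraction=0.1):
    """Deterministically put a fixed fraction of users in the swap bucket."""
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return digest % 100 < fraction * 100

def serve(results, user_id):
    """Swap positions 1 and 3 for users in the experiment bucket."""
    results = list(results)
    if len(results) >= 3 and in_experiment(user_id):
        results[0], results[2] = results[2], results[0]
    return results

def keep_top_ranking(ctr_at_1_control, ctr_at_3_experiment, tolerance=0.9):
    """Decide whether the demoted link has earned the top slot.

    If users still click the link when it is shown third, that's a strong
    signal it deserves position one; a large drop suggests people were
    clicking on position, not relevance.
    """
    return ctr_at_3_experiment >= tolerance * ctr_at_1_control
```

The same sketch shows why rarer queries need more of this: with few clicks logged, the confidence in the current order is low until experiments gather more signal.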
You can't just handwave "useful" as an explanation for why location-based SERP discrimination is desirable.
It is "useful" for me to be on the phone with someone in Cleveland and describe how to find something on the Web, expecting that they can follow a similar set of steps at a similar time and get a similar result.
A (sort-of-)deterministic Web can be good and useful. It is a very strong statement of preference and exercise of power to declare that "useful" results must be meaningfully different based on the characteristics of the individual searching.
For whom is that exercise of power most beneficial? I would argue that a rapidly shifting, slippery, personally-dependent presentation of the world's information is extremely useful as a tool of control, but gives only occasional and relatively marginal benefit to individual searchers.
The 2016 US election is a big case in point. Personalizing information delivery, when coupled with asymmetric processing power and data availability, lets you have situations where an atomized polity winds up seeing what suits each individual, but with a radically degraded ability to form collective truths or consensus.
The definition of "useful" is an exercise of power.
> Google can and should customize search results based on location
I feel more and more that these ideas of optimizing for 95% of the use cases look good on paper but make life shitty for the remaining 5%.
I understand the good intentions behind that calculation, because making life easier for a huge majority of people should be a good thing.
But for instance boosting local results is one of the ways you'll make people who often search for foreign information miserable. Searching for remote places will most of the time be met with random local businesses first. Web-based international content will be outranked by local content, and your local newspaper bitching about heat waves when it's just summer will far outrank rock bands and manga titles.
Sometimes that's the desired behavior, but Google currently already works with a strong preference for localised search, and that's one of the things that pushed me to DDG.
In a way if Google wasn’t so massively successful I’d root for them to better serve mainstream searches. But in the position they are now I think it’s harder to say they should just care about the vast majority of people. Even 1% of their userbase is an incredibly huge number.
The article runs through the analysis you propose, and (within the limitations of the study) shows that Google applies very similar filter bubbling when logged out in incognito mode as when logged in.
In fact in “anonymous” mode, the results are much more similar to the same person’s “logged in” mode than to other randomly chosen people’s logged in or logged out results.
What you're saying makes sense for location-dependent results (e.g. searching a map of nearby places). But for something like "origin of the universe", results would be very different across the world. And what happens when I'm in a location I'm not usually in? Do I get the local results or the results for the place I usually reside?
Better than having to go to Privacy Mode (which can feel a little creepy with the spy icon and even just the name), it would be easy for Google to make a clear On/Off toggle for personalized vs raw results.
For the sake of education and not being evil (or "doing the right thing"), it would be nice to be able to view results from other typical profiles' points of view.
The arguments for personalisation make sense. But the other side of the story is that sometimes we need unbiased results, like when reading about politics. You need a dose of "what are the facts" rather than "what are the facts for me".
The point of the article was that, even when there is no reason to do so because location and identity are identical, Google STILL gives differently ranked results. Please read more carefully.
And I've been using DDG for a bit now and have found it perfectly useable.
People wanted and used google precisely because it was "unbiased" and "unlocalized". It's why it became so popular.
The only reason for the biased localized results is due to corporate pressure from media and news industries.
Also, you are conflating localized and unbiased results with spam and scam. Nobody is saying google shouldn't remove scams and spam. People are just saying they want unbiased results.
To make this research more interesting I'd like to see:
1) repeated queries from the same user. Do the results stay constant over time or do they change?
2) comparisons to the same experiment run against e.g. Bing or DuckDuckGo.
It seems to me that some variation in results is to be expected because users hit different backends, which might be at different stages of index rollouts. Similarly, response times of different backends matter: if, for example, the video results don't come back in time, you'll end up not having them in the result set.
Lastly, the insinuation of the article is that "unbiased" search results are clearly preferable. I'm not convinced. I for one like that STD for me is associated with the C++ standard namespace (which I search for all the time) rather than sexually transmitted diseases (which I luckily don't have to care about as much).
> Lastly, the insinuation of the article is that "unbiased" search results are clearly preferable.
The insinuation is that you should know if results are biased, and that you should be able to get unbiased results if you so wish.
It also raises suspicions on how much google tracks each user.
From this point of view, what would be interesting is a local study, to see if 100 people all in the same neighbourhood but with different browsing habits get different results. This would eliminate the "non-tracking" part of the personalization.
> Lastly, the insinuation of the article is that "unbiased" search results are clearly preferable. I'm not convinced. I for one like that STD for me is associated with the C++ standard namespace (which I search for all the time) rather than sexually transmitted diseases (which I luckily don't have to care about as much).
On the other hand, authors could find better names for their libraries ...
Further, there are different solutions, where the user has full control over the context of their search. For instance by maintaining a fully user-controlled list of keywords that is remembered by a cookie (which can be deleted as well).
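A minimal sketch of such a cookie-backed, user-controlled context, using Python's standard cookie module (the cookie name `search_context`, the `|` separator, and the 30-day lifetime are arbitrary choices for illustration):

```python
from http.cookies import SimpleCookie

THIRTY_DAYS = 60 * 60 * 24 * 30

def save_context(keywords):
    """Serialize the user's chosen context keywords into a Set-Cookie header."""
    cookie = SimpleCookie()
    cookie["search_context"] = "|".join(keywords)
    cookie["search_context"]["max-age"] = THIRTY_DAYS
    return cookie.output()  # "Set-Cookie: search_context=...; Max-Age=..."

def load_context(cookie_string):
    """Read the keywords back; the user deleting the cookie clears the context."""
    cookie = SimpleCookie()
    cookie.load(cookie_string)
    if "search_context" not in cookie:
        return []
    return cookie["search_context"].value.split("|")

header = save_context(["vegetarian", "cpp"])
# The browser would send the cookie back on later searches:
print(load_context(header.split(": ", 1)[1]))  # → ['vegetarian', 'cpp']
```

The point of the design is that the context lives entirely on the user's side: editing or deleting the cookie is editing or deleting the personalization.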
It makes total sense for them to personalize search results. If I am searching for Django, it's the framework, not the musician. When I search for a restaurant name, it's the one in Boulder, CO, not a restaurant by the same name on a different continent.
People always adjust their messaging according to who they are talking to. It's kinda weird how it's creeping people out when computers do this.
It's not creeping me out, it's disgusting me. When I want to look for django reinhardt, I can search for "django music" or whatever; if I want the framework, "django web framework" should do the trick. Oh, and when I add a + before a word, I want that word to show up, and if there are no results with that word, show me no results.
I'm fine with others having the option of personalized search, I'm not fine with me not having it.
2018 was the year I adopted ddg, not because of privacy, but because google result sucks.
Almost every time I search, I don't get a single result I want on the first page. The first 3 results are sponsored ads, then there is the Danish Wikipedia article (useless), then 3-6 advertisements pretending to be content, and then, if I'm lucky, something that was relevant 5 years ago.
DDG isn’t much better, but it’s better.
I’m not sure if search engines are really to blame though. With everyone being on Facebook, Medium, Quora, reddit, 4chan and so on, it’s like the web just stopped having content worth visiting.
If it wasn’t because HN gave me interesting content, I’m honestly not sure why I’d ever browse the internet anymore. But maybe I’m just getting grumpy.
You get the Wikipedia page? How fortunate. I rarely see them on the first page now even when I know there is a relevant article with matching title. Need more space for ads and DoubleClick affiliates.
I feel the same because of the YouTube changes. I used to browse videos and end up in unexpected places. That is no longer possible; I get only related videos that are either sponsored or older videos I haven't finished, and nothing else. I personally am not grumpy but am actually happy that I spend less time on YouTube. Slowly but surely I spend less and less time online, and I see it as a good thing.
The filter bubble is the best thing ever. I use search engines to find things and Google finds them for me. I want it to be super tailored to me and show reputable results.
I remember 2000s era search and looking at Page 2. Now I don't scroll below result five 99% of the time. Thank you, Google.
I have to say, though. US Google is better than any other Google I've used.
I'm hosting my own SearX [0] instance to try to eliminate the search bubble and control my search history.
SearX is a metasearch engine that proxies out search requests and randomizes all browser fingerprints to make it difficult for any individual to be tracked via algorithm. I don't know how effective it is of course, but I find I prefer the search results I get out of it vs google, even if the image search interface isn't as flashy.
I put my instance behind https and simple auth to allow me a bit of security while using it outside of my private network.
If you want the privacy shield vs google/bing/etc and don't mind a middleman having your search history, there are public SearX instances as well [1].
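I haven't read SearX's source, but the fingerprint-randomization idea is roughly this kind of thing (a toy sketch, not SearX's actual code; the header pools are invented):

```python
import random

# A small pool of plausible browser fingerprint components. A metasearch
# proxy can pick a fresh combination per outgoing request, so the upstream
# engine can't link successive queries to one browser.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Gecko/20100101 Firefox/115.0",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 Chrome/120.0 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) Safari/605.1.15",
]
ACCEPT_LANGUAGES = ["en-US,en;q=0.9", "en-GB,en;q=0.8", "de-DE,de;q=0.9"]

def randomized_headers():
    """Build a fresh set of request headers for each proxied search."""
    return {
        "User-Agent": random.choice(USER_AGENTS),
        "Accept-Language": random.choice(ACCEPT_LANGUAGES),
        # No cookies are forwarded, so no stable identifier crosses the proxy.
    }
```

Since many users share one instance's IP address, the upstream engine also can't tell which of them issued a given query — which is exactly why the instance operator becomes the middleman you have to trust.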
My search productivity greatly improved when I switched to self-hosted searx. I tried to advocate this to my network of friends but with little success. I run it in a docker container and it's just so easy to manage and the results are so much better.
Being open source, you're free to fiddle with it any way you want, and I consider it a sort of condom for your privacy.
"With no filter bubble, one would expect to see very little variation of search result pages — nearly everyone would see the same single set of results."
This is the assumption underlying their research, and it is fundamentally not true.
The study's results seem to be that users often get unique results. That's not the same as "personalized", and it certainly isn't evidence of "bias" as the spreadprivacy.org link suggests.
A good faith interpretation would point to google running learning algorithms on their results. That would also seem to be a far better explanation for Google changing parts of the page layout, such as the position of news and video results.
The use of the term "bias" for describing differences in search results also trips my conspiracy theory detectors.
This doesn't show evidence of a filter bubble. It shows evidence of different results. The filter bubble is the idea that we are in a bubble, cut off from differing viewpoints.
(additionally, I am highly skeptical of the filter bubble's existence/effects and the book was terrible - full of "mights" and "coulds" and few solid facts.)
I have a theory that filter bubbles are causing intolerance of other people's views. Most of the content we consume comes through filter bubbles; that is, most people consume content that is tailored to them. Thus we have less acceptance of things that are not like us, because we are less exposed to different content.
Filter bubble examples:
Search services: Google, Bing
Movies: Netflix
Music: Spotify, Apple Music recommendations
News: Facebook
Social media: Facebook feeds
That's exactly what they're causing, because you have no easy way to search outside your bubble, and you end up thinking that's the status quo; in effect, it disinforms you by omission.
When they say "bubble", I think of groups of users with fewer differences within the group than outside the group, sort of like the Wall Street Journal study showed. If they're finding variation among users, but not predictably more or less variation between any two of them, then that isn't a "bubble" to me, it's just customization.
Customization is troubling, but less so than bubbling. (Hey now...)
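One way to make the bubble-vs-customization distinction concrete is to compare result-list similarity within and between candidate groups (a toy measure with invented data; the study used its own metrics):

```python
def jaccard(a, b):
    """Similarity of two result sets, ignoring order: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def mean_pairwise(results):
    """Average similarity over all pairs of users' result lists."""
    pairs = [(i, j) for i in range(len(results)) for j in range(i + 1, len(results))]
    return sum(jaccard(results[i], results[j]) for i, j in pairs) / len(pairs)

group_a = [["x", "y", "z"], ["x", "y", "w"]]   # users in one hypothetical bubble
group_b = [["p", "q", "z"], ["p", "q", "w"]]   # users in another

within = (mean_pairwise(group_a) + mean_pairwise(group_b)) / 2
between = mean_pairwise([group_a[0], group_b[0]])
print(within, between)  # here: within = 0.5, between = 0.2
```

A real bubble would show within-group similarity well above between-group similarity; uniform per-user variation would show roughly the same value for both.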
> Most people expect both being logged out and going "incognito" to provide some anonymity. Unfortunately, this is a common misconception as websites use IP addresses and browser fingerprinting to identify people that are logged out or in private browsing mode.
A few scattered considerations. First, the "push to extremes" effect: you see something that only marginally interests you, but because "customized" results keep surfacing it, you are effectively invited to dig deeper, and for some kinds of content this pushes people toward the extremes; search for something about a left- or right-wing party and in a short time you get more and more "leftist" or "rightist" content.
That may not influence ordinary, well-informed adults much, but it can influence young and less-informed people. Think, for example, of modern urban legends like "white sugar is poison" or "chemtrails", and their word-of-mouth amplification.
Another point, the "censorship effect": we know well that search-based information access is less detailed than taxonomy-based access; we experience that often when we organize our mail, documents, and files, alternating between taxonomy-based and search-based UIs. When our entire world relies on search-based UIs instead of taxonomies, whoever controls search may control knowledge, and it becomes easy to "hide" some things and "push" others.
Normally this is not a problem; it starts to become one when a very few search systems become as ubiquitous and dominant as they are now.
Last, "convergence", which ties back to the first point: think of feeds versus aggregators. With feeds you subscribe to specific things and stay up to date on them, while tending to ignore things that don't interest you. With aggregators this "soft polarization" is lost, replaced by another, potentially driven, "hard polarization": general information becomes less diverse (every publisher tries to be at the top of every aggregator's results instead of following its own style) and people become more "extreme" in their information interests.
This has far more implications than mere privacy, and that's before you add the current state of communication systems like WhatsApp, GMail, etc. to the sauce.
Purposeful mixing or mild randomization of search results also seems like a decent way to help obscure the ranking algorithms to help thwart reverse engineering.
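A sketch of that kind of mild randomization (the noise scale is arbitrary):

```python
import random

def jittered_ranking(scored_results, noise=0.02, seed=None):
    """Re-rank (url, score) pairs by score plus a small random perturbation.

    Near-ties can swap order between queries, which blurs the exact ranking
    function for anyone probing it, while clearly better results still
    come out on top.
    """
    rng = random.Random(seed)
    return sorted(
        scored_results,
        key=lambda pair: pair[1] + rng.uniform(-noise, noise),
        reverse=True,
    )

results = [("a.com", 0.90), ("b.com", 0.89), ("c.com", 0.30)]
print(jittered_ranking(results, seed=1))
```

With noise of ±0.02, the two near-tied results may swap from query to query, but the 0.30 result can never overtake them — the jitter obscures the scoring function's fine structure without visibly degrading quality.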
Well, for starters, to add contextual meaning to the words you searched for, they probably have a pretty good language model in the background.
That model was likely built with sources such as the English Wikipedia and their archive of a few million books. So the space at the bottom of the page may be getting a little tight by now.
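A crude sketch of how recent-query context might disambiguate a term like "std" (the senses, vocabularies, and scoring here are entirely invented; real query understanding is far more sophisticated):

```python
# Each sense of an ambiguous term, with words that tend to co-occur with it.
SENSES = {
    "std (C++)": {"namespace", "vector", "template", "compiler", "c++"},
    "std (medical)": {"symptoms", "clinic", "treatment", "testing"},
}

def disambiguate(term, recent_queries):
    """Pick the sense whose vocabulary overlaps the user's recent queries most.

    `term` itself is ambiguous, so only the history decides here.
    """
    history = {word for query in recent_queries for word in query.lower().split()}
    return max(SENSES, key=lambda sense: len(SENSES[sense] & history))

print(disambiguate("std", ["c++ vector resize", "template deduction"]))
# → std (C++)
```

Even this toy version shows the trade-off the thread is circling: the same history that makes "std" resolve correctly is a record of what you search for.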
Being able to know what someone is likely looking for is something that really helps the search engine experience. "I find just what I'm looking for and it is always on the first page!" is the sound of a delighted user of a search engine.
The cost however is discovery, which is to say things you might be interested in but didn't know exist. To enhance discovery you often need a wide band curator that can surface "likely" interesting things without destroying the experience of always finding what you want.
In the world of real goods these sorts of discovery curators are enthusiast publications which might talk about the new things coming down the road, or a restaurant critic that is trying the new restaurants.
Real human search and discovery is a pretty personal thing. And when it all goes on inside your head/environment, it's pretty acceptable too. People putting their favorite cookbooks in a more prominent place, wearing specific fashions that they like while only really shopping at clothing stores that support that fashion look.
When that information is at a third party, and dissectable by tools, then it gets creepy.
Someone who doesn't "know you"[1] but typically wants to sell you something can find you and market to you, to help you "discover" something new on their schedule instead of yours. Knowledge of what you like and don't like, what you pay attention to and what you ignore, gets weaponized into a tool against you (ostensibly to help you see "great deals" you might otherwise have missed), whether it's a new job opportunity, fashion choices, the vehicle you drive, or even where you eat lunch. That is where it gets annoying. And when the version of you that you present to the world is quite a bit different from the version that only you or your closest confidant see, and someone outside that circle gets a peek because of your search history and what you have shown interest in? That is an existential threat of 'outing' the real you.
That information is power; The power to influence you, the power to sell to you, the power to expose you, the power to control how you see the world and ultimately control your actions in that world.
Imagine a machine that, as people used it, condensed bricks of pure platinum out of the air as a side effect of its operation. Now tell the owner of the machine: you can't sell that platinum, you need to just grind it up and throw it away. Well, that isn't going to happen, even if there is a big 'for show' grinding operation taking place up in the lobby of the machine's owner. The owner might say, "I charge you nothing to use my useful machine; I am going to keep some of the platinum it produces to cover expenses."
[1] I'm using the phrase in the colloquial where a "known" person is someone who is both familiar and has been granted a certain level of access to your inner thought processes.
I don't think that search personalization is all bad though. It can be used for both good or evil.
If search results were perfectly consistent, some smaller websites might not get any search traffic at all and most big corporation websites would get all the traffic. It would greatly exacerbate winner-takes-it-all effects and inequality.
Personalization allows for some small websites to start with a niche and slowly grow to become more mainstream.
GordonS|7 years ago
But what I would also like is a way to search without using my context, as sometimes I want results that aren't related to my location etc.
throwawaylolx|7 years ago
Should there not be an opt-out option though?
bduerst|7 years ago
Isn't DDG mostly Bing results these days anyways? (unless you're searching in Russian)
Wowfunhappy|7 years ago
I also would likely mean the web framework, but if I suddenly become interested in Django music one day, I don't want Google to make assumptions.
ucaetano|7 years ago
https://twitter.com/searchliaison/status/1070027261376491520
[+] [-] eksemplar|7 years ago|reply
Almost every time I search, I don’t get a single result I want on the first page. The first 3 results are sponsored ads, then there is the Danish Wikipedia article (useless), then 3-6 advertisements pretending to be content, and then, if I’m lucky, something that was relevant 5 years ago.
DDG isn’t much better, but it’s better.
I’m not sure if search engines are really to blame though. With everyone being on Facebook, Medium, Quora, reddit, 4chan and so on, it’s like the web just stopped having content worth visiting.
If it wasn’t because HN gave me interesting content, I’m honestly not sure why I’d ever browse the internet anymore. But maybe I’m just getting grumpy.
[+] [-] nemild|7 years ago|reply
In a very different context, I ran an analysis of terrorism coverage in the NY Times to measure what a geographic filter bubble looks like:
How Media Fuels Our Fear of Western Terrorism
https://www.nemil.com/s/part2-terrorism.html
I also ran the same analysis for all the articles over a decade by geography (and compared to population, GDP, etc):
Visualizing 10 years of International Coverage in the NY Times
https://www.nemil.com/s/nytimes-international-coverage.html
While filter bubbles are more pervasive in digital media (where we can segment each user, including with personal information), they’ve also always existed.
[+] [-] scarejunba|7 years ago|reply
I remember 2000s era search and looking at Page 2. Now I don't scroll below result five 99% of the time. Thank you, Google.
I have to say, though. US Google is better than any other Google I've used.
[+] [-] specialist|7 years ago|reply
The problem is the term "filter bubble" conflates personalization, relevance, and recommendations.
I can do without the recommendation engines.
Source: Worked on recommenders for a mid-sized e-commerce site.
[+] [-] DrPhish|7 years ago|reply
SearX is a metasearch engine that proxies out search requests and randomizes browser fingerprints to make it difficult for any individual to be tracked algorithmically. I don't know how effective it is, of course, but I find I prefer the search results I get out of it vs Google's, even if the image search interface isn't as flashy.
I put my instance behind https and simple auth to allow me a bit of security while using it outside of my private network.
If you want the privacy shield vs google/bing/etc and don't mind a middleman having your search history, there are public SearX instances as well [1].
[0] https://github.com/asciimoo/searx
[1] https://www.searx.me/
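The fingerprint-randomization idea behind a metasearch proxy like SearX can be illustrated with a toy wrapper that draws a fresh browser profile for each outgoing query. This is a hedged sketch of the concept only; the profile pool and header set below are hypothetical, not SearX's actual code:

```python
import random

# Toy illustration of fingerprint mixing: each outgoing query gets a
# randomly drawn browser profile, so upstream engines cannot link
# successive searches to one stable fingerprint.

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Gecko/20100101 Firefox/60.0",
    "Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/61.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13) Gecko/20100101 Firefox/62.0",
]
ACCEPT_LANGUAGES = ["en-US,en;q=0.5", "en-GB,en;q=0.5", "de-DE,de;q=0.5"]

def random_headers(rng=random):
    """Build a fresh header set for one proxied search request."""
    return {
        "User-Agent": rng.choice(USER_AGENTS),
        "Accept-Language": rng.choice(ACCEPT_LANGUAGES),
        "DNT": "1",
    }

# Two consecutive "searches" generally present different fingerprints
print(random_headers())
print(random_headers())
```

Because many users share one proxy instance on top of this, the upstream engine sees a blended stream of queries rather than per-user histories.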
[+] [-] decebalus1|7 years ago|reply
Being open source, you're free to fiddle with it any way you want, and I consider it a sort of condom for your privacy.
[+] [-] moultano|7 years ago|reply
This is the assumption underlying their research, and it is fundamentally not true.
[+] [-] jccalhoun|7 years ago|reply
(additionally, I am highly skeptical of the filter bubble's existence/effects and the book was terrible - full of "mights" and "coulds" and few solid facts.)
[+] [-] acd|7 years ago|reply
Filter bubble examples:
Search services: Google, Bing
Movies: Netflix
Music: Spotify, Apple Music recommendations
News: Facebook
Social media: Facebook feeds
[+] [-] jimmytucson|7 years ago|reply
Customization is troubling, but less so than bubbling. (Hey now...)
[+] [-] mastazi|7 years ago|reply
Firefox offers integrated protection against browser fingerprinting, but you have to turn it on because it's off by default: https://support.mozilla.org/en-US/kb/firefox-protection-agai...
Fingerprinting protection is also available in Safari on macOS Mojave and iOS 12: https://www.cnet.com/news/new-safari-privacy-features-on-mac...
[+] [-] xte|7 years ago|reply
That may not influence normal, acculturated adults much, but it may influence young and unacculturated people; think for example of modern urban legends like "white sugar is poison" or "chemtrails", and their "tam-tam" effect.
Another point, a "censorship effect": we know well that search-based information access is less detailed than taxonomy-based access; we experience that often when we organize our mail, documents, and files, alternating between taxonomy-based and search-based UIs. When our entire world relies on search-based UIs instead of taxonomies, whoever controls search may control knowledge. It then becomes easy to "hide" something, "push" something else, etc.
Normally this is not a problem; it starts to become a problem when a very few search systems become so ubiquitous and dominant.
"Convergence": tied to the first point, think about feeds vs aggregators. With feeds you search for specific stuff and stay up to date, while tending to ignore things that don't interest you. With aggregators this "soft polarization" effect gets lost, substituted by another (potentially driven) "hard polarization" effect. As a result, general information becomes less diverse (every publisher tries to be at the top of every aggregator's results instead of following their own style) and people become more "extreme" in their information interests.
That has far more implications than mere privacy. And then add to the sauce the current state of communication systems like WhatsApp, GMail, etc...
[+] [-] IfOnlyYouKnew|7 years ago|reply
That model was likely built with sources such as the English Wikipedia and their archive of a few million books. So the space at the bottom of the page may be getting a little tight by now.
[+] [-] ChuckMcM|7 years ago|reply
The cost however is discovery, which is to say things you might be interested in but didn't know exist. To enhance discovery you often need a wide band curator that can surface "likely" interesting things without destroying the experience of always finding what you want.
In the world of real goods these sorts of discovery curators are enthusiast publications which might talk about the new things coming down the road, or a restaurant critic that is trying the new restaurants.
Real human search and discovery is a pretty personal thing. And when it all goes on inside your head/environment, it's pretty acceptable too. People put their favorite cookbooks in a more prominent place, or wear specific fashions that they like while only really shopping at clothing stores that support that fashion look.
When that information is at a third party, and dissectable by tools, then it gets creepy.
Someone who doesn't "know you"[1], but typically wants to sell you something, can find you and market to you, to help you "discover" something new on their schedule instead of yours. That knowledge about what you like and don't like, what you pay attention to and what you ignore, gets weaponized into a tool against you (ostensibly to help you see "great deals" you might otherwise have missed), whether it concerns a new job opportunity, fashion choices, the vehicle you drive, or even where you eat lunch. That is where it gets annoying. And when the version of you that you present to the world is quite a bit different from the version that only you or your closest confidant see, and someone outside that circle gets a peek because of your search history and what you have shown interest in? That is an existential threat: 'outing' the real you.
That information is power; The power to influence you, the power to sell to you, the power to expose you, the power to control how you see the world and ultimately control your actions in that world.
Imagine a machine that, as people used it, condensed bricks of pure platinum out of the air as a side effect of its operation. Now you tell the owner of the machine: you can't sell that platinum, you need to just grind it up and throw it away. Well, that isn't going to happen, even if there is a big 'for show' grinding operation taking place up in the lobby of the machine's owner. The owner might say, "I charge you nothing to use my useful machine; I am going to keep some of the platinum it produces to cover expenses."
[1] I'm using the phrase in the colloquial where a "known" person is someone who is both familiar and has been granted a certain level of access to your inner thought processes.
[+] [-] jondubois|7 years ago|reply
If search results were perfectly consistent, some smaller websites might not get any search traffic at all, and most big corporations' websites would get all the traffic. It would greatly exacerbate winner-takes-all effects and inequality.
Personalization allows for some small websites to start with a niche and slowly grow to become more mainstream.