Google has been doing the same with reCAPTCHA v2 [1]. They are aware of the legal risk of outright blocking users from accessing services, which is why reCAPTCHA v3 contains no user-facing UI. Google merely makes a suggestion in the form of a user score, so the responsibility to delay or block access, and the legal liability that comes with it, falls on websites.
reCAPTCHA v2 is superseded by v3 because it presents a broader opportunity for Google to collect data, and do so with reduced legal risk.
Since reCAPTCHA v3 scripts must be loaded on every page of a site, you must send Google your browsing history and detailed data about how you interact with sites in order to access basic services on the internet, such as paying your bills, or accessing healthcare services.
Needless to say, the kind of data collected by reCAPTCHA v3 is extremely sensitive. Those requests contain data about your motor skills, health issues, and your interests and desires based on how you interact with content. Everything about you that can be inferred or extracted from a website visit is collected and sent to Google.
If you refuse to transmit personal data to Google, websites will hinder or block your access.
There are a lot of sites that are totally unusable on Firefox, regardless of how much you use it.
I do all my mobile browsing on FF, yet when I try to use some websites I always get this "Recaptcha failed" error(1), while it works flawlessly on Chrome even though I rarely use it. Try it, maybe it will happen for you too.
The same happens on most sites that show you that "checking your browser" page via Cloudflare.
The web is nearly unusable unless you're on Chrome, because of such antics.
I get 0.7 on my iPhone. I'm guessing that my liberal use of Firefox containers and the Cookie AutoDelete extension on my desktop will give me a much lower score and cause me to jump through extra hoops at websites that implement it, just like reCaptcha v2 does.
Edit: I also got 0.7 on Firefox with strict content blocking (which is supposed to block fingerprinters), uBlock Origin, and Cookie AutoDelete. I get 0.9 from a container which is logged into Google.
With Firefox fingerprint resisting turned on and with Ublock Origin/UMatrix, I get a score of 0.1. And I'm not even on a VPN; I'm sure on my home network I'd have an even lower score.
To me, it feels like Google's entire strategy behind reCaptcha is to make it harder to protect your privacy. We've basically given up on the idea that there are tasks only humans can do, and to me V3 feels like Google openly saying, "You know how we can prove you're not a robot? Because we literally know exactly who you are." I don't even know if it should be called a captcha -- it feels like it's just identity verification.
I don't think this is an acceptable tradeoff. I know that when reCaptcha shows up on HN there's often a crowd that says, "but how else can we block bots?" I'm gonna draw a personal line in the sand and say that I think protecting privacy is more important than stopping bots. If your website can't stop bots without violating my privacy, then I'm starting to feel like I might be on the bots' side.
I get 0.1 consistently, possibly because I have resist fingerprinting enabled in Firefox. I'm not changing anything to compensate for that score; it shows I must be doing something right. If I encounter a reCAPTCHA I will continue to (usually) just leave the site it's on.
Contrary to the results here, using Firefox + uBlock with DNT and tracking protection enabled, I get a score of 0.9. In private browsing mode it's 0.7.
I wonder how many people here are using a VPN or accessing from a non-western country -- I'd bet those are much bigger factors
This looks like an RNG: I got 0.7, 0.9, and 0.1 successively. It can't make up its mind whether I'm almost certainly not a bot (0.9) or almost certainly a bot (0.1)?
I too got 0.1, even though I'm not on a VPN and have a stock FF installation with just the uBlock add-on. I think my ISP may play some part in it, but still, a 0.1 score means 100% bot, right?
I'm also logged into Google and FB, which doesn't affect my score either. It only shows how broken their algorithm is :(
edit: just tried it with Chrome and my score jumped to 0.9! So it's definitely not my ISP; it's just my browser that Recaptcha doesn't like. If you put two and two together, that's really evil shit, even for Google!
I got 0.7 on FF, 0.3 on Opera and Chrome, all in incognito mode.
Maybe they have just a few values and return one based on AND/OR logic over 2-4 variables.
Or maybe they are just playing around, trying to gather some stats for some "Don't be Evil" purpose!
Google putting a number on us is honestly some Minority Report-level dystopia. Google is already using this to make life hell for anyone who cares about their privacy; we need to do something about it before they finish putting up their iron curtain over the web. Would it be possible to sue website owners for requiring such invasive measures? I'd love to see this ruled as monopoly power and Google broken up, but that's probably not very realistic, so we would probably do better to make using Google captchas more expensive, in court costs alone, than building your own solution to fight bots.
Seeing what everyone else has posted, I'm very surprised that I received a 0.3 using Chrome on Android. I'm logged in to Google and most of my browsing is via Chrome or a Chrome-based webview. At least on my phone, I've never cleared my cookies or done anything special.
This is total bullshit. My score of 0.1 in Firefox shoots up to 0.9 if I change my user agent to ChromeOS. No other changes - same set of Ghostery/ad-blocker/fingerprinting-prevention extensions, etc. What a scam.
I get 0.9 in Firefox on my MBP with uBlock Origin installed. I wondered if it was because I was logged in to Google, so I tried Incognito and got 0.7. In a never-before-used container I also get 0.7.
I get a 0.7 on my computer on Firefox. If I use the same website in Chrome (which is signed into a Google account) I get a 0.9. I guess it's a [0,1] scale?
So, I still have to whitelist Google in uMatrix and allow cookies for this to work. Even after doing so, I get a 0.1. I reloaded the page to check for variation as some other users mentioned but get the same score each time. I guess Google is saying I shouldn't be allowed to use the internet.
What is most odd is that I get 0.7 on iOS Safari, which I use for 100% of my purposeful mobile browsing, but 0.9 on iOS Chrome, which is only used when I accidentally click links from Gmail (so very, very rarely).
Stock qutebrowser: 0.7. FF with all the usual extensions (uBlock Origin): 0.7. Don't know if it matters, but I'm rolling Arch. Just adding another data point for those curious.
Interesting - my score is 0.9 if I allow Google to track me using cookies; if I block the cookies it goes to 0.7, and if I enable content blocking in Firefox it drops to 0.1.
With desktop Chrome I get a 0.3. My browser sends Do Not Track, has PrivacyBadger extension, and has that useless google-profile-in-the-browser feature disabled.
There are government services, such as the USPTO, that rely on Google reCAPTCHA. The new reCAPTCHA has made it difficult for me to access documents, and sometimes it decides I'm a bot and denies me access entirely.
Does the government realize the consequences of this? Both that it pushes users to use Chromium-based browsers, and that they're helping to solidify a company that already has a near monopoly in the browser space?
Further, this quote is very creepy:
> To make this risk-score system work accurately, website administrators are supposed to embed reCaptcha v3 code on all of the pages of their website, not just on forms or log-in pages.
With AMP, Google Ads, and reCAPTCHA, Google now has access to pretty much everything that people do on the web.
The other tradeoff is that you're giving Google an extraordinary amount of power to decide who is and isn't allowed on your website, with no transparency into how that decision is made. I'm not sure what company is willing to blindly trust Google with that power.
I'm torn on this. reCAPTCHA v2 (mostly useless[0]) and v3 function largely on browser fingerprinting plus a few other heuristics (e.g., whether or not you have a Google cookie). Any meaningful privacy measures to resist fingerprinting end up with a low reCAPTCHA score. I personally run into a wall on most sites using it.
That said, it's one of the most effective means of combating automated spam and credential-stuffing attacks. In a recent implementation of mine, having 2FA active on your account bypasses the captcha requirement, but the vast majority of users are too non-technical to use 2FA and are subject to the frustrations of reCAPTCHA.
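A minimal sketch of the setup that commenter describes: `needs_captcha` is a hypothetical helper name, while the `siteverify` endpoint and its `success`/`score` response fields are Google's documented v3 verification API. Threshold choice is up to the site.

```python
import json
import urllib.parse
import urllib.request

# Google's documented server-side verification endpoint for reCAPTCHA v3.
VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def verify_score(secret: str, token: str, min_score: float = 0.5) -> bool:
    """POST the client token to siteverify and compare the returned
    risk score against a site-chosen threshold."""
    data = urllib.parse.urlencode({"secret": secret, "response": token}).encode()
    with urllib.request.urlopen(VERIFY_URL, data=data) as resp:
        result = json.load(resp)
    return bool(result.get("success")) and result.get("score", 0.0) >= min_score

def needs_captcha(user_has_2fa: bool) -> bool:
    # The bypass described above: accounts protected by a second
    # factor skip the reCAPTCHA check entirely.
    return not user_has_2fa
```

In that scheme, only accounts without 2FA ever reach `verify_score`, so technical users opt out of the tracking while everyone else is scored.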
I hate the v3 reCAPTCHA. On FF, I usually KNOW I am answering correctly and it says I failed. I always have to go through it multiple times. It's maddening. It often leaves me second-guessing myself... Is that sliver of car counted? Is a crossing signal a street light? What about those street lights way off in the distance - do I select those two in addition to the ones front and center? That RV looks sort of like a bus - should I select that too?
> It’s great for security—but not so great for your privacy.
For individual users, security and privacy frequently go hand-in-hand. But for site operators, user privacy makes security a lot harder. The more you know about a user, the easier it is to figure out if they're an adversary.
Wouldn't reCaptcha V3 also make things much more difficult for Google competitors, assuming that site owners place it on every page? I'm guessing it will block any sort of scraper (since scraper access patterns don't look human) with some sort of whitelist for Google's scrapers.
I guess another question is why we really need captchas. What are we trying to protect against that can't be accomplished with rate limits, voting systems, or other ways to regulate meaningful use of a website?
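The rate-limiting alternative raised here is simple to sketch. Below is a minimal per-IP token bucket (all names and parameters are illustrative, not a production design):

```python
import time
from collections import defaultdict

class TokenBucket:
    """Per-client token bucket: each request spends one token;
    tokens refill at `rate` per second up to `capacity`."""

    def __init__(self, rate: float = 1.0, capacity: int = 10):
        self.rate, self.capacity = rate, capacity
        # New clients start with a full bucket and a fresh timestamp.
        self.tokens = defaultdict(lambda: float(capacity))
        self.last = defaultdict(time.monotonic)

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.last[client_ip]
        self.last[client_ip] = now
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens[client_ip] = min(self.capacity,
                                     self.tokens[client_ip] + elapsed * self.rate)
        if self.tokens[client_ip] >= 1:
            self.tokens[client_ip] -= 1
            return True
        return False
```

Something like this throttles abusive clients without fingerprinting anyone - though it does not distinguish a patient bot from a human, which is the case captchas try to cover.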
Ultimately why does it matter if the user is a human or bot, as long as they are being a valuable user? What's wrong if a bot buys some of your inventory, pays for it and everything? What's wrong if an NLP bot responds to discussion threads with scientific facts and citations?
So this is probably a bit off topic, but why don't more site owners just create their own unique anti-spam systems? In my opinion, if they were simpler yet all unique, there would be fewer bots that could mass-spam, and privacy would be improved.
Even something as simple as a question: "How many legs does a spider have?" ____
And then cycle through different types of free form questions of things that most people should know. Perhaps block the IP after {n} failed attempts for an hour.
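A minimal sketch of that scheme, assuming an in-memory store; the question list, limits, and helper names are all made up for illustration, not a hardened implementation:

```python
import random
import time

QUESTIONS = [  # site-specific trivia; rotate and extend freely
    ("How many legs does a spider have?", "8"),
    ("What color is the sky on a clear day?", "blue"),
]
MAX_FAILURES = 5
LOCKOUT_SECONDS = 3600  # block the IP for an hour after repeated failures

failures: dict[str, int] = {}
locked_until: dict[str, float] = {}

def pick_question() -> tuple[str, str]:
    return random.choice(QUESTIONS)

def check_answer(ip: str, expected: str, given: str) -> bool:
    now = time.time()
    if locked_until.get(ip, 0) > now:
        return False  # IP is inside its lockout window
    if given.strip().lower() == expected.lower():
        failures.pop(ip, None)  # success resets the failure counter
        return True
    failures[ip] = failures.get(ip, 0) + 1
    if failures[ip] >= MAX_FAILURES:
        locked_until[ip] = now + LOCKOUT_SECONDS
    return False
```

The obvious weakness is that a fixed question pool can be scraped and answered by any spammer who targets the site specifically; the uniqueness-per-site argument only holds against untargeted mass spam.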
Disabling browser APIs that ad blockers and privacy protectors use. Fingerprinting users with ads on Stack Overflow. Now collecting information on how users navigate web pages. This is a scary trend.
And I am not even talking about how Android and Android apps (with Google's blessing) track users.
https://hcaptcha.com/ seems to be a viable alternative. If you are a developer, please consider using something other than reCaptcha. Not only is it annoying, it is a privacy nightmare as well.
Google is trying to kill the competition by purposely introducing weird bugs here and there, taking advantage of the fact that they own the most visited sites on the web.
I'm... actually struggling to see what this dark side is. The data is collected under a non-reuse agreement. It's specifically there to make a good captcha. There are other captcha vendors, and they don't make that promise (and I can think of at least one who admits they collect and resell data via captcha).
So the downside here is that no one has a credible way to compete with Google? Maybe because their Google cookie actually is a pretty good indicator of humanity?
That's nonsense. Tons of people do. There's LOADS of great research on captcha that isn't implemented by any vendor. The roadblock is that NO ONE WANTS TO, because it's a thankless, unprofitable task that puts you dead in the crosshairs of a ton of very organized people who will devote huge resources to circumventing or breaking your offering.
"A land grab," sure. Of a nuclear wasteland covered in small arms battles.
Apart from the privacy nightmare, couldn't this also result in discrimination?
> For instance, if a user with a high risk score attempts to log in, the website can set rules to ask them to enter additional verification information through two-factor authentication.
Seems to me, this could easily flag genuine users who access the site through a non-standard flow - e.g. because they use assistive technologies. In the worst case, this could result in impaired users being forced to jump through additional hoops - or being blocked completely.
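The rule described in the quote reduces to a threshold policy on the score. A sketch with made-up thresholds and action names (Google only documents that 1.0 means a likely good interaction and 0.0 a likely bot; note the quote's "high risk" corresponds to a low score):

```python
def login_rule(score: float) -> str:
    """Map a reCAPTCHA v3 score (0.0 = likely bot, 1.0 = likely
    human) to a site policy. Thresholds here are illustrative."""
    if score >= 0.7:
        return "allow"
    if score >= 0.3:
        return "require_2fa"  # ask for additional verification
    return "block"
```

Under a policy like this, the 0.1 scores reported throughout this thread by Firefox users with privacy extensions would land squarely in the "block" branch - which is exactly the accessibility and discrimination worry.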
Smells a lot like using Google's virtual monopoly on bot detection as a way to push users into using one of their other products. Likely not a wise idea when the government is itching for an excuse to bring an antitrust case against you.
Google's captcha system is overkill for most websites. If I want to filter out bad actors on a simple, straightforward site, there are other, simpler, easier-to-solve captcha systems out there. They might not have the rigour of Google's system, but they do the job, and do it well.
I would, however, use Google's system if the site is massive and there is the possibility that someone is using a script to algorithmically bypass a simple captcha and register accounts en masse to run a psyop[0], a disinformation campaign, or even a sockpuppet army.
Stupid question: why do companies care so much about bots, to the point of significantly degrading the customer experience? I can understand it for things like public forums, but why would an e-commerce website (or a news site) ever put a captcha between you and your order?
[1] https://github.com/w3c/apa/issues/25
(1) https://cdn3.imggmi.com/uploads/2019/6/27/0dd96b25707ce6e236...
ronjouch | 6 years ago
Using Chrome, even incognito and with uBlock I get 0.7
(╯°□°)╯︵ ┻━┻. F you, Google, this is blatant bullying, a technically unjustifiable abuse of your stranglehold over the whole web platform.
Loulybob | 6 years ago
This is essentially going to let Google gatekeep the web if you aren't using their services.
eswat | 6 years ago
FF incognito window not logged into Google account: 0.7
FF incognito window not logged into Google account through VPN: 0.3
FYI I have uBlock, pi-hole and a bunch of privacy widgets enabled
Grue3 | 6 years ago
Almost unused Chrome installation, also without addons: 0.7
del82 | 6 years ago
Privacy Badger and ABP on my work (less-locked-down) Mac.
GordonS | 6 years ago
Brave isn't particularly "unusual", and is even based on Chromium - surely this is Google blatantly punishing non-Chrome users?
fybe | 6 years ago
I get a 0.7 on Chrome with no account logged in and uBlock Origin installed.
Same browser, same plugin but incognito it's 0.1.
Papa Google needs my data to trust me. Makes complete sense, but it's still interesting that you can affect your score by giving in.
zhte415 | 6 years ago
> "error-codes": ["score-threshold-not-met"]
Not sure if happy or not happy with that. I will conclude happy enough.
Linux, on VPN, Firefox. Not logged into any Google services. Cleared caches (still same IP), no difference.
jedberg | 6 years ago
Chrome: 0.9
Safari: 0.7
Firefox: 0.1
I have adblock running on all three, and I use containers on Firefox.
[0]: https://github.com/dessant/buster
nkkollaw | 6 years ago
They've been doing it for a while now, Tech Altar even had a video about it the other day: https://www.youtube.com/watch?v=ELCq63652ig
Along with the censorship and privacy issues, I guess it's time for them to change their slogan, "don't be evil".
rmolin88 | 6 years ago
This is complete and utter bullying - bullying of users' privacy, bullying of Firefox.
Somebody please tell me where to go?
[0] https://en.wikipedia.org/wiki/Psychological_Operations_(Unit...