I don't see any other RBI / CBII vendor open sourcing their platform and in the security industry "closed source" can create issues.
But what about business defensibility?
I agree. Open sourcing removes the trade secret aspect that could make a defensible business.
At the same time, a determined hacker would already have my source code. A hacked "free wifi" connection here, a bit of social engineering there, and my so-called "competitive advantage" could be easily removed. Access to GitHub, Gitlab, other accounts would prove no obstacle for someone motivated, and open sourcing is a way to remove the advantage any small group of parties has by keeping it secret.
It might be worth stopping the browser 'loading' itself inception-style with some kind of blacklisting of loadable URLs - I assume you already have something to protect against SSRF-type attacks on the platform itself.
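For anyone curious what such a blacklist might look like, here is a minimal sketch (the `SELF_HOSTS` set is an illustrative config value, not from the actual repo; a real SSRF guard also needs to pin the DNS result used for the actual fetch, or a rebinding race defeats it):

```python
# Sketch of a navigation guard: refuse to point the cloud browser at
# itself or at internal/private address space (a basic SSRF defence).
import ipaddress
import socket
from urllib.parse import urlsplit

# Hypothetical list of the service's own hostnames.
SELF_HOSTS = {"free.cloudbrowser.xyz", "hk.cloudbrowser.xyz", "localhost"}

def is_navigable(url):
    """Return False for URLs that point back at the service or at
    loopback/private/link-local address space."""
    parts = urlsplit(url)
    if parts.scheme not in ("http", "https"):
        return False
    host = parts.hostname or ""
    if host in SELF_HOSTS:
        return False
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False  # unresolvable: refuse rather than guess
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_loopback or ip.is_private or ip.is_link_local:
            return False
    return True
```

The same check would also have to run on redirects, since `http://evil.example` can 302 to `http://169.254.169.254/` after the initial check passes.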
Could someone write a few sentences about what it is and how it works, and why it is significant? I see neither this post, the GH repo, nor its website really says much of anything on the subject. I only see info about why it’s being open sourced and how to set it up. If someone were to go to all that trouble, I am surprised they would stop short on just providing basic info.
Sure, BrowserGap is a remote browser isolation product. RBI means accessing the public internet through a browser that runs in the cloud, rather than through a browser that runs on your device. This helps protect you from attacks on the web.
Great work, and so great of you to open-source this - I really hope you keep it that way!
Would be really cool to set this up and have the server do some proxy-hopping to make IP tracking more difficult as well, regardless of client device and when roaming. While I'd lean more towards self-hosting, if you set this up as a subscription service, your users will also benefit from sharing the same pool of IPs (though I imagine you'd also face issues with getting flagged/blacklisted/CAPTCHAd a lot through abusers and bots; that will be significant work to police if you go that route).
Darn... I'm working on almost exactly the same project. The big challenge is getting access to server hardware that is actually meant for web browsing. Not only are AWS et al expensive, they primarily offer "webservers", which are optimized for very light, not very CPU-intensive, workloads, and needless to say they also don't offer hardware-accelerated video decoding.
Hey! Sorry for taking so long to get back to you. I saw your comment and wanted to respond, and I have been very busy today because of this post.
Thanks for sharing your feeling about this.
It looks like you're working on almost exactly the same project, and that the big challenge is getting access to server hardware for web browsing, because not only are AWS etc. expensive, they primarily offer "webservers" optimized for very light, not very CPU-intensive, workloads, and needless to say they also don't offer hardware-accelerated video decoding.
Wow! Sounds like you're doing something interesting. Are you interested in collaboration? I was thinking of ways to make the video better, but right now I'm basically just using DIY "MJPEG" over WebSocket.
As for the server hardware, I decided to join the Stripe Atlas program at the start of 2017, and from that I was able to get $5K in AWS credits, and then more Google Cloud credits, and I also applied to IBM and Digital Ocean on my own and got credits from them as well.
So, so far I have been able to develop and then demo this (like today) without significant monetary cost.
I also have some tips for you, because resource usage was one of my concerns, but TBH I find headless Chrome actually always uses less CPU than I imagined. It's all about the page that it is rendering. The page determines everything, but Chrome itself is very light. So when I've budgeted for something like 1 CPU per user, it's actually possible to fit many more users than that. And memory is where Chrome surprised me most: it uses barely any RAM even with 100s of users on a machine. 100 users running Chrome and only ~20 GB of RAM used.
Also, regarding video, because I'm avoiding expensive video encoding (just sending screenshots) I avoid the CPU load of doing that. I've experimented with doing more processing of the frames, but it just throws the load way off.
I chose to keep it simple and I'm pleased with that. At the same time, I want to explore ways to improve image quality.
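For reference, the "DIY MJPEG over WebSocket" approach described above amounts to a loop like the following sketch, where `capture` and `send` stand in for the screenshot call and the WebSocket write (illustrative names, not the repo's API):

```python
# Poll for screenshots, skip frames that haven't changed, and cap the
# frame rate - the cheap alternative to real video encoding.
import hashlib
import time

def stream_frames(capture, send, max_fps=10, clock=time.monotonic):
    """capture() returns JPEG bytes (or None to stop); send(frame)
    writes one frame to the client."""
    last_digest = None
    interval = 1.0 / max_fps
    next_due = clock()
    while True:
        frame = capture()
        if frame is None:
            return
        digest = hashlib.sha1(frame).digest()
        if digest != last_digest:  # only ship changed pixels
            send(frame)
            last_digest = digest
        # sleep out the remainder of the frame interval
        next_due += interval
        delay = next_due - clock()
        if delay > 0:
            time.sleep(delay)
```

Hashing whole frames is crude (any pixel change resends the full screenshot), which is exactly the image-quality/bandwidth trade-off being discussed.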
Hmm, I'm not sure if people in the western world can use it but in South Korea (with a 1Gbps network) this is basically unusable.
It has too much latency, text doesn't get entered, scrolling doesn't work, etc... Is the experience from the US similar? Or is this just because of server latency?
If you're talking about the free/demo version, it's hosted in a US-East GCP instance and the author said elsewhere in the thread that it's under heavy load. I imagine a self-hosted instance would work well.
I just want to let everyone know there's also a version for Asia-pacific (in a HK Google datacenter) that will probably be faster if you're not in a US timezone.
A while ago I wrote something very similar - https://fasterbadger.com - see discussion at https://news.ycombinator.com/item?id=9679464 - based on PhantomJS - didn't put nearly as much work into it though, it's still very much in prototype stage - the goal was different though, not so much for privacy but for browsing JS- and CSS-heavy sites (e.g. most news sites) on legacy devices/phones, especially where bandwidth is at a premium.
I'm looking at your app and I love the long scroll feature. How did you do that? It's so cool how you can scroll down the page natively, and the image updates, that's really incredible.
And I'm reading the initial discussion, and it's ... from 2015! Wow, how did you do this back then? I think things are so much easier now with all the features in the protocol.
I am really interested in how you did this and I love the site. It's very cool and I prefer it to my own work in many ways. Would it be a terrible idea for you to contribute to BrowserGap?
About halfway through development, I was travelling and buying 4G data SIMs, and I thought I needed a low-bandwidth mode for that (you can easily use 50MB just on a news site).
So I made an HTML-only version (no images, just stripped-back HTML; you can see the work in the various 'appminifier' subdirectories somewhere in the repo). It saved me data, but introduced lots of quirks. At some point I realized it was too difficult, and I was committed to another direction for the product rather than low bandwidth, so I stopped working on that feature.
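As a rough illustration of what an 'appminifier'-style pass does (this is a toy sketch, not the repo's code), stripping scripts, styles and images with the standard-library parser looks like:

```python
# Keep the text-bearing HTML; drop the heavy tags and their contents.
from html.parser import HTMLParser

# Tags whose contents (and the tags themselves) get dropped.
DROP = {"script", "style", "img", "picture", "video", "iframe", "svg", "canvas"}
# Void tags never get a closing tag, so they must not open a skip region.
VOID = {"img", "br", "hr", "input", "meta", "link", "source"}

class Minifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.out = []
        self.skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in DROP:
            if tag not in VOID:
                self.skip_depth += 1
        elif self.skip_depth == 0:
            self.out.append(f"<{tag}>")  # attributes dropped for brevity

    def handle_endtag(self, tag):
        if tag in DROP:
            if tag not in VOID:
                self.skip_depth = max(0, self.skip_depth - 1)
        elif self.skip_depth == 0:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if self.skip_depth == 0:
            self.out.append(data)

def minify(html):
    m = Minifier()
    m.feed(html)
    m.close()
    return "".join(m.out)
```

Even this toy shows where the quirks come from: once scripts are gone, any page behaviour that depended on them is gone too, which matches the experience described above.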
Also, I love the Open in new tab? feature you have. This really rocks. It made me so happy to see this work! Thank you so much for sharing with me. :)
I wanted to get a scrolling feature like you have and I couldn't think of a way to make it work. If you could do that in BG I'd love it!
Liked your product from your previous submission and liked this one as well. I think it can help some people where censorship is present, but not particularly for me.
Interested to hear about professional use cases for this kind of tool. I am working on a cloud-based sandbox browser very similar to the OP here: https://www.sandboxbrowser.com/
My target audience is software developers, QA engineers, and Ops people who want a predictable isolated browser environment for doing various forms of testing / hacking.
Cool idea at first, but on second thought, how is it supposed to mitigate internet threats? Users need to download files, open them with local apps, and upload local files. All the necessary channels for RCEs and exfiltration are still there. The current malware codebase might be tripped up by it, but that's just a matter of time and adoption. Other threats like clickjacking, cryptomining, and phishing would work just as before.
Good point, let me temporarily switch the default search provider to DuckDuckGo. This will take a while to propagate to the browsers.
A workaround in the meantime is to enter a URL in the box instead.
Edit: Switched to DDG as default search provider for this. Back up at 21:24 PST.
Edit @23:00 PST: I've opened an issue with Google Cloud Support (the system is hosted on GCP even though it is cloud-agnostic) and I don't expect they will be able to provide a resolution because this is probably the CAPTCHA behaving correctly.
BTW, it could be the AI team getting some more data, who knows? ;)
Somehow, though, I think whatever we do is simply a drop in the bucket for them.
Surprisingly usable, considering it's on the other side of the globe. Personally I don't see myself having a use for it, but I'm sure it could find its users.
Two things I noticed. You have to enter the address including "http(s)" to avoid searching it in DDG. And more annoyingly, I couldn't select text on a web page.
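The scheme requirement comes down to how the address box classifies input. A sketch of a friendlier heuristic (the regex and search URL here are illustrative, not what the product actually does):

```python
# Treat input as a URL if it has a scheme or looks like a hostname;
# otherwise hand it to the search provider.
import re
from urllib.parse import quote_plus

SEARCH = "https://duckduckgo.com/?q="
# bare-hostname shape: dotted labels, optional port and path
HOSTY = re.compile(r"^[\w-]+(\.[\w-]+)+(:\d+)?(/.*)?$")

def resolve_input(text):
    text = text.strip()
    if text.startswith(("http://", "https://")):
        return text
    if " " not in text and HOSTY.match(text):
        return "https://" + text        # bare hostname: assume https
    return SEARCH + quote_plus(text)    # everything else: search it
```

This is roughly what native browser omniboxes do, minus their longer list of special cases (intranet single-word hosts, `localhost:port`, and so on).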
Just tried the demo on your website and it seems to be quite slow and unresponsive. Is this a load issue?
Also it gives me a feeling that I'm not in control of what's happening on the screen. Could you please let me know how is this solution better (or more secure) than using remote desktop with disposable VMs?
I'm sorry about that! That sounds like a really terrible experience you had trying this out today. It must feel really uncomfortable to not be in control of what's happening on screen.
There are a couple of factors that could be playing into this. Primarily it's likely just the application itself. It is slower, and less responsive, than using a regular browser on your device.
The frame-rate is capped very low, the image quality is lower, and there's more lag to each interaction since it involves (at the very least) a WebSocket round trip and a screenshot.
Secondly, you could be affected by geography, which has a very significant effect. If you are close to the primary server (US East, Virginia) you'll have a faster more responsive experience.
In a few minutes I'll have the HK server (Asia Pacific) back up ( I was just resizing it down, it was seeing significantly less use than the US server), and if you're closer to that you can try there.
Also, the free demo has many caps (so as to control costs). I cap the outgoing bandwidth of each user to a very low 3Mbit/s, and I use multiple ways to cap CPU usage, including (in extreme cases) killing the process. All of this means that if the page you are using wants to eat a lot of CPU (happens sometimes) then the app will slow right down for you (to preserve resources for everyone else on the system).
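A per-user bandwidth cap like the 3 Mbit/s mentioned above can be enforced with a token bucket; here is a minimal sketch (illustrative, not necessarily the platform's actual mechanism):

```python
# Token bucket: each send spends tokens; the bucket refills at the
# capped rate, and a send that would overdraw it is refused (the
# caller can then drop or delay that frame).
class TokenBucket:
    def __init__(self, rate_bps, burst_bytes, clock):
        self.rate = rate_bps / 8           # refill rate in bytes/second
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.clock = clock
        self.last = clock()

    def try_send(self, nbytes):
        now = self.clock()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False                       # over cap for now
```

Dropping frames when the bucket is empty degrades gracefully for a screenshot stream, since the next changed frame carries the full picture anyway.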
I can say confidently that it is not about the number of users. We had 100s of users at peak earlier and a single browser still felt snappy. So if you're seeing slowdown I think it is (to summarise) one of:
- You are experiencing the app for the first time, it is different to using a normal browser, interactions are slower and more choppy (but page loads should be as fast or faster).
- The page you are browsing is hitting the resource monitoring and being downregulated.
- You are link-wise far from the server (which is often, but not always, related to geography).
If you're interested in giving it another try at another time, I'm happy to arrange that. Would you be willing to leave your email at this form, so I can let you know a quieter time? Also, if you just email me at [email protected] and let me know your approximate location, I can set up a server near you and we can attempt to work out any lag issues still occurring.
Hey HN, thanks for all the love on this, and for helping me think about other use cases for this product and how to communicate about it. I really appreciate this!
Also, I noticed I spent a lot of time maintaining the demo instances (resizing). In a real deployment the number of users per machine is pretty much static, but here I've had to deal with scaling and spikes.
It occurred to me today that I could probably put the free demos behind a load balancer (smaller basic machines, and scale them up or down), so that I don't have to manually resize the instance.
I've taken down the two demo sites (free & hk) for now, while I work on the load balancer setup. Should be back up in a couple hours.
I have not worked out how to geographically load balance both of those from a single domain based on which you are closest to, but I want to see if these new smaller instances in load-balanced target group pools can scale to take the load like a larger instance.
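The "available queue" style of health check such a target pool polls can be reduced to a tiny function like this sketch (the names and the 0.9 headroom threshold are assumptions, not the actual setup):

```python
# Report healthy only while there is session capacity left, so the
# balancer shifts traffic away (and the pool scales out) before a
# box saturates.
def health(active_sessions, max_sessions, headroom=0.9):
    """Return (HTTP status, body) for the balancer's health probe."""
    if active_sessions < max_sessions * headroom:
        return 200, "ok"
    return 503, "at capacity"
```

Returning 503 before the box is literally full matters here, because each browser session is long-lived: by the time the last slot fills, the instance will stay "busy" for a whole session's duration.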
Just had a quick check: some web pages fail the click event (a mouse click won't do anything).
On the other hand, I have been using noVNC to do similar things for my own testing purposes (running Chrome inside Chrome remotely); it has worked very well for me so far.
Thanks for the report, it's very valuable and thanks for pointing me to noVNC, it looks great!
I'm very serious about usability issues; would it be possible for you to provide some examples of sites where the click failed and what you clicked on?
I try to get those things fixed ASAP because the experience of using it is so important. I think it should feel as familiar as a regular browser, as much as possible.
A bleak future flashed in front of my eyes - Google Chrome v100 - the browser runs on Google's cloud, and you are just streamed the view to your Chrome Client™. So secure, and you are so out of control.
As I understand it, it's like an extreme sandbox -- a completely separate computer (or a VM) where all the web stuff happens (javascript etc), which just ships pixels to your computer (phone/laptop etc). Ideally the complexity of the client software is low, i.e. not a web browser, and there is strict site isolation (VMs) at the sandbox side to prevent leakage from one site to another. I'm a little vague as to how this implementation works.
So this is nothing to do with a VPN as such, but of course you could host it in the cloud, or run a VPN to a cloud endpoint.
slowenough|6 years ago
Why am I open sourcing this?
How do I self-host it?
There's instructions on the repository page.
FreeHugs|6 years ago
krzkaczor|6 years ago
westi|6 years ago
jusob|6 years ago
mirimir|6 years ago
Nobody sane relies on anything that's closed-source.
Edit: OK, nobody prudent.
dlandis|6 years ago
progval|6 years ago
slowenough|6 years ago
Legogris|6 years ago
slowenough|6 years ago
throwaway58235|6 years ago
slowenough|6 years ago
pcr910303|6 years ago
judge2020|6 years ago
slowenough|6 years ago
If you want, I can test it. I'll open up an "East Asia" instance and we can see if it's any better.
slowenough|6 years ago
https://hk.cloudbrowser.xyz
wdrw|6 years ago
slowenough|6 years ago
Thank you so much for sharing this.
sdan|6 years ago
kpsychwave|6 years ago
ComodoHacker|6 years ago
Am I missing something?
wila|6 years ago
Hack one instance and get access to hundreds of users browsing the internet.
vhodges|6 years ago
https://ungleich.ch/u/blog/how-to-run-your-browser-in-the-cl... (which will ultimately lead you to https://guacamole.apache.org/)
infinitone|6 years ago
Maybe this is the Google AI team trying to get more people to solv'em? ;-)
slowenough|6 years ago
Hitton|6 years ago
rinchik|6 years ago
slowenough|6 years ago
Edit: the HK site is back up. https://hk.cloudbrowser.xyz
danbmil99|6 years ago
slowenough|6 years ago
First up, have you had any issues with site banning or CAPTCHA?
After I saw your question I wanted to know so I just tried signing into my LinkedIn from https://hk.cloudbrowser.xyz and I'll share my experience.
And first they sent a code to my email because "something seemed suspicious":
https://imgur.com/gallery/2lflmjf
When I put in the code from my email I could sign in and it worked as usual.
I have noticed that every time I land at https://bloomberg.com I get a CAPTCHA (1 only) and then I can read the site.
I opened a support ticket with BB but they said they don't need to do anything right now.
I felt OK with that. 1 CAPTCHA is not too bad.
slowenough|6 years ago
slowenough|6 years ago
I moved from a single massive instance to a target pool behind a load balancer, with health checks based on whether there's available queue capacity.
https://free.cloudbrowser.xyz
https://hk.cloudbrowser.xyz
ausjke|6 years ago
slowenough|6 years ago
AndrewThrowaway|6 years ago
dijit|6 years ago
gotts|6 years ago
How is BrowserGap ("self-host on your own machine, at home, or in a VPS, VPC or the public cloud") more secure/private than just setting up a VPN on that machine?
angry_octet|6 years ago
m1sta_|6 years ago