First time I’ve seen a Terms of Use clause like this one: “Light patterns, like those which might be displayed when using the Services, may result in epileptic seizures in some people. Discontinue use of the Services, if advised by your physician or you experience epileptic symptoms.”
Hrm, I've been using Netlify for static site hosting, and the ability to push from other cloud storage might actually make me want to switch, or at least try this out.
For small static sites, I really feel like it's overkill putting them in a repo or having to drag the whole site in every time from a zip file or folder, especially for small incremental changes that happen often.
If my understanding is correct, I can have a folder in another cloud storage service (or even on my PC), and when I change a single file there, Fast.io will notice that change and publish/replicate it to the live site. That's pretty neat.
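As a rough illustration of the kind of change detection described above (a hypothetical sketch, not Fast.io's actual mechanism), a minimal polling watcher needs only to snapshot modification times and republish whatever differs between passes:

```python
import os
import time

def snapshot(root):
    """Map each file path under root to its last-modified time."""
    state = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            state[path] = os.stat(path).st_mtime
    return state

def changed(before, after):
    """Paths that are new or have a different mtime since `before`."""
    return [p for p, m in after.items() if before.get(p) != m]

def watch(root, publish, interval=2.0):
    """Poll root forever; publish any file that appears or changes."""
    seen = snapshot(root)
    while True:
        time.sleep(interval)
        current = snapshot(root)
        for path in changed(seen, current):
            publish(path)  # e.g. re-upload just this one file to the live site
        seen = current
```

A real service would presumably use the storage provider's change-notification API rather than polling, but the per-file publish step is the part that makes single-file edits cheap.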
Yes, that's exactly right! As we were developing the early versions of our homepage, that's precisely how we used it. I just synced down a Google Drive folder and hacked away on code and images. Derek could check out the progress anytime he wanted by visiting the URL. Basically, our homepage was always live while we were developing it - great for our MVP.
Now that the homepage codebase is more mature and we have more people working on it, we just created a GitHub repo in that same folder. Now whatever I'm working on is deployed to fastdev.imfast.io, and when I commit to the master branch of the repo, it's synced to fastio.imfast.io (which is connected to our domains Fast.io, Fast.app, Fastio.com, etc.). I have a password-protected private dev site, but I'll keep that one to myself for now :D
Hey HN. I created Fast.io as a workflow tool for file hosting. Instead of pushing files to S3, and configuring CloudFront, I wanted more of a "syncing" experience, like Dropbox. The other problem I wanted to solve was log parsing; in other words, I didn't want to log parse. So Fast.io parses the logs for me and sends them right to Google Analytics.
It works as you would expect, and supports files up to 1GB at a price lower than S3/CloudFront.
It's been a long time in the making, I hope you like it!
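To give a sense of what the log-parsing step involves (a sketch under the assumption of Common Log Format access logs, not Fast.io's actual code), each request line can be pulled apart with a regex before being forwarded as an analytics hit:

```python
import re

# Common Log Format, e.g.:
# 203.0.113.9 - - [10/Oct/2019:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

def parse_line(line):
    """Return a dict of CLF fields, or None if the line doesn't match."""
    m = CLF.match(line)
    return m.groupdict() if m else None
```

The extracted `path` and `host` fields are roughly what an analytics backend needs to record a pageview.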
This is pretty useful! Putting blinders on, working on a project on my laptop, and knowing the folder it's in is automagically getting synced with an actual CDN, with everything updated in real time as I make changes? Sounds pretty sweet. On my last project I wasted an entire damn day setting up GitHub Pages, and on another project I wrestled with Cloudflare, a regular hosting company, and a few tools to automate my workflow. I really like that all of this comes out of the box in this product!
How is this different from Netlify? I've used Netlify as a CDN before, which syncs from a git repository. I'm trying to understand when I'd use this service.
Hi, thanks for the kind words! I'm Tom, co-founder of Fast.io. We started with a basic Bootstrap template and then heavily customized it (hand-coded). I'm really glad you like it! We've agonized over the messaging, illustrations, and animation quite a bit. I'm sure there's still a lot we can improve, but we're really proud of how it came out.
Actually, I should mention that it was really interesting to build our homepage on our own platform. We started out by roughing it in quickly in a shared Google Drive folder that deployed to a public URL (https://fastdev.imfast.io, for example). Then, once we had more people working on it and needed version control, we switched the site over to GitHub and started deploying from there. It's a little mind-bending to just hit save and watch a public site update.
Hi, yes, we contract with Cloudflare and Akamai for CDN services at scale and package them in a SaaS service that provides the rest of the stack. There are no non-HTML caching limits for us. We do have a 1GB file limit on Cloudflare and a 20GB limit on Akamai. Thanks for your interest!
Hi, not so much CDN reselling as providing a stack that fills out everything under the CDN.
We terminate the origin by syncing from an existing Cloud Storage (Dropbox, Google Drive) or Version Control System (GitHub). We also automate the analytics end by parsing the logs that would normally be in raw format.
The idea is focused on the same use case where you would put content on S3 and pair it with a CDN; our product is just easier to use, faster to set up, and at a comparable or lower price point.
Am I wrong, or is the very slick website based on a theme? I could swear I've seen it before.
To answer your question about a theme, see: https://news.ycombinator.com/item?id=21590473
Are there any open source tools that you can use to parse a log and perhaps batch-submit to Google Analytics?
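For context on what such a tool would do: Google's (now-legacy) Universal Analytics Measurement Protocol accepted raw hits over HTTP, including a batch endpoint, so a minimal submitter needs little more than the standard library. A hedged sketch; the tracking ID is a placeholder:

```python
import urllib.parse
import urllib.request

GA_BATCH_URL = "https://www.google-analytics.com/batch"
TRACKING_ID = "UA-XXXXXXX-1"  # placeholder Universal Analytics property ID

def pageview_hit(client_id, page_path):
    """Encode one Measurement Protocol pageview hit as a query string."""
    return urllib.parse.urlencode({
        "v": "1",            # protocol version
        "tid": TRACKING_ID,  # property the hit is credited to
        "cid": client_id,    # anonymous client identifier
        "t": "pageview",     # hit type
        "dp": page_path,     # document path
    })

def submit_batch(hits):
    """POST up to 20 newline-separated hits to the batch endpoint."""
    body = "\n".join(hits[:20]).encode("utf-8")
    req = urllib.request.Request(GA_BATCH_URL, data=body)
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Pairing this with a log parser (one hit per access-log line, with `cid` derived from the client IP or a hash of it) is essentially the pipeline being asked about.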