This was inspired by a short discussion on Twitter between mperham and patio11 last weekend.[1] The basic app works great right now but I haven't put the billing code in quite yet. If you're interested, sign up for the list[2] and I'll send you an invite (with a discount) when it's ready to go.
Love the simple site and focus. I've been working on a similar project focusing more on visual diffing, but haven't been able to launch yet. Still building the server infrastructure, product focus, and working out the billing.
Along these lines, I've always wanted a repo specific wayback machine that would generate a state of your site for every commit, so you could browse what your site looked like at every point along the way as you built your repo.
So I could browse back to last week and see what my site looked like, or to my 3rd commit and see what my site looked like then.
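A rough sketch of how that idea might be wired up: label each capture with the commit hash it reflects, so a screenshot job can file snapshots per commit. The function names and naming scheme here are made up for illustration.

```python
# Hypothetical sketch: tie each site capture to the commit it reflects,
# so you can browse back to "what the site looked like at commit X".
import subprocess

def current_commit() -> str:
    """Short hash of HEAD, read from git (run inside the repo)."""
    return subprocess.check_output(
        ["git", "rev-parse", "--short", "HEAD"], text=True
    ).strip()

def capture_name(site: str, commit: str) -> str:
    """Filename tying one screenshot to one commit."""
    return f"{site}@{commit}.png"
```

Running `current_commit()` from a post-commit hook and passing the result to the screenshot job would give a browsable per-commit archive.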
What we need are some Nielsen families with full videos of their everyday surfing - just whatever they're doing, 24/7, saved for posterity. Imagine seeing how someone used the 'net in 1994 and 2004. It would be an interesting museum, to say the least.
Question: is the number of URLs such a huge burden?
It's not necessarily a huge burden, but if someone wants to run 10K URLs through this thing on a daily basis we'd need to have a chat. As of today it's not ready to scale that big.
Are the screenshots being grabbed by Pagesnap or the website's end? Do I have to include some sort of JS snippet? Could you use this to monitor the competition's landing page?
I think the real benefit of sites like this is accountability and historical research. Having the ability to see what has been omitted or taken down is far more interesting.
Reading some old Steve Yegge posts (e.g. https://sites.google.com/site/steveyegge2/five-essential-pho...) I started wondering about the feasibility of a web service that would automatically change links in articles older than, say, 5 years to the equivalent Web Archive/Google cache/other links.
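The core of such a service could lean on the Wayback Machine's URL scheme, where `/web/<timestamp>/<url>` redirects to the capture nearest that timestamp. A minimal sketch (the function name and the choice of January 1 as the target date are assumptions):

```python
def wayback_url(url: str, year: int) -> str:
    """Rewrite a link to point at the Wayback Machine snapshot nearest
    Jan 1 of `year`. The /web/<timestamp>/<url> form redirects to the
    closest capture the archive actually holds."""
    return f"https://web.archive.org/web/{year}0101000000/{url}"
```

A link-rewriting pass over an old article would then just map each outbound URL through `wayback_url(url, publication_year)`.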
Very cool, and I think the pricing is just right. Why are there 3 "Start now" buttons all together, though? Does each correspond to the plan you'd like to buy?
This would solve a large problem I have, if it could be taught how to click on a few well-named #ids my dynamic page has. It's not obvious whether it can do this.
derwiki | 12 years ago
https://www.dailysitesnap.com
There's a payment form but I never hooked anything up -- it's all free. You can sign up to have a screenshot of any URL emailed to you on a daily basis, or opt out of emails and just have it archived on the site. I've been collecting daily screenshots of ~20 public web sites for the last few months:
https://www.dailysitesnap.com/public
EDIT: fixed embarrassing typo, good morning everyone!
derwiki | 12 years ago
https://news.ycombinator.com/item?id=6612286
But when I turned it into a real service, there was virtually no interest:
https://news.ycombinator.com/item?id=6654544
zrail, I've got some Ruby code that hits Selenium to take these screenshots. Email adam at cameralends if you're interested, you can have it too.
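For anyone curious what a Selenium screenshot job like this looks like, here is a minimal sketch (in Python rather than the Ruby mentioned above; the headless-Chrome options and filename scheme are assumptions, not the actual code on offer):

```python
# Hypothetical daily-screenshot job using Selenium's headless Chrome driver.
import datetime

def snapshot_filename(url: str, day: datetime.date) -> str:
    """Derive a stable per-day filename from a URL."""
    slug = url.split("//", 1)[-1].strip("/").replace("/", "_")
    return f"{slug}-{day.isoformat()}.png"

def take_screenshot(url: str, out_dir: str = ".") -> str:
    """Load the page headlessly and save a PNG; returns the file path."""
    from selenium import webdriver  # requires selenium + a Chrome install
    opts = webdriver.ChromeOptions()
    opts.add_argument("--headless=new")
    driver = webdriver.Chrome(options=opts)
    try:
        driver.set_window_size(1280, 1024)
        driver.get(url)
        path = f"{out_dir}/{snapshot_filename(url, datetime.date.today())}"
        driver.save_screenshot(path)
        return path
    finally:
        driver.quit()
```

Run from cron once a day, this is essentially the whole pipeline; the date-stamped filenames make the archive browsable by day.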
zrail | 12 years ago
https://twitter.com/mperham/status/436976217443422208
http://www.pagesnap.io/list-signup
soneca | 12 years ago
Just kidding! ;) Smart move and good luck!
mountaineer | 12 years ago
askedrelic | 12 years ago
Thanks for sharing!
wlll | 12 years ago
http://signalvnoise.com/posts/3007-37signalscom-homepage-evo...
timjahn | 12 years ago
So I could browse back to last week and see what my site looked like, or to my 3rd commit and see what my site looked like then.
octo_t | 12 years ago
thehodge | 12 years ago
logicallee | 12 years ago
dbarlett | 12 years ago
https://github.com/bslatkin/dpxdt
pastylegs | 12 years ago
- Some sort of visual timeline would also be great, i.e. the ability to flip through screenshots with a javascript slider or something
- Export to GIFs or video might be useful
- The ability to tag information to certain screenshots would be useful for noting changes and milestones (like in Google Analytics)
minouye | 12 years ago
http://newsabovethefold.com
It's really fascinating to look at once you've collected a decent number of screenshots to animate through:
http://newsabovethefold.com/animate/1 (CNN.com every hour for the last two months--will load slowly!)
I'd definitely be interested in using this service if I wasn't already paying for Snapito.
carsonreinke | 12 years ago
andygcook | 12 years ago
At my previous startup I hacked this manually by taking a screen grab with Evernote once a month.
I've heard Alexis Ohanian mention he is thankful for having the foresight to take screenshots of early reddit builds too.
I'm sure a lot of startups would find this useful for capturing the journey of the product and then later nostalgia.
ecesena | 12 years ago
At Theneeds.com we manage about 4k websites (planning to grow to 10k) so the prices of all these services are totally out of our budget, and we ended up with an in-house solution to take screenshots.
I could see myself paying more depending on the total number of screenshots and/or the size of the screenshots, but I really don't get the difference between 1 and 1k URLs.
zrail | 12 years ago
Of course, if someone did want to run that many through I'm sure we could talk about volume pricing, given sufficient lead time to scale the app.
rschmitty | 12 years ago
zrail | 12 years ago
650REDHAIR | 12 years ago
calbear81 | 12 years ago
cl8ton | 12 years ago
zrail | 12 years ago
NikolaTesla | 12 years ago
ansimionescu | 12 years ago
toomuchtodo | 12 years ago
javascript:void(open('//web.archive.org/save/'+encodeURI(document.location)))
Which immediately saves the page to the Internet Archive. I haven't had time to look into whether the Internet Archive has an API you can submit a URL to so that it fetches the content the same way the bookmarklet does.
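The bookmarklet works by hitting the same "Save Page Now" endpoint you could call programmatically: requesting `https://web.archive.org/save/<url>` asks the archive to crawl and store a snapshot. A minimal sketch of building that request URL (the function name is an assumption):

```python
from urllib.parse import quote

def archive_save_url(page_url: str) -> str:
    """Build a Wayback Machine 'Save Page Now' URL for page_url.
    Fetching this URL (with urllib, curl, etc.) asks the Internet
    Archive to crawl and store a snapshot of the page."""
    return "https://web.archive.org/save/" + quote(page_url, safe=":/")
```

This is exactly what the `javascript:` bookmarklet above does in the browser, minus the popup window.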
sp332 | 12 years ago
You can do this with a client-side extension, like Resurrect Pages, which presents links to the Google cache (for recent sites under load) and the Internet Archive (for old links that have rotted). https://addons.mozilla.org/en-US/firefox/addon/resurrect-pag...
spindritf | 12 years ago
zrail | 12 years ago
gruseom | 12 years ago
elyrly | 12 years ago
Not daily, but it does the job.
digitalsushi | 12 years ago
kumarski | 12 years ago
mperham | 12 years ago
zrail | 12 years ago
kumarski | 12 years ago