codingjester's comments

codingjester | 11 years ago | on: Researching a new form of HTTP caching optimization

Fair enough! I can definitely see how I might have taken it too hard (or how it came off that way in my comment).

I was just intrigued by the actual claim and would have loved to see some numbers behind "higher performance than Varnish" and the scenarios where it could reach that performance. It piqued my interest for future architecture considerations.

Thanks for enlightening me more on turbocache. It's on my list of things to go through tonight.

Edit: I did want to point out my bad choice of words. "They had me until here" did not mean I didn't read the article in its entirety, just that I continued with a bit more skepticism.

codingjester | 11 years ago | on: Researching a new form of HTTP caching optimization

I think it's been said enough here, but Varnish can certainly do almost everything they say it can't, including some other storage optimizations like storing the gzipped response and serving a non-gzipped one when requested (as of Varnish 3.0).

You can use Vary for tons of caching optimizations via Varnish, such as caching mobile web pages separately from non-mobile ones, or varying on just a particular header. It's all about flexing a little bit of VCL (which, I'll admit, can sometimes throw people off).
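Just to illustrate the idea (this is a rough sketch in Varnish 3.x-era VCL; the header name and UA regex are made up for the example), varying the cache by device class can be as simple as normalizing the User-Agent into your own header and having Varnish vary on that:

```vcl
sub vcl_recv {
    # Collapse the User-Agent into a simple device class so the cache
    # doesn't fragment into one variant per unique UA string.
    if (req.http.User-Agent ~ "(?i)mobile|android|iphone") {
        set req.http.X-Device = "mobile";
    } else {
        set req.http.X-Device = "desktop";
    }
}

sub vcl_fetch {
    # Store separate cached variants per device class.
    set beresp.http.Vary = "X-Device";
}
```

(In Varnish 4.0+ the fetch hook is `vcl_backend_response` instead of `vcl_fetch`, and in a real config you'd append to an existing Vary header rather than overwrite it.)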

They had me until this part:

> This is an HTTP cache built directly in Passenger so that it can achieve much higher performance than external HTTP caches like Varnish.

And since they have no benchmarks to back up these claims, I'm skeptical they did much research against Varnish to tune or set it up. I'd love to see the numbers on Varnish vs. their turbocache. Without numbers, I have to take a lot of it with a grain of salt.

Either way, seems like it could be an extra handy thing to have in your toolbox, as long as it fits your stack.

codingjester | 11 years ago | on: Tumblr: Hashing Your Way to Handling 23,000 Blog Requests per Second

So at Tumblr, we're using Varnish as a full-page cache (we use it for parts of the API as well, for response caching), and invalidate when a blog updates (or your page can just TTL out).

I definitely agree that I wouldn't use Redis (or memcache, for that matter) for storing entire pages; it should be used more as an object cache. Even then, we use memcache for "simple" data structures and reach for Redis when we need more complex ones.

Redis is great if you need some kind of persistence as well (and it's fairly tunable), whereas memcache and Varnish are completely in memory (Varnish 4.0, I believe, is introducing persistence). So if you kick the process, that's all she wrote for your cache until it gets warm again (which has its own challenges).

Varnish also gives you a language called VCL to play around with to maximize your caching strategies and optimize the way Varnish stores things. It's got an easy API to purge content when you need to, and it supports compression for your pages out of the box without too much tuning.
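For a sense of what the purge side looks like, here's the classic Varnish 3.x idiom (the ACL contents are just a placeholder; you'd lock it down to your own hosts). Clients send an HTTP `PURGE` request for the URL they want evicted, and the VCL handles it:

```vcl
# Only allow purges from trusted addresses (placeholder ACL).
acl purgers {
    "127.0.0.1";
}

sub vcl_recv {
    if (req.request == "PURGE") {
        if (!client.ip ~ purgers) {
            error 405 "Not allowed";
        }
        return (lookup);
    }
}

sub vcl_hit {
    if (req.request == "PURGE") {
        # Evict the cached object and acknowledge.
        purge;
        error 200 "Purged";
    }
}
```

(Varnish 4.0 renamed `req.request` to `req.method` and changed the purge mechanics a bit, so treat this as a 3.x-flavored sketch.)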

If you're having issues just speeding up static content, give varnish a whirl. Spend some time with it, and you won't be disappointed.

You can also look into nginx as a caching alternative for responses, but I don't have too much experience with that. I've heard it's been used with some success, though.
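If you want to poke at that route, a minimal nginx proxy-cache setup looks roughly like this (paths, zone name, upstream, and TTLs are all illustrative, not recommendations):

```nginx
# Define an on-disk cache zone (sizing here is arbitrary).
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=pages:10m max_size=1g;

server {
    listen 80;

    location / {
        proxy_pass http://backend;        # placeholder upstream app server
        proxy_cache pages;                # use the zone defined above
        proxy_cache_valid 200 302 10m;    # cache successful responses for 10 minutes
        proxy_cache_valid 404 1m;         # cache 404s briefly
        add_header X-Cache-Status $upstream_cache_status;  # HIT/MISS, handy for debugging
    }
}
```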

codingjester | 12 years ago | on: Show HN: API for Sales Research

Where can I report bad data for your service? It has all the wrong info for me, except my name and picture. I'm guessing it's got to be from LinkedIn?

codingjester | 12 years ago | on: Python API

For the Tumblr API, the link to the client is for our v1 API which has been deprecated in favor of our v2 API.

We have an official Python client that I work on when I get time, here: https://github.com/tumblr/pytumblr, though I know there are tons of other awesome wrappers for v2 out there.

codingjester | 15 years ago | on: Got Hacking? Git Hacking.

Shit. We totally didn't even think about mobile, our bad. Once we get some sleep (and probably get back from our day jobs), we'll see what we can do about it.