codingjester's comments
codingjester | 11 years ago | on: Researching a new form of HTTP caching optimization
You can use Vary for tons of caching optimizations via Varnish, such as caching mobile web pages vs. non-mobile web pages, or varying on just a particular header. It's all about flexing a little bit of VCL (which, I'll admit, can sometimes throw people off).
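The Vary mechanism described above can be sketched as a toy cache in Python (a hypothetical illustration, not Varnish itself): the cache key combines the URL with the values of whatever request headers the response's Vary header names, so a mobile and a desktop User-Agent each get their own cached entry.

```python
# Toy illustration of HTTP Vary-based cache keying (not real Varnish/VCL).
# The cache key is the URL plus the request-header values listed in the
# response's Vary header, so variants are stored separately.

class VaryCache:
    def __init__(self):
        self.store = {}

    def _key(self, url, request_headers, vary):
        # Header names are case-insensitive in HTTP, so normalize them.
        parts = tuple(request_headers.get(h.lower(), "") for h in vary)
        return (url,) + parts

    def put(self, url, request_headers, vary, body):
        self.store[self._key(url, request_headers, vary)] = body

    def get(self, url, request_headers, vary):
        return self.store.get(self._key(url, request_headers, vary))

cache = VaryCache()
vary = ["user-agent"]  # hypothetical: vary responses on User-Agent
cache.put("/home", {"user-agent": "mobile"}, vary, "<mobile page>")
cache.put("/home", {"user-agent": "desktop"}, vary, "<desktop page>")

assert cache.get("/home", {"user-agent": "mobile"}, vary) == "<mobile page>"
assert cache.get("/home", {"user-agent": "desktop"}, vary) == "<desktop page>"
```

In real Varnish you'd typically normalize the User-Agent down to a couple of buckets (e.g. "mobile" / "desktop") in VCL first, since varying on the raw header value would fragment the cache badly.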
They had me until this part:
> This is an HTTP cache built directly in Passenger so that it can achieve much higher performance than external HTTP caches like Varnish.
And since they provide no benchmarks to back up these claims, I'm skeptical they did much research against a tuned Varnish setup. I'd love to see the numbers on Varnish vs. their turbocache. Without numbers, I have to take a lot of it with a grain of salt.
Either way, seems like it could be an extra handy thing to have in your toolbox, as long as it fits your stack.
codingjester | 11 years ago | on: Tumblr: Hashing Your Way to Handling 23,000 Blog Requests per Second
I definitely agree that I wouldn't use Redis (or memcache, for that matter) for storing entire pages; it should be used more as an object cache. Even then, we use memcache for "simple" data structures, and when we need more complex data structures, we use Redis.
Redis is great if you need some kind of persistence as well (and it's fairly tunable), whereas memcache and Varnish are completely in memory (Varnish 4.0, I believe, is introducing persistence). So you kick the process, and that's all she wrote for your cache until it gets warm again. (Which has its own challenges.)
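For reference, the persistence Redis offers is configured in redis.conf; a minimal sketch (the directive names are real Redis options, but the values here are purely illustrative):

```
# redis.conf snippet: snapshot to disk if at least 1 key changed in
# the last 900 seconds, and keep an append-only log for finer-grained
# durability across restarts.
save 900 1
appendonly yes
appendfsync everysec
```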
Varnish also gives you a language called VCL to play around with, to maximize your caching strategies and control how Varnish stores things. It has an easy API for purging content when you need to, and it should support compression for your pages out of the box without too much tuning.
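The purge operation mentioned above boils down to evicting every cached variant of a URL so the next request misses and refills from the backend; a toy sketch in Python (hypothetical code, not Varnish's actual implementation):

```python
# Toy sketch of cache purging: drop every cached variant of a URL so
# the next request is a miss and refetches from the backend.
# Keys are (url, variant) tuples, as in a Vary-aware cache.

cache = {
    ("/post/1", "mobile"): "old mobile copy",
    ("/post/1", "desktop"): "old desktop copy",
    ("/post/2", "desktop"): "another page",
}

def purge(cache, url):
    """Remove all cached variants of the given URL."""
    for key in [k for k in cache if k[0] == url]:
        del cache[key]

purge(cache, "/post/1")
assert ("/post/1", "mobile") not in cache
assert ("/post/1", "desktop") not in cache
assert ("/post/2", "desktop") in cache  # other URLs untouched
```

In Varnish itself you'd typically trigger this with an HTTP PURGE request that VCL translates into an eviction, but the effect is the same: all variants of that URL are invalidated at once.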
If you're just having issues speeding up static content, give Varnish a whirl. Spend some time with it, and you won't be disappointed.
I believe you can also look into using nginx as a caching alternative to cache responses, but I don't have too much experience with that. I've heard it's been used with some success, though.
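nginx's proxy cache is driven by a couple of directives; a minimal sketch, assuming a single local upstream (paths, zone name, sizes, and times here are illustrative):

```nginx
# Define a cache zone on disk, then use it when proxying.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=pagecache:10m
                 max_size=1g inactive=60m;

server {
    listen 80;
    location / {
        proxy_pass http://127.0.0.1:8080;   # illustrative upstream
        proxy_cache pagecache;
        proxy_cache_valid 200 10m;          # cache 200 responses for 10 minutes
        proxy_cache_use_stale error timeout; # serve stale if backend is down
    }
}
```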
codingjester | 12 years ago | on: Show HN: API for Sales Research
codingjester | 12 years ago | on: Python API
We have an official Python client that I work on when I get time, here: https://github.com/tumblr/pytumblr, though I know there are tons of other awesome wrappers for v2 out there.
codingjester | 15 years ago | on: Got Hacking? Git Hacking.
I was just intrigued by the actual claim and would have loved to see some numbers on "higher performance than Varnish" and the scenarios where it could reach that performance. It piqued my interest for future architecture considerations.
Thanks for enlightening me more on turbocache. It's on my list of things to go through tonight.
Edit: I did want to point out my bad choice of words. "They had me until here" didn't mean I didn't read the article in its entirety, just that I continued with a bit more skepticism.