(no title)
whatevermom5 | 2 months ago
For example, when you gzip a Base64-encoded picture, you end up (1) encoding it in Base64, which costs CPU and inflates the payload by roughly a third, and then (2) compressing it again, even though a JPEG is already compressed, so gzip gains little back for the extra work.
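A minimal sketch of the effect (Python; the random bytes are just a stand-in for already-compressed JPEG data, which is an assumption for illustration):

    import base64, gzip, os

    image = os.urandom(100_000)          # stand-in for an already-compressed JPEG
    encoded = base64.b64encode(image)    # roughly 33% bigger than the raw bytes
    zipped = gzip.compress(encoded)      # a second compression pass over it

    print(len(image), len(encoded), len(zipped))
    # gzip claws back much of the Base64 overhead, but the result still lands
    # near (or above) the original size -- two passes of CPU for little gain.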
I think what it boils down to is scale: if you are running a small shop and performance is not critical, sure, do everything over HTTP/1.1 if that makes you more productive. But when the numbers start to matter, designing binary protocols from scratch can save a lot of $ in my experience.
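To give a rough sense of the size difference (a hedged sketch; the record layout and field names below are made up for illustration, not a real protocol):

    import json, struct

    record = {"user_id": 123456, "score": 98.5, "active": True}

    as_json = json.dumps(record).encode()                 # self-describing text
    as_binary = struct.pack("<Qd?", 123456, 98.5, True)   # fixed layout: u64, f64, bool

    print(len(as_json), len(as_binary))   # about 50 bytes vs 17 bytes

The binary form drops the field names and delimiters entirely; the schema lives in the code on both ends instead of being repeated in every message, which is where the savings come from at high volume.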
socketcluster | 2 months ago
For example, I've seen a lot of companies obsess over minor things like shaving a few bucks off their JSON serialization, or using a C binding of some library to squeeze every last drop of efficiency out of it... while at the same time letting their software maintenance costs blow out of control, or paying astronomical cloud compute bills when they could have self-hosted for 1/20th of the price.
Also, the word "scale" is overused. What is being discussed here is performance optimization, not scalability. Scalability doesn't care about fixed overhead costs; it is about how costs grow as usage increases. Both ProtoBuf and JSON make costs grow linearly with request volume, so swapping one for the other changes the constant factor, not the scalability.
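A toy cost model makes the distinction concrete (all numbers here are invented for illustration):

    # Both serializers cost O(n) in request volume n; only the constant differs.
    def monthly_cost(requests, cpu_ms_per_request, dollars_per_cpu_ms=1e-7):
        return requests * cpu_ms_per_request * dollars_per_cpu_ms

    for n in (1_000_000, 10_000_000, 100_000_000):
        print(n, monthly_cost(n, 0.20), monthly_cost(n, 0.05))  # "JSON" vs "ProtoBuf"

Ten times the traffic costs ten times as much either way; the serializer only moves the curve up or down, it doesn't change its shape.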
The expression that comes to mind is "Penny-wise, pound-foolish." This effect is absolutely out of control in this industry.
whatevermom5 | 1 month ago
panstromek | 2 months ago