top | item 43754083

samspot | 10 months ago

When you achieve expertise you know when to break the rules. Until then it is wise to avoid premature optimization. In many cases understandable code is far more important.

I was working with a peer on a click handler for a web button. The code ran in 5-10ms. You have nearly a 200ms budget before a user notices sluggishness. My peer "optimized" the 10ms click handler to the point of absolute illegibility, and it was doubtful the new implementation was even faster.

rvz|10 months ago

It depends on your infrastructure spend and the business revenue: if the problem isn't forcing the business to increase infrastructure spending each month, and there's little to no rise in user complaints about slowdowns, then the "optimization" isn't worth it and is premature.

Most commonly, if costs increase as users increase, it becomes an efficiency problem: the scaling is neither good nor sustainable, which can easily destroy a startup.

In this case, the Linux kernel is directly critical for applications in AI, real-time systems, networking, databases, etc., and performance optimizations there make a massive difference.

This article is a great example of properly using compiler optimizations to significantly improve performance of the service. [0]

[0] https://medium.com/@utsavmadaan823/how-we-slashed-api-respon...

adrianN|10 months ago

DBs can compile and run complex queries in that time budget. What did the click handler do?