_vvhw | 3 years ago
It's also made TigerBeetle's code more reliable, because tests can assert that limits are never exceeded. This has caught rare leaks that might otherwise have surfaced only in production.
pierrebai | 3 years ago
I mean, it's not like memory exhaustion is a common occurrence nowadays. These kinds of compromises, where one trades an almost non-existent failure mode for a fragile, assumption-ridden code base, do not sound like a wise choice.
I mean, if the particular process is so tightly integrated that you can know you have a 1:1 relationship between X and Y, you can also tie those together with dynamic allocations. I find it hard to believe that you could easily show that X and Y allocations are tied together under a static allocation scheme, but that the same would not hold under dynamic allocation.