top | item 47122302


tgv | 6 days ago

There's no dynamic memory allocation in (100%) SPARK. That's really limiting. You can write "unsafe" code, but that has the same problems as Ada.


Findecanor|6 days ago

I thought SPARK got dynamic memory allocation when it adopted Rust-style ownership and borrowing (added to the SPARK 2014 language around 2019).
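For readers unfamiliar with the discipline being referenced: SPARK's pointer support borrows Rust's ownership model, where each allocation has exactly one owner and borrows are checked statically. A minimal sketch of those rules in Rust itself (not SPARK syntax):

```rust
fn main() {
    // Single ownership: `a` owns the heap allocation.
    let a = Box::new(42);
    // Ownership moves to `b`; using `a` afterwards is a compile error.
    let b = a;
    // println!("{}", a); // error[E0382]: borrow of moved value: `a`
    assert_eq!(*b, 42);

    let mut v = vec![1, 2, 3];
    {
        // Exclusive (mutable) borrow: no other access to `v` while `r` lives.
        let r = &mut v;
        r.push(4);
    } // Borrow ends here; `v` is usable again.
    assert_eq!(v, vec![1, 2, 3, 4]);
}
```

SPARK enforces an analogous single-owner, checked-borrow discipline on Ada access types, which is what lets it admit heap allocation without aliasing-related unsoundness in proofs.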

potato-peeler|6 days ago

That is true for parsers like libjs, but again, a crypto module or even networking code can still be written in SPARK, and that code is much more safety-critical.

Rochus|6 days ago

SPARK is not used for the whole system, but for the <5% of the code that is safety/security-related in a good architecture.