
IainIreland | 1 year ago

Those are the intended semantics of JS, but that doesn't help you when you're the one implementing JS. Somebody has to actually enforce those restrictions. Note that the code snippet is introduced with "JSArray::buffer_ can be thought of as a JSValue*, that is, a pointer to an array of JavaScript values", so there's no bounds checking on `buffer_[index]`.
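To make the distinction concrete, here is a minimal sketch modeled on the quoted snippet. All names (`JSArray`, `buffer_`, the `JSValue` stand-in) are hypothetical simplifications, not the actual engine code:

```cpp
#include <cstddef>

// Hypothetical stand-ins for the engine types in the quoted snippet.
// A real engine uses a tagged value type; a double suffices to illustrate.
using JSValue = double;
constexpr JSValue kUndefined = -1.0;  // placeholder for `undefined`

struct JSArray {
    JSValue* buffer_;  // raw pointer: carries no bounds information itself
    size_t length_;

    // Unchecked access, as in the quoted code: an out-of-bounds `index`
    // reads arbitrary memory.
    JSValue GetUnchecked(size_t index) const { return buffer_[index]; }

    // Checked access: the engine itself must enforce JS semantics by
    // returning `undefined` for any out-of-bounds index.
    JSValue GetChecked(size_t index) const {
        if (index >= length_) return kUndefined;
        return buffer_[index];
    }
};
```

The point is that nothing in the type of `buffer_` enforces the check; `GetChecked` is the engine implementor's responsibility.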

It's easy enough to rewrite this C++ code to do the right bounds checking. Writing the code in Rust would give even stronger guarantees. The key point, though, is that those guarantees don't extend to any code that you generate at runtime via just-in-time compilation. Rust is smart, but not nearly smart enough to verify that the arbitrary instructions you've emitted will perform the necessary bounds checks. If your optimizing compiler decides it can omit a bounds check because the index is already guaranteed to be in-bounds, and it's wrong, then there's no backstop to swoop in and return undefined instead of reading arbitrary memory.
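The failure mode can be sketched as a toy model of the JIT's decision (all names here are made up for illustration): if range analysis claims the index is provably in bounds, the compiler emits a raw load with no check, and correctness rests entirely on the analysis being right:

```cpp
#include <cstddef>

using JSValue = double;
constexpr JSValue kUndefined = -1.0;  // placeholder for `undefined`

// Toy model of what a JIT's range analysis claims about an index.
struct RangeInfo {
    size_t provenUpperBound;  // analysis asserts: index < provenUpperBound
};

// The compile-time decision: if the analysis proves the index is always
// below the array length, the bounds check is dead and can be dropped.
bool CanOmitBoundsCheck(const RangeInfo& info, size_t arrayLength) {
    return info.provenUpperBound <= arrayLength;
}

JSValue Load(const JSValue* buffer, size_t length, size_t index,
             const RangeInfo& info) {
    if (CanOmitBoundsCheck(info, length)) {
        // Unchecked load: correct only if the analysis is correct. If the
        // analysis is wrong, nothing at runtime catches the OOB read.
        return buffer[index];
    }
    if (index >= length) return kUndefined;  // the retained bounds check
    return buffer[index];
}
```

In a real engine this decision happens while emitting machine code, which is exactly why no compile-time analysis of the engine's own source can vouch for it.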

In short, JIT compilation means that it's ~impossible to make any safety guarantees about JS engines using compile-time static analysis, because a lot of the code they run doesn't exist until runtime.


foldr | 1 year ago

>Those are the intended semantics of JS

They're actually not. Out of bounds indexing is fine, you just get undefined as the result.

datafilter | 1 year ago

> If your optimizing compiler decides it can omit a bounds check

What is the reason for omitting OOB checks at all? Is it a big performance hit?

steveklabnik | 1 year ago

It's the same as any other dead code: if a check can never fail, there's no point in spending time executing it.
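The classic case (sketched here with made-up names) is a loop whose index is already bounded by the loop condition, so a per-element check would pass on every iteration and the optimizer drops it:

```cpp
#include <cstddef>

using JSValue = double;

// Per-element checked summation: the inner check runs on every iteration
// even though the loop condition `i < length` guarantees it can never fail.
JSValue SumChecked(const JSValue* buffer, size_t length) {
    JSValue sum = 0;
    for (size_t i = 0; i < length; ++i) {
        if (i >= length) continue;  // redundant: dead by the loop condition
        sum += buffer[i];
    }
    return sum;
}

// What the optimizer effectively produces after eliminating the dead check.
JSValue SumUnchecked(const JSValue* buffer, size_t length) {
    JSValue sum = 0;
    for (size_t i = 0; i < length; ++i) sum += buffer[i];
    return sum;
}
```

Eliminating such checks in hot loops is a meaningful win; the danger discussed upthread is when the "can never fail" reasoning is wrong.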