top | item 21476267

RMarcus | 6 years ago

My experience is entirely different -- writing HPC code for supercomputers at Los Alamos National Lab (on and off for 5 years) made me a true Rust believer.

One of the things I spent the most time on with Fortran / C++ codes was debugging wrong-result bugs. About 90% of the time, the wrong result came from some edge case where an array was freed too early, an array was accessed out of scope, or a race condition caused an array member to be updated non-deterministically. Each of these bugs required hours of debugging and was a huge time sink. Once I started working with Rust, I never encountered any of these bugs. After about a year of fighting the borrow checker, I feel my overall efficiency has greatly improved.

Now, when I go back and write or read C++ code, patterns that the Rust compiler would yell about jump out at me (multiple unprotected mutable references, cloning unique pointers), and they are generally the source of the bug I'm hunting. As sibling comments point out, a lot (but not all) of the things Rust stops you from doing are just bad practice anyway.

Of course, for GPGPU stuff I have to write CUDA or OpenCL, but those are generally small, compact kernels that are easy to reason about end to end.

I'm not suggesting that you are doing this, but I initially resisted Rust for a long time myself. Rust seemed extremely complex, and whenever I'd try to use it I would run into a wall. The loud Rust community talking about how great Rust was, and how easy it was to use once you "got it," made me feel stupid. Instead of being humble, I became arrogant, and I'd say things like "Rust is too restrictive for the high performance applications I care about" or "I write code that Rust would find unsafe but is actually super well-tuned for this architecture." These were mental excuses I made because I couldn't accept that I, a self-described "high performance computing software engineer," was having such a hard time with Rust.

It took me way longer than most to "get" Rust -- over a year of repeatedly forcing myself to learn and stumble through compiler errors before things started to click. A year after that, I'm still frequently surprised by certain aspects of the language ("really? I need a & in that match statement?" and "oh god, what does this lifetime and trait bound mean..." are two of the most common). But the parts of Rust that have clicked for me (the borrow checker and associated lifetime mechanics) make Rust very enjoyable to write.

Again, I'm not suggesting that you are falling into the same trap I did, I just wanted to post this to encourage anyone else in the "banging their head against the Rust compiler" stage to power through!

keldaris | 6 years ago

Thank you for sharing your experience! I'd be very curious to hear more about your experience doing GPGPU work alongside Rust - my understanding was that there were virtually no tools, libraries, or support for that kind of thing beyond the existence of the C FFI.

On the broader point, I suspect a large part of the reason we've had such contrasting experiences is a radically different mindset behind the C++ codebases we've dealt with. Arrays freed too early or accessed out of scope scream of exactly the kind of C++ code Rust was designed to address, and as far as I can tell it is indeed great at doing that. At the opposite extreme, when sizes and bounds are statically determined, all allocations happen at startup, and nothing ever gets freed, that entire class of issues simply doesn't arise in the first place. The reason the overwhelming majority of the bugs I debug are either silly typos or plain logic errors isn't that I'm particularly good at this; it's just a different approach to programming that's easy to pull off in simulation code (or embedded systems, or game engines) but probably rather more difficult in other kinds of applications.

Anyway, I'm glad you're enjoying Rust and I hope it'll have more of a scientific / numerics / GPGPU ecosystem in the future. More viable languages can only be a good thing for us computational scientists.