top | item 24332780

hadsed | 5 years ago

There's an important thing missing from the HPC community, and that is a focus on software usability.

Maybe things have changed since I worked as a scientist-in-training at a government lab that had one of these crazy machines. But I doubt it.

Trying to use such a computer efficiently was one of the things that pushed me toward my current career as an engineer (and the scientific background landed me in ML), and that was a great thing for me. But a scientist who is focused on science, not on computer architecture, distributed algorithms, or the programming quirks of accelerators (thank you, deep learning, because writing raw CUDA was painful), is extremely unproductive without that kind of help.

So I'm fairly skeptical that we're really getting the best return on our investment with Cray and IBM at the helm of supercomputing.

But I have to confess I don't know what I would do to fix it. One thing comes to mind: hire a software tools team at these labs to build a nice software layer that makes the rest of the scientific teams more productive. But that's very much what I would do as a company executive, not as the director of a government science lab; I'm not sure what constraints exist on the finance or politics side. Either way, a change is definitely going to require some leadership.

skyde|5 years ago

What about languages designed for HPC, like Chapel and Fortress? Or any of the nested data-parallel languages?

All we need is an easy way to write safe, stateful parallel algorithms using distributed transactions, or purely functional data parallelism over any shape of data structure (tree, graph, ...)!
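The appeal of the purely functional approach is that a side-effect-free reduction over a tree can be evaluated across subtrees in any order, so parallelism falls out for free. A minimal sketch in Python (just an illustration, not how Chapel or a real HPC runtime does it; the `(value, children)` node shape is made up for this example):

```python
from concurrent.futures import ThreadPoolExecutor

def tree_sum(node):
    """Pure reduction over a tree of (value, children) pairs.

    No shared mutable state, so sibling subtrees can be
    reduced independently and the results combined.
    """
    value, children = node
    return value + sum(tree_sum(child) for child in children)

def parallel_tree_sum(node):
    """Reduce the top-level subtrees concurrently.

    ThreadPoolExecutor is used here only for portability of the
    sketch; a real data-parallel runtime would distribute the
    subtree reductions across cores or cluster nodes.
    """
    value, children = node
    with ThreadPoolExecutor() as pool:
        return value + sum(pool.map(tree_sum, children))

tree = (1, [(2, [(4, []), (5, [])]), (3, [])])
print(parallel_tree_sum(tree))  # 15, same as the sequential tree_sum
```

The same idea generalizes to any associative combine over any recursive structure, which is roughly what nested data-parallel languages promise to do for you automatically.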