item 31028584


ViralBShah | 3 years ago

"Small" is difficult, and it is actually in many cases, harder than "big". Small needs language support, GC support, runtime support, and is delicate - any one thing can throw off performance. You can't hide behind library calls. In Julia, since we can orchestrate everything, from the abstractions all the way down to the instructions, making small problems work well has been possible.

A single grad student can make large problems work (my thesis was large linear algebra and I could just hack away on enough C and MPI to get it done). In our early days on Julia, we realized that 90% of the world actually needs small linear algebra and it is a tantalizingly difficult problem. The work done in the Julia community over the years has made it all possible through a collaboration across lots of different teams and disciplines.
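The overhead argument above can be made concrete with a small sketch. This is a hypothetical NumPy benchmark (the shapes and counts are illustrative, not from the thread): one large matrix-vector product amortizes the library-call cost over millions of flops, while a Python loop over many tiny 3x3 products spends most of its time on per-call dispatch rather than arithmetic - which is exactly the regime where language and runtime support matter.

```python
import time
import numpy as np

rng = np.random.default_rng(0)

# One "big" problem: a single 1000x1000 matrix-vector product.
A_big = rng.standard_normal((1000, 1000))
x_big = rng.standard_normal(1000)

# Many "small" problems: 50,000 independent 3x3 matrix-vector products,
# the kind of shape that shows up in graphics, robotics, and statistics.
A_small = rng.standard_normal((50_000, 3, 3))
x_small = rng.standard_normal((50_000, 3))

t0 = time.perf_counter()
for _ in range(50):
    A_big @ x_big                  # one call, ~2e6 flops each: overhead amortized
t_big = time.perf_counter() - t0

t0 = time.perf_counter()
y = np.empty((50_000, 3))
for i in range(50_000):
    y[i] = A_small[i] @ x_small[i]  # ~18 flops per call: dispatch dominates
t_loop = time.perf_counter() - t0

# Batching into a single library call does the same arithmetic but
# pays the call overhead once instead of 50,000 times.
t0 = time.perf_counter()
y_batched = np.einsum('nij,nj->ni', A_small, x_small)
t_batch = time.perf_counter() - t0

print(f"looped small: {t_loop:.4f}s, batched small: {t_batch:.4f}s")
```

In languages without compiler support for small fixed-size arrays, the practical fix is batching as above; Julia's approach (e.g. StaticArrays) instead compiles the tiny operation down to straight-line code so the loop itself is fast.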


dekhn | 3 years ago

I think this is a pretty key point: the majority of users haven't been well-served by large-scale compute. I've heard from a number of folks in genomics that all they need is a faster way to invert a "big" matrix - and when they show me the matrix, it's tiny compared to what state-of-the-art supercomputers are working on.

hpcjoe | 3 years ago

One of the first questions I ask when discussing issues like this is to define "big" and "small" for me. Every group has its own set of definitions, and the differences can result in interesting conversations.

adgjlsfhk1 | 3 years ago

That's especially funny because inverting a matrix is almost never what you want to do anyway.
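For readers wondering why: the standard advice is to factor and solve rather than form an explicit inverse - `solve` costs fewer flops and is generally more accurate. A minimal NumPy sketch (sizes and tolerances are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Forming the explicit inverse and multiplying: roughly 2-3x the flops
# of a factorization, and typically a larger residual.
x_inv = np.linalg.inv(A) @ b

# Solving directly (LU factorization under the hood): what you almost
# always want when the goal is x such that A x = b.
x_solve = np.linalg.solve(A, b)

print("residual:", np.linalg.norm(A @ x_solve - b))
```

The explicit inverse is only worth materializing when you genuinely need its entries (e.g. covariance-style quantities), not when you just need to apply A^-1 to a handful of vectors.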