top | item 47077348

zamalek | 10 days ago

> Cross-vendor GPU support: A single codebase runs on AMD, NVIDIA, and CPU via KernelAbstractions.jl

This is why I wish Julia were the language for ML and sci comp in general, but Python is sucking all of the air out of the room.
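For readers who haven't used it, the portability claim in the quote can be sketched roughly like this. This is a hedged sketch assuming KernelAbstractions.jl's `@kernel`/`@index` API; the same kernel body is claimed to run unchanged on `CPU()`, `CUDABackend()` (from CUDA.jl), or `ROCBackend()` (from AMDGPU.jl):

```julia
using KernelAbstractions

# A vector-add kernel: each work-item handles one index.
@kernel function vadd!(c, a, b)
    i = @index(Global)
    @inbounds c[i] = a[i] + b[i]
end

# Swap CPU() for CUDABackend() / ROCBackend() with the vendor package loaded;
# the kernel definition above does not change.
backend = CPU()
a = rand(Float32, 1024)
b = rand(Float32, 1024)
c = similar(a)

vadd!(backend)(c, a, b; ndrange = length(c))
KernelAbstractions.synchronize(backend)
```

The vendor-specific array types (`CuArray`, `ROCArray`) would replace the plain `Array`s when targeting a GPU backend; only the backend object and array constructors change, not the kernel.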

jampekka|10 days ago

Maybe because Python can reasonably be used to make actual applications instead of just notebooks or REPL sessions.

yellowapple|10 days ago

What's stopping Julia from being reasonably usable for making actual applications? It's been a while since I've touched it, but I ain't seeing a whole lot in the way of obstacles there — just less inertia.

mathisfun123|10 days ago

i hope you realize this is purely because julia uses LLVM and LLVM has backends for those targets (notably absent are GPUs that do not have LLVM backends). any other language that uses LLVM could do the exact same thing (and would be hampered in exactly the same way).

majoe|9 days ago

Probably true, but one unique thing about Julia is that it exposes almost all stages of compilation to the user. From typed IR to native code generation, you can customise the compilation in many ways. Together with the power of Lisp-style metaprogramming, that's a really fine basis for powerful and performant DSLs and code transformations.

All those GPU targets are powered by libraries that are not part of Julia itself (GPUCompiler.jl). The same goes for automatic differentiation. That's remarkable, in my opinion.
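That exposure of compilation stages can be seen directly from the REPL with the standard-library reflection macros (a minimal sketch; `f` here is just a made-up example function):

```julia
using InteractiveUtils  # ships with Julia; provides the reflection macros

f(x) = 2x + 1

@code_lowered f(3)   # lowered, untyped IR
@code_typed f(3)     # type-inferred IR (the level packages like GPUCompiler.jl work with)
@code_llvm f(3)      # generated LLVM IR
@code_native f(3)    # final native machine code
```

Each macro shows the same function one stage further through the pipeline, which is what lets external packages intercept compilation and retarget it, e.g. at a GPU backend.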

So you're right that many programming languages could do it, but it's no wonder that other languages are lacking in this regard compared to Julia.