top | item 24042764


heyitsme | 5 years ago

does that mean that if one starts a new julia kernel and imports a package, something different happens each time they do that? if not, then it would seem one could at least save on those super long imports. Reading about this a bit more, it almost seems like whatever PackageCompiler.jl is doing could be automated and baked into the core julia executable with simple options/flags.


eigenspace | 5 years ago

There will likely be more caching in the future, but it's a hard problem and for now, they're working on lower hanging fruit to speed up compile times.

> Reading about this a bit more, it almost seems like whatever PackageCompiler.jl is doing could be automated and baked into the the core julia executable with simple options/flags.

Not really, no. Fundamentally, PackageCompiler is building a monolithic executable with your desired packages baked into it. Every time you want to add a new method to compile, you need to rebuild the whole thing and you cause it to be larger on the disk and slower to start up.
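
To make the workflow concrete, here is a minimal sketch of what building such a monolithic sysimage looks like with PackageCompiler, assuming it is installed; `Example` is a placeholder for whatever packages you want baked in:

```julia
# Sketch: build a custom sysimage with PackageCompiler.jl (run from a REPL).
# `Example` is a stand-in; substitute the packages you actually use.
using PackageCompiler

create_sysimage(
    [:Example];                        # packages compiled into the image
    sysimage_path = "sys_example.so",  # output shared library
)
```

You then start Julia with `julia --sysimage sys_example.so`. The point above is that adding one more package or method later means rerunning `create_sysimage` and rebuilding the whole image from scratch, and every addition makes it larger on disk and slower to start.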

Even if we could quickly cache methods, I have hundreds of Julia packages installed locally on my machine, and regularly call methods with exotic signatures. If I baked every method I ever compiled into my sysimage, it'd probably be hundreds of terabytes in size at least. You have to remember that every time I call a function f on arguments x, y, and z, Julia needs to compile a new method for each distinct signature:

    f(::typeof(x), ::typeof(y), ::typeof(z))

There are more Julia function signatures that it's possible to create and compile from just Base functions and types than there are atoms in the universe.
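
You can watch this per-signature compilation happen in the REPL; a quick sketch, where `g` is a throwaway example function:

```julia
# Each distinct tuple of argument types gets its own compiled specialization.
g(x, y, z) = x + y + z

@time g(1, 2, 3)    # first (Int, Int, Int) call: timing includes JIT compilation
@time g(1, 2, 3)    # same signature again: already compiled, far faster

@time g(1.0, 2, 3)  # new signature (Float64, Int, Int): compiles a fresh specialization

# `precompile` requests compilation for a specific signature ahead of any call:
precompile(g, (ComplexF64, Int, Int))
```

Each new argument-type combination pays the compile cost once, which is why the space of possible signatures, not the number of functions, is what blows up.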

Of course, it's possible to do more caching, and do it faster, than we currently do, but I just want to emphasize that it's a hard problem.