(no title)
joppy | 9 months ago
One thing a conda package can do which a PyPI package cannot is have binary dependencies: a conda package is linked upon installation, and packages can declare dependencies on shared libraries. A common example is numeric libraries depending on a BLAS implementation: in a conda/pixi environment you will get exactly one BLAS shared library linked into your process, used by numpy, scipy, optimisers, etc. For foundational libraries like BLAS which have multiple implementations, the user even has the power to consistently switch the implementation within the environment, e.g. from OpenBLAS to Intel’s MKL.
The PyPI package format does not allow binary dependencies: wheels must be self-contained when it comes to binary code (though not when it comes to Python code, which hopefully makes it clear that something here is inconsistent). Take any numerical Python environment and count the copies of BLAS it contains: it is probably 3-5, each running its own threadpool.
Another very simple example is built-in modules that depend on native code, like the sqlite3 module. In a conda/pixi installation you are guaranteed that the python binary links against the same sqlite3 library as the command-line sqlite3 tool in the same environment. Guarantees like this remove many cross-language and cross-tool hassles.
I prefer uv or poetry if I’m doing anything simple or pure Python (or perhaps with a small binary dependency like an event loop). But pixi is the way to go for large environments with lots of extra tools and numerical libraries.