item 39389139

daniel_grady | 2 years ago

What are some of the reasons that teams use conda (and related tools) today? As a machine learning scientist, I used conda exclusively in the mid-2010s because it was the only framework that could reliably manage Python libraries like NumPy, PyTorch, and so on, that have complex binary dependencies. Today, though, pip install works fine for those packages. What am I missing?

blactuary|2 years ago

For me personally, I prefer conda because it is dependency resolution (mamba), virtual environments, and a package repository (conda-forge) all from one base miniconda installation. And for all of my use cases, all of those just work. Dependency solving used to be painfully slow; mamba solved that. Packages used to lag behind the latest releases; setting conda-forge as my default channel solved that.

After fiddling with different solutions for years and having to start fresh with a new Python install, I've been using nothing but miniconda for years, and it just works.
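For readers who want to try the setup described above, a minimal sketch (the environment name and package list are illustrative placeholders; note that recent conda releases ship the libmamba solver by default, so a separate mamba install is often unnecessary):

```shell
# One-time setup: make conda-forge the default channel, with strict priority
conda config --add channels conda-forge
conda config --set channel_priority strict

# Create an isolated environment (name and packages are examples)
conda create -n myproject python=3.11 numpy pytorch
conda activate myproject
```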

Ringz|2 years ago

Unfortunately, far too often: tradition.

Using only Python's native tools like pip and venv simply works so well nowadays that I wonder about the purpose of many tools like poetry, etc.
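For reference, the stdlib-only workflow being described comes down to a couple of commands (the package names here are just examples):

```shell
# Create and activate an isolated environment using only the standard library
python3 -m venv .venv
source .venv/bin/activate

# Install packages into it with pip
python -m pip install numpy torch
```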

gcarvalho|2 years ago

For me it's the easiest and fastest cross-platform way to consistently install a Python version.

pip and venv work fine, but you have to get them first, and that can be a struggle for unseasoned Python devs, especially if you need a version that's not what your distro ships, and even more so on Windows and macOS.

I use micromamba [1] specifically, which is a single binary.

[1] https://mamba.readthedocs.io/en/latest/user_guide/micromamba...

pininja|2 years ago

Another reason I used to use conda was for easy native Windows installation. GPU-accelerated packages like OpenCV were especially difficult when I used it six years ago. Now there's the Windows Subsystem for Linux… has pip support dramatically improved on Windows?

nateglims|2 years ago

The biggest advantage for poetry I found, working with a lot of non-traditional software people, is that it does a lot of things by default, like pinning versions and managing virtual envs. Unfortunately, it does complicate some things.

dragonwriter|2 years ago

> Today, though, pip install works fine for those packages.

pip install works, but pip's dependency management doesn't seem to (for PyTorch, specifically). That's why projects that offer pip + requirements.txt as one of their installation methods will often have separate PyTorch installation instructions for that method, whereas if the same project supports conda, installation that way is a one-stop shop.
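As an illustration of the "separate PyTorch instructions" pattern: PyTorch's own install page tells pip users to point at a build-specific package index rather than plain PyPI (the variant tags below are examples; check pytorch.org for the tags that match your CUDA version):

```shell
# CPU-only build, from PyTorch's own package index
pip install torch --index-url https://download.pytorch.org/whl/cpu

# CUDA build (the cu121 tag is an example; see pytorch.org for current tags)
pip install torch --index-url https://download.pytorch.org/whl/cu121
```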

daniel_grady|2 years ago

> pip's dependency management doesn't seem to (for PyTorch, specifically)

That’s interesting — I’ve also had difficulties with PyTorch and dependency resolution, but only on the most recent versions of Python, for some period of time after they’re released. Picking Python 3.9 as a baseline for a project, for example, has been very reliable for PyTorch and all the related tooling.

tehnub|2 years ago

One reason to choose one over the other is the dependencies they're bundled with. Take numpy: the wheel from PyPI is bundled with OpenBLAS, while the conda package is built against Intel MKL, which can be faster. See https://numpy.org/install/
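If you want to check which BLAS a given numpy install is actually linked against, numpy can print its build configuration (the exact output format varies across numpy versions):

```shell
# Prints the BLAS/LAPACK libraries this numpy build was linked against
python3 -c "import numpy; numpy.show_config()"
```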

daniel_grady|2 years ago

That’s a great point; I didn’t know about that!