item 42467994

alsodumb | 1 year ago

As someone with admittedly no formal CS education, I've been using conda for all of my grad school and never managed to break it.

I create a virtual environment for every project. I install almost all packages with pip, except for any binaries or CUDA related things from conda. I always exported the conda yaml file and managed to reproduce the code/environment including the Python version. I've seen a lot of posts over time praising poetry and other tools and complaining about conda but I could never relate to any of them.

Am I doing something wrong? Or something right?
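For reference, the workflow described above hinges on the file `conda env export` writes; conda records pip-installed packages under a separate `pip:` key, which is what makes the yaml round-trippable with `conda env create -f environment.yml`. A sketch (package names and versions illustrative):

```yaml
# environment.yml as produced by `conda env export` (illustrative)
name: myproject
channels:
  - conda-forge
dependencies:
  - python=3.11          # the Python version is pinned, so it reproduces too
  - cudatoolkit=11.8     # binaries / CUDA-related things from conda
  - pip
  - pip:                 # everything installed with pip lands here
      - numpy==1.26.4
      - pandas==2.2.2
```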


duped|1 year ago

My experience with conda is that it's fine if you're the original author of whatever you're using it for and never share it with anyone else. But as a professional I usually have to pull in someone else's work and make it function on a completely different machine/environment. I've only had negative experiences with conda for that reason. IME the hard job of package management is not getting software to work in one location, but allowing that software to be moved somewhere else and used in the same way. Poetry solves that problem; conda doesn't.

Poetry isn't perfect, but it's working in an imperfect universe and at least gets the basics (lockfiles) correct to where packages can be semi-reproducible.
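For concreteness, the split poetry makes: `pyproject.toml` declares only your direct dependencies with loose constraints, while the generated `poetry.lock` pins every transitive package to an exact version and hash, which is what lets the environment be replayed on another machine. A minimal, illustrative fragment:

```toml
# pyproject.toml (illustrative) -- only the packages you asked for
[tool.poetry]
name = "myproject"
version = "0.1.0"
description = ""
authors = ["you"]

[tool.poetry.dependencies]
python = "^3.11"
requests = "^2.31"

# `poetry lock` resolves this into poetry.lock, pinning requests *and*
# all of its transitive dependencies to exact versions and hashes;
# `poetry install` on another machine then replays the lock file exactly.
```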

There's another rant to be had about the very existence of venvs as part of the solution, but that's neither poetry's nor anaconda's fault.

LarsDu88|1 year ago

Poetry is pretty slow. I think `uv` will ultimately displace it on that basis alone.

alkh|1 year ago

+1. On top of that, even with the new resolver it still takes ages to resolve a dependency for me, so sometimes I end up just using pip directly. Not sure if I'm doing something wrong (maybe you have to manually tweak something in the configs?) but it's pretty common for me to experience this.

throwawaymaths|1 year ago

imagine being a beginner to programming and being told "use venvs"

or worse, imagine being a longtime user of shells but not python and then being presented a venv as a solution to the problem that for some reason python doesn't stash deps in a subdirectory of your project
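For what it's worth, the convention of putting the venv in a `.venv` subdirectory of the project gets most of the way to "deps in a subdirectory" — the dance, as a sketch:

```shell
# Create the environment inside the project, like other ecosystems do by default
python3 -m venv .venv

# "Activation" just prepends .venv/bin to PATH for the current shell
. .venv/bin/activate

# Packages installed from now on land under .venv/ instead of the global
# site-packages; print where this interpreter puts them:
python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])"
```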

theamk|1 year ago

You are doing something right, author does some pretty unusual things:

- Set up custom kernels in Jupyter Notebook

- Hardlink the environments, then install the same packages via pip in one and conda in the others

- Install conda inside conda (!!!) and enter the nested environment

- Use tox within conda

I believe as long as you treat the environments as "cattle" (if one goes bad, remove it and re-create it from the yaml file), you should not have any problems. That's clearly not the case for the post's author, though.

fluorinerocket|1 year ago

Yep, nuke the bad env and start over. Conda is great; the only problems are when a package is not available on conda-forge or you have to compile and install with setup.py. But then you can blow the env away and start over.

bean-weevil|1 year ago

As someone with a formal computer science education, half of my friends who work in other sciences have asked me to help them fix their broken conda environments.

rcxdude|1 year ago

This is exactly the kind of thing that causes Python package nightmares. Pip is barely aware of packages it's installed itself, let alone packages from other package managers, and especially other package repositories. Mixing conda and pip is 100% doing it wrong (not that there's an easy way to do it right, but stick to one or the other; I would generally recommend just using pip, since the reasons for conda's existence are mostly irrelevant now).

skeledrew|1 year ago

I still run into cases where a pip install that fails due to some compile issue works fine via conda. It's still very relevant. It's pip that should be switched out for something like poetry.

whywhywhywhy|1 year ago

Works as fine as is possible with Python: conda to manage the environments and Python versions, and pip to install the packages.

maurosilber|1 year ago

I had the same experience. But you should try pixi, which is to conda what uv is to pip.

akdor1154|1 year ago

Isn't uv to conda what uv is to pip?

jszymborski|1 year ago

God forbid you should require conda-forge and more than three packages lest the dependency resolver take longer than the heat death of the planet to complete.

fransje26|1 year ago

Install mamba first?

fluorinerocket|1 year ago

Same but I try to use conda to install everything first, and only use pip as a last resort. If pip only installs the package and no dependency it's fine
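The "no dependency" trick above has a real flag behind it: `--no-deps` tells pip to install only the named package and skip dependency resolution entirely, leaving conda's solver in charge of everything else. Sketch (the package name is hypothetical):

```shell
# Let conda own the dependency tree and use pip only for the one missing
# package, skipping its dependencies:
#
#   python3 -m pip install --no-deps some-pure-python-package
#
# The flag is standard pip; confirm it exists:
python3 -m pip install --help | grep -- "--no-deps"
```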

throwawaymaths|1 year ago

i think you got lucky and fell into best practices on your first go

> except for any binaries or CUDA related things from conda

doing the default thing with cuda related python packages used to often result in "fuck it, reinstall linux". admittedly, i dont know how it is now. i have one machine that runs python with a gpu and it runs only one python program.

disgruntledphd2|1 year ago

> doing the default thing with cuda related python packages used to often result in "fuck it, reinstall linux"

From about 2014-17 you are correct, but it appears (on ubuntu at least), that it mostly works now. Maybe I've just gotten better at dealing with the pain though...

thangngoc89|1 year ago

1. You need to run the export manually, while the other tools you mentioned create it (the lock file) automatically.

2. They distinguish between direct dependencies (packages you added yourself) and indirect dependencies (the dependencies of those packages).
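On point 2, conda does have a partial answer: `conda env export --from-history` emits only the packages you explicitly asked for, while the plain export dumps the fully solved environment. A sketch of the difference (contents illustrative):

```yaml
# `conda env export --from-history` (illustrative): only what you typed
# after `conda install ...`, i.e. your direct dependencies
name: myproject
dependencies:
  - python=3.11
  - numpy
# The plain `conda env export`, by contrast, pins every transitive package
# the solver pulled in -- the closest conda gets to a lock file.
```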