bognition | 4 months ago
Python dependency management and environments have been a pain for 15 years. Poetry was nice but slow and sometimes difficult.
Uv is lightning fast and damn easy to use. It’s so functional and simple.
ziml77|4 months ago
I switched to using uv just 2 weeks ago. Previously I had been dealing with maintaining a ton of batch jobs that used: global packages (yes, sudo pip install), manually managed virtualenvs, and docker containers.
uv beats all of them easily. Automatically handling the virtualenv means running a project that uses uv feels as easy as invoking the system Python.
rtpg|4 months ago
It's probably worth mentioning that Astral (the team behind uv) is filled with people with a history of making very good CLI tooling. They probably have a very good sense of what matters in this stuff, and are thus avoiding a lot of pain.
Motivation is not enough; there's also a skill factor. And having multiple people working on it "full time"-ish means you can get so much done, especially before the backwards-compat issues really start piling up.
lukeschlather|4 months ago
It sounds like uv is a drop-in replacement for pip, pipx, and poetry with all of their benefits and none of the downsides, so I don't see why I wouldn't migrate to it overnight.
bognition|4 months ago
Then I gave it a try and it just worked! It’s so much better that I immediately moved all my Python projects to it.
WD-42|4 months ago
Poetry, which I think is the closest analogue, still requires a [tool.poetry.dependencies] section afaik.
walkabout|4 months ago
Or is the superior replacement actually up to the job this time?
kstrauser|4 months ago
I just found out they’re still making pipenv. Yes, if you’re using pipenv, I’m confident that uv will be a better experience in every way, except maybe “I like using pipenv so I can take long coffee breaks every time I run it”.
andy99|4 months ago
I’ve tried uv in a couple of places where it’s been forced on me, and it didn’t work for whatever reason. I know that’s anecdotal and I’m sure it mostly works, but it was obviously off-putting. For better or worse I know how to use conda, and despite having no special attachment to it, “slightly faster with a whole different set of rough edges” is not at all compelling.
I have a feeling this is some kind of Rust fan thing and that’s where the push comes from: an attempt to insinuate it into more people’s workflows.
I’d like to hear a real reason I would ever migrate to it, and honestly, if there isn’t one, I’m super annoyed about having it forced on me.
simonw|4 months ago
uv uses some very neat tricks involving hard links: if you create a new uv-managed virtual environment and install packages into it that you've used previously, the packages are hard-linked in from uv's cache. This means the new environment becomes usable almost instantly and you don't end up wasting filesystem space on a bunch of duplicate files.
This means it's no longer expensive to have dozens, hundreds or even thousands of environments on a machine. This is fantastic for people like myself who work on a lot of different projects at once.
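The mechanism here is an ordinary filesystem hard link: the cached copy of a file and the copy inside a venv are two directory entries for the same inode, so no bytes are duplicated. A minimal demonstration of the idea (an illustration of the filesystem behavior, not uv's actual code):

```shell
# Two names, one inode: this is how one cached file can appear in many
# environments without using extra disk space.
tmp=$(mktemp -d)
echo "wheel contents" > "$tmp/cached_file"
ln "$tmp/cached_file" "$tmp/venv_file"   # hard link, not a copy
# The link count (second column of ls -l) is now 2 for both names.
links=$(ls -l "$tmp/cached_file" | awk '{print $2}')
echo "link count: $links"
rm -rf "$tmp"
```

Deleting either name only drops the link count; the data disappears when the last link goes, which is why removing one venv never breaks another.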
Then you can use "uv run" to run Python code in a brand new temporary environment that gets created on demand within milliseconds of you launching it.
I wrote a Bash script the other day that lets me do this in any Python project directory that includes a setup.py or pyproject.toml file:
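(The exact script is in the TIL linked below; this is a hypothetical sketch of the shape such a wrapper takes, built from uv's documented --python and --with flags.)

```shell
#!/usr/bin/env bash
# Hypothetical sketch (not necessarily the exact script) of a wrapper
# that tests the current project under a chosen Python version.
# Usage: uvtest [python-version]
version="${1:-3.11}"
# --python selects the interpreter; --with adds pytest to the
# throwaway environment uv creates for this one invocation.
cmd="uv run --python $version --with pytest pytest"
echo "would run: $cmd"
# To execute for real, replace the echo above with: $cmd
```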
That will run pytest with Python 3.11 (or 3.12/3.13/3.14, or whatever version you like) against the current project, in a fresh isolated environment, without any risk of conflicting with anything else. And it's fast: the overhead of that environment setup is negligible. Which means I can test any code I like against different Python versions without any extra steps.
https://til.simonwillison.net/python/uv-tests
eslaught|4 months ago
A couple of problems with Conda here: 1. If you edit any dependency, you have to resolve the environment from scratch. There is no way to update just one dependency.
2. Conda "lock" files are just the hashes of all the packages you happened to get, which means they're non-portable. If you move from x86 to ARM, Mac to Linux, or CPU to GPU, you have to throw everything out and resolve again.
Point (2) has an additional hidden cost: unless you go massively out of your way, your platforms can all end up on different versions. That's because solving every environment is a manual process, and it's unlikely you're taking the time to run through 6+ different platform combinations all at once. So if different users solve the environments on different days from the same human-readable environment file, there's no reason to expect them to be in sync. They'll slowly diverge over time and you'll start to see breakage.
P.S. if you do want a "uv for Conda packages", see Pixi [1], which has a lot of the benefits of uv (e.g., lock files) but works out of the box with Conda's package ecosystem.
[1]: https://pixi.sh/latest/
zbentley|4 months ago
When I first started using uv, I did not know what language it was written in; it was a good tool which worked far better than its predecessors (and I used pdm/pipenv/pyenv/etc. pretty heavily and in non-basic ways). I still don’t particularly care if it’s written in Rust or Brainfuck, it works well. Rust is just a way to get to “don’t bootstrap Python environments in Python or shell”.
> I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason.
I’m curious what issues you encountered. Were these bugs/failures of uv, issues using it in a specific environment, or workflow patterns that it didn’t support? Or something else entirely?
morshu9001|4 months ago
The speed usually doesn't matter, but one time I did have to use it to automatically figure out compatible deps in a preexisting project, because the pip equivalent with backtracking was taking forever with the CPU pegged at 100%.
testdelacc1|4 months ago
> could care less
I think “couldn’t care less” works better.
fragmede|4 months ago
Both of them manage venvs, but where the venv goes (by default) makes a difference, imo. Conda defaults to a user-level directory, e.g. ~/.conda/envs/my-venv. uv prefers a .venv dir in the project's folder. It's a small thing, but it means per-project venvs are slightly more ergonomic with uv. Whereas with conda, because venvs are shared under the homedir, it's easy to get lazy once you have a working venv and reuse that good working venv across multiple programs; then it breaks when one program needs its dependencies updated, and now it's broken for all of them. Naturally that would never happen to a skilled conda operator, so I'll just say per-project uv venv creation and recreation flows that tiny bit smoother, because I can just run "rm -rf .venv" and not worry about breaking other things.
One annoyance I have with uv is that it really wants to use the latest version of Python it knows about, and sometimes that version is too new for a program or one of its dependencies, and the program won't run. Running "uv venv --python 3.12" instead of "uv venv" isn't onerous, but it's annoying enough to mention. (pyproject.toml lets projects specify version requirements, but they're not always right.) Arguably that's a Python issue and not uv's, but as users, we just want things to work, dammit. That's always the first thing I look for when things don't work.
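The pyproject.toml version requirement mentioned here is the standard `requires-python` field, which uv consults when picking an interpreter. An illustrative fragment (the project name and bounds are made up):

```toml
[project]
name = "my-app"                   # illustrative name
requires-python = ">=3.10,<3.13"  # uv respects this when choosing a Python
```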
As mentioned, with uv the project venv lives in .venv inside the project's directory, which lets "uv run program.py" cheat. Who amongst us hasn't forgotten to "source .venv/bin/activate" and been confused when things "suddenly" stopped working? So if you're in the project directory, "uv run" will automatically use the project's .venv dir.
As for it being pushed to promote Rust: I'm sure there's a non-zero number of people for whom that's true, but personally, since Rust makes it harder for me to contribute to uv, that's actually a point against it. Sometimes I wonder how fast it would be if it were written in Python using the same algorithms, but run under PyPy.
Anyway, I wouldn't say any of that's revolutionary. Programs exist to translate between the different project file types (requirements.txt/environment.yml/pyproject.toml) so if you're already comfortable with conda and don't want to use uv, and you're not administering any shared system(s), I'd just stick the command to generate environment.yml from pyproject.toml on a cheat sheet somewhere.
---
One bug I ran into with one of the condas (I forget which) is that it called out to pip under the hood in interactive mode; pip got stuck waiting for user input, and that conda just sat there waiting for input that would never come. Forums were filled with reports from users talking about letting it run for hours or even days. I fixed that, but it soured me on *conda, unfortunately.
atoav|4 months ago
Absolute no-brainer.
pjmlp|4 months ago
I've been using Python since version 1.6, mainly for OS scripting, because I'd rather use something with a JIT/AOT in the box for application software.
Still, having a little setup script to set PYTHONPATH, PATH, and a few other environment variables always did the trick.
Never got to spend hours tracking down problems caused by the multiple solutions that are supposed to solve Python's problems.
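The kind of small setup script described above might look like this (the PROJECT_ROOT variable and lib/bin layout are illustrative assumptions, not details from the comment):

```shell
# Illustrative per-project environment setup of the sort described;
# PROJECT_ROOT and the lib/bin directory names are assumptions.
PROJECT_ROOT="${PROJECT_ROOT:-$PWD}"
# Prepend the project's module directory to Python's import path.
export PYTHONPATH="$PROJECT_ROOT/lib${PYTHONPATH:+:$PYTHONPATH}"
# Prepend the project's scripts directory to the shell's search path.
export PATH="$PROJECT_ROOT/bin:$PATH"
echo "PYTHONPATH=$PYTHONPATH"
```

Sourcing such a script (rather than executing it) applies the variables to the current shell session.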
WhyNotHugo|4 months ago
Last I tried it, it insisted on downloading a dynamically linked Python and installing that. This obviously doesn't work: you can't distribute dynamically linked binaries for Linux and expect them to work on any distribution. (I keep seeing this pattern, and I guess it's because this typically works on macOS?)
Moreover, my distribution already has a package manager that can install Python. I get that some absolute niche cases might need this functionality, but it should most definitely be a separate tool. The problem isn't just that the functionality is in the same binary, but also that it can get triggered while you're using another of its features.
I wish this had been made into actual separate tools, where the useful ones can be adopted and the others ignored, and, most important, where the ecosystem can iterate on a single tool. Having "one tool that does 5 things" makes it really hard to iterate on a new tool that does only one of those things in a better way.
It's pretty disappointing to see the Python ecosystem move in this direction.