top | item 45574104

bognition | 4 months ago

This shouldn’t be a surprise to anyone who has been using Python and has tried uv.

Python dependency management and environments have been a pain for 15 years. Poetry was nice but slow and sometimes difficult.

Uv is lightning fast and damn easy to use. It’s so functional and simple.

anitil|4 months ago

For me the most convincing argument was that it took ~3 minutes to go from 'I wonder if I should give this thing a try' to 'oh it .... it worked!?'

tclancy|4 months ago

Yeah, been doing this for over twenty years and finally got a chance to start playing with it a few months back and was confused at how I got that far that fast.

saghm|4 months ago

As someone who also hasn't really used any of the past 8 years or so of Python dependency management, it's nice that it seems to support using arbitrary other tooling as well. At some point recently I wanted to run something that happened to use pdm, which I hadn't even heard of, but I was able to invoke it with `uv tool run pdm` and not have to learn anything about how to set it up manually.

mnky9800n|4 months ago

This is how I explain it to the stragglers. Just try it because you will suddenly in about 1 to 3 minutes not know how to go back. Haha.

ziml77|4 months ago

It really is!

I switched to using uv just 2 weeks ago. Previously I had been dealing with maintaining a ton of batch jobs that used: global packages (yes, sudo pip install), manually managed virtualenvs, and docker containers.

uv beats all of them easily. Automatically handling the virtualenv means running a project that uses uv feels as easy as invoking the system Python.

Balinares|4 months ago

I just wish uv made it more straightforward to have arbitrary purpose-specific virtual environments, e.g. for building the package, for running the test suite, for dev tooling (PuDB), etc. That's one thing pixi does better, I think.

hk1337|4 months ago

It’s a little too fast, I’m having trouble believing it’s actually doing anything sometimes.

hyperbovine|4 months ago

uv is so over-the-top fast compared to what we're used to that I would argue it's actually bad for the language. Suddenly it dawns on you that by far the most capable and performant package manager (and linter) (and code formatter) (and type checker) for Python is in fact not written in Python. Leaves an odd taste. Makes you wonder what else ought not be written in Python ... or why anything should be written in Python. Here be dragons ...

ThibWeb|4 months ago

for me the surprise is the pace? I’d expect people to be set enough in their tools that it would take longer than a few months for a new tool, no matter how good, to become the majority one. Though perhaps people adopt new tools more easily in CI, where install times matter more

perrygeo|4 months ago

The pace of uv adoption is insanely fast. It's directly related to how bad the previous Python tools were/are. Even seasoned veterans set in their ways still know a better solution when they see it.

rtpg|4 months ago

uv having a good pip compatibility layer probably helped a lot, because you could try things out that way and see what fit, so to speak.

It's probably worth mentioning that Astral (The team behind uv/etc) has a team filled with people with a history of making very good CLI tooling. They probably have a very good sense for what matters in this stuff, and are thus avoiding a lot of pain.

Motivation is not enough, there's also a skill factor. And being multiple people working on it "full time"-ish means you can get so much done, especially before the backwards-compat issues really start to pile up

scuff3d|4 months ago

uv was really smart in the way they integrated with existing solutions. My whole team just switched over from pip, and it was painless. We were already using pyproject.toml files which made it even easier, but uv also has documentation for transitioning from requirements.txt files.
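For anyone curious what that transition looks like, uv's docs describe importing an existing requirements.txt directly; a minimal sketch, assuming a project that has a requirements.txt but no pyproject.toml yet:

```shell
# Sketch of a requirements.txt -> pyproject.toml migration with uv.
uv init --bare              # write a minimal pyproject.toml
uv add -r requirements.txt  # copy the requirements in as [project] dependencies
uv lock                     # resolve and write uv.lock
uv sync                     # create .venv from the lockfile
```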

simonw|4 months ago

uv first came out 15th February 2024 so it's a year and a half old now. Still pretty impressive for it to get adoption this fast though.

lukeschlather|4 months ago

I feel like I've tried at least 5 different package management tools for python. Between pip, poetry, pip-tools, pipx, I'm not really sure what easy_install, egg, pkg_info are, but I do know I have always been surprised I need to care.

It sounds like uv is a drop-in replacement for pip, pipx, and poetry with all of their benefits and none of the downsides, so I don't see why I wouldn't migrate to it overnight.

bognition|4 months ago

Honestly, I was skeptical when I learned about uv. I thought: just what Python needs, another dependency manager… this was after fighting with pip, venv, venvwrapper, and poetry for years.

Then I gave it a try and it just worked! It’s so much better that I immediately moved all my Python projects to it.

WD-42|4 months ago

I think it’s been long enough now. Uv just has so much velocity. Pyproject.toml and PEP support just keep getting better.

Poetry, which I think is the closest analogue, still requires a [tool.poetry.dependencies] section afaik.
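For illustration, the two layouts side by side; package names and versions here are hypothetical:

```toml
# Standard PEP 621 metadata (what uv and other standards-based tools read):
[project]
name = "myapp"
dependencies = ["requests>=2.31"]

# The Poetry-specific table the parent comment refers to:
[tool.poetry.dependencies]
python = "^3.11"
requests = "^2.31"
```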

walkabout|4 months ago

Is this like when everyone on here had already been saying Yarn was a no-brainer replacement for npm, having totally obsoleted it, for like two-plus years, but it was still lacking safety/sanity checks, missing features, and broke in bizarre ways on lots of packages in-the-wild?

Or is the superior replacement actually up to the job this time?

dmd|4 months ago

It really is just that good. That is why it's had such massive uptake. No matter how many times you've been burned before, no matter how skeptical you are, it's so good that, seriously, just try it, and you'll be instantly converted.

kstrauser|4 months ago

I’m certain there’s going to be some bizarre edge case where pip is fine and uv isn’t. It’s inevitable. However, in every situation where I’ve used it, uv has been better than pip or poetry or any other package manager I’ve ever used.

I just found out they’re still making pipenv. Yes, if you’re using pipenv, I’m confident that uv will be a better experience in every way, except maybe “I like using pipenv so I can take long coffee breaks every time I run it”.

andy99|4 months ago

I’ll bite - I could care less about speed, that feels like a talking point I see often repeated despite other package managers not being particularly slow. Maybe there’s some workload I’m missing that this is more important for?

I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason. I know that’s anecdotal and I’m sure it mostly works, but it obviously was off-putting. For better or worse I know how to use conda, and despite having no special attachment to it, slightly faster with a whole different set of rough edges is not at all compelling.

I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.

I’d like to hear a real reason I would ever migrate to it, and honestly if there isn’t one, am super annoyed about having it forced on me.

simonw|4 months ago

The place where speed really matters is in virtual environment management.

uv uses some very neat tricks involving hard links such that if you start a new uv-managed virtual environment and install packages into it that you've used previously, the packages are hard-linked in. This means the new environment becomes usable almost instantly and you don't end up wasting filesystem space on a bunch of duplicate files.

This means it's no longer expensive to have dozens, hundreds or even thousands of environments on a machine. This is fantastic for people like myself who work on a lot of different projects at once.
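The no-duplicate-files part comes down to how hard links work: a hard link is a second directory entry for the same inode, so the linked file costs no extra disk space. A minimal Python illustration of the OS mechanism (not uv's actual code):

```python
import os
import tempfile

# uv keeps one copy of each package file in its cache and hard-links it
# into new virtual environments instead of copying it. Two names, one
# inode, one copy on disk:
tmp = tempfile.mkdtemp()
cached = os.path.join(tmp, "module.py")        # stand-in for a cached file
with open(cached, "w") as f:
    f.write("print('hello')\n")

in_venv = os.path.join(tmp, "venv_module.py")  # stand-in for the venv entry
os.link(cached, in_venv)                       # hard link, not a copy

same_inode = os.path.samefile(cached, in_venv)
link_count = os.stat(cached).st_nlink
```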

Then you can use "uv run" to run Python code in a brand new temporary environment that gets created on demand within milliseconds of you launching it.

I wrote a Bash script the other day that lets me do this in any Python project directory that includes a setup.py or pyproject.toml file:

  uv-test -p 3.11

That will run pytest with Python 3.11 (or 3.12/3.13/3.14/whatever version you like) against the current project, in a fresh isolated environment, without any risk of conflicting with anything else. And it's fast - the overhead of that environment setup is negligible.

Which means I can test any code I like against different Python versions without any extra steps.

https://til.simonwillison.net/python/uv-tests
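The linked TIL has the real script; a hypothetical sketch of a wrapper like that, built on uv's documented `--python` and `--with` flags (the script name and default version are assumptions):

```shell
#!/usr/bin/env bash
# uv-test: hypothetical sketch, not the script from the TIL above.
set -euo pipefail

python_version="3.13"   # assumed default
while getopts "p:" opt; do
  case "$opt" in
    p) python_version="$OPTARG" ;;
    *) exit 1 ;;
  esac
done
shift $((OPTIND - 1))

# Install the current project plus pytest into a throwaway environment
# for the requested interpreter, then run the tests.
uv run --python "$python_version" --with pytest pytest "$@"
```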

eslaught|4 months ago

Conda doesn't do lock files. If you look into it, the best you can do is freeze your entire environment. Aside from being an entirely manual process, with all the problems manual processes bring, this comes with a few issues:

1. If you edit any dependency, you resolve the environment from scratch. There is no way to update just one dependency.

2. Conda "lock" files are just the hashes of all the packages you happened to get, and that means they're non-portable. If you move from x86 to ARM, or Mac to Linux, or CPU to GPU, you have to throw everything out and re-solve.

Point (2) has an additional hidden cost: unless you go massively out of your way, all your platforms can end up on different versions. That's because solving every environment is a manual process and it's unlikely you're taking the time to run through 6+ different options all at once. So if different users solve the environments on different days from the same human-readable environment file, there's no reason to expect them to be in sync. They'll slowly diverge over time and you'll start to see breakage because the versions diverge.
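For contrast, both pain points map onto specific uv commands; `requests` here is just a hypothetical dependency:

```shell
# (1) Update a single dependency without re-solving the whole environment:
uv lock --upgrade-package requests

# (2) uv.lock records a universal, cross-platform resolution, so the same
# lockfile drives installs on x86/ARM, macOS/Linux; each machine just runs:
uv sync
```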

P.S. if you do want a "uv for Conda packages", see Pixi [1], which has a lot of the benefits of uv (e.g., lock files) but works out of the box with Conda's package ecosystem.

[1]: https://pixi.sh/latest/

zbentley|4 months ago

> I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.

When I first started using uv, I did not know what language it was written in; it was a good tool which worked far better than its predecessors (and I used pdm/pipenv/pyenv/etc. pretty heavily and in non-basic ways). I still don’t particularly care if it’s written in Rust or Brainfuck, it works well. Rust is just a way to get to “don’t bootstrap Python environments in Python or shell”.

> I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason.

I’m curious what issues you encountered. Were these bugs/failures of uv, issues using it in a specific environment, or workflow patterns that it didn’t support? Or something else entirely?

gre|4 months ago

You've never waited 10 minutes for conda to solve your environment and then say it's unsolvable?

cgearhart|4 months ago

I’ve been trying uv lately to replace my normal workflow of selecting a python with pyenv for the shell, then making a venv, then installing a bunch of default packages (pandas, Jupyter, etc). So far the only benefit is that I can use just the one tool for what used to take 3 (pyenv, venv, pip). I don’t _hate_ it…but it really isn’t much of an improvement.

morshu9001|4 months ago

uv is comparable to npm. All your deps get auto tracked in a file. There are other things that do this, but pip isn't one of them, and I vaguely remember the others being less convenient.

The speed usually doesn't matter, but one time I did have to use it to auto-figure-out compatible deps in a preexisting project, because the pip equivalent with backtracking was taking forever with the CPU pegged at 100%.

markkitti|4 months ago

What tooling do you use?

testdelacc1|4 months ago

Everyone downvoting you and disagreeing - don’t listen to them! I’m here to tell you that there is a massive conspiracy and everyone is in on it. Commenters on HN get paid every time someone downloads a Rust tool, that’s why they’re trying to convince you to use uv. It’s definitely not because they used it and found it worked well for them.

> could care less

I think “couldn’t care less” works better.

fragmede|4 months ago

Being forced to use a tool you don't want to use sucks, no matter how awesome that tool may or may not actually be. *conda and uv have roughly the same goals, which means they're quite similar. For me, the speed of uv really does set it apart. For Python programs with lots of dependencies, it's enough faster that I found it worth climbing its learning curve. (ChatGPT makes that curve rather flat.) pip install -r requirements.txt went from a coffee break to me watching uv create the venv. But okay, speed gains aren't going to convince you.

Both of them manage venvs, but where the venv goes (by default) makes a difference, imo. Conda defaults to a user-level directory, e.g. ~/.conda/envs/my-venv. uv prefers a .venv dir in the project's folder. It's small, but it means per-project venvs are slightly more ergonomic with uv. Whereas with conda, because they're shared under the homedir, it's easy to get lazy once you have a working venv and reuse that good working venv across multiple programs, and then it breaks when one program needs its dependencies updated and now it's broken for all of them. Naturally that would never happen to a skilled conda operator, so I'll just say per-project uv venv creation and recreation flows just that tiny bit smoother, because I can just run "rm -rf .venv" and not worry about breaking other things.

One annoyance I have with uv is that it really wants to use the latest version of Python it knows about, and sometimes that version is too new for a program or one of its dependencies, and the program won't run. Running "uv venv --python 3.12" instead of "uv venv" isn't onerous, but it's annoying enough to mention. (pyproject.toml lets projects specify version requirements, but they're not always right.) Arguably that's a Python issue and not uv's, but as users, we just want things to work, dammit. That's always the first thing I look for when things don't work.

As mentioned, with uv the project venv lives in .venv inside the project's directory, which lets "uv run program.py" cheat. Who amongst us hasn't forgotten to "source .venv/bin/activate" and been confused when things "suddenly" stopped working? So if you're in the project directory, "uv run" will automatically use the project's .venv dir.

As far as it being pushed to promote Rust: I'm sure there's a non-zero number of people for whom that's true, but personally, since Rust makes it harder for me to contribute to uv, it's actually a point against it. Sometimes I wonder how fast it would be if it were written in Python using the same algorithms, but run under PyPy.

Anyway, I wouldn't say any of that's revolutionary. Programs exist to translate between the different project file types (requirements.txt/environment.yml/pyproject.toml) so if you're already comfortable with conda and don't want to use uv, and you're not administering any shared system(s), I'd just stick the command to generate environment.yml from pyproject.toml on a cheat sheet somewhere.

---

One bug I ran into with one of the condas (I forget which) is that it called out to pip under the hood in interactive mode, pip got stuck waiting for user input, and that conda just sat there waiting for input that would never come. Forums were filled with reports from users who had let it run for hours or even days. I fixed it, but it soured me on *conda, unfortunately.

atoav|4 months ago

Started converting every repo over to uv. I had some weird and hard to deal with dependencies before. Every single one was easier to solve than before. It just works and is blazingly fast.

Absolute no-brainer.

pjmlp|4 months ago

I never got why.

I've been using Python since version 1.6, mainly for OS scripting, because I'd rather use something with JIT/AOT in the box for application software.

Still, a little setup script to change environment variables like PYTHONPATH, PATH and a few other things always did the trick.
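A sketch of the kind of setup script being described; the project path is hypothetical, and the script is meant to be sourced into the shell:

```shell
# activate-myproj.sh (hypothetical): put the project on PYTHONPATH and PATH.
PROJECT_ROOT="$HOME/projects/myproj"
export PYTHONPATH="$PROJECT_ROOT/src${PYTHONPATH:+:$PYTHONPATH}"
export PATH="$PROJECT_ROOT/bin:$PATH"
```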

Never got to spend hours tracking down problems caused by the multiple solutions that are supposed to solve Python's problems.

WhyNotHugo|4 months ago

uv is weird. It's like 5 entirely different tools mashed and entangled into one program.

Last I tried it, it insisted on downloading a dynamically linked Python and installing that. This obviously doesn't work, you can't distribute dynamically linked binaries for Linux and expect them to work on any distribution (I keep seeing this pattern and I guess it's because this typically works on macOS?).

Moreover, my distribution already has a package manager which can install Python. I get that some niche cases might need this functionality, but that should most definitely be a separate tool. The problem isn't just that the functionality is in the same binary, but also that it can get triggered when you're using another of its functionalities.
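For what it's worth, uv's configuration docs do describe settings to opt out of the interpreter downloads; a sketch of a uv.toml (or a [tool.uv] table in pyproject.toml) doing so:

```toml
# Never download interpreters; only use Pythons already on the system.
python-downloads = "never"
python-preference = "only-system"
```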

I wish this had been made into actual separate tools, where the useful ones can be adopted and the others ignored. And, most important, where the ecosystem can iterate on a single tool. Having "one tool that does 5 things" makes it really hard to iterate on a new tool that does only one of those things in a better way.

It's pretty disappointing to see the Python ecosystem move in this direction.

Balinares|4 months ago

Your distro's package manager cannot install arbitrary versions of Python such as might be required by a specific Python project and it cannot install anything at all for individual users without root access. These are two different tools that serve two different purposes.