I switched to using uv just 2 weeks ago. Previously I had been dealing with maintaining a ton of batch jobs that used: global packages (yes, sudo pip install), manually managed virtualenvs, and docker containers.
uv beats all of them easily. Automatically handling the virtualenv means running a project that uses uv feels as easy as invoking the system Python.
For me the surprise is the pace. I’d expect people to be set enough in their tools that it would take longer than a few months for a new tool, no matter how good, to become the majority choice. Though perhaps people adopt new tools more easily in CI, where install times matter more.
Is this like when everyone on here had already been saying Yarn was a no-brainer replacement for npm, having totally obsoleted it, for like two-plus years, but it was still lacking safety/sanity checks, missing features, and broke in bizarre ways on lots of packages in-the-wild?
Or is the superior replacement actually up to the job this time?
I’ll bite - I couldn’t care less about speed; that feels like a talking point I see often repeated despite other package managers not being particularly slow. Maybe there’s some workload I’m missing where this matters more?
I’ve tried uv in a couple of places where it’s been forced on me, and it didn’t work for whatever reason. I know that’s anecdotal and I’m sure it mostly works, but it was obviously off-putting. For better or worse I know how to use conda, and despite having no special attachment to it, slightly faster with a whole different set of rough edges is not at all compelling.
I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.
I’d like to hear a real reason I would ever migrate to it, and honestly if there isn’t one, am super annoyed about having it forced on me.
Started converting every repo over to uv. I had some weird and hard to deal with dependencies before. Every single one was easier to solve than before. It just works and is blazingly fast.
uv is weird. It's like 5 entirely different tools mashed and entangled into one program.
Last I tried it, it insisted on downloading a dynamically linked Python and installing that. This obviously doesn't work: you can't distribute dynamically linked binaries for Linux and expect them to work on any distribution (I keep seeing this pattern, and I guess it's because this typically works on macOS?).
Moreover my distribution already has a package manager which can install Python. I get that some absolute niche cases might need this functionality, but that should most definitely be a separate tool. The problem isn't just that the functionality is in the same binary, but also that it can get triggered when you're using another of its functionalities.
I wish this had been made into actual separate tools, where the useful ones can be adopted and the others ignored. And, most important, where the ecosystem can iterate on a single tool. Having "one tool that does 5 things" makes it really hard to iterate on a new tool that does only one of those things in a better way.
It's pretty disappointing to see the Python ecosystem move in this direction.
As an outsider to Python, I never got how a language that got popular for being simple, elegant, and readable could end up with perhaps the most complex tooling situation (dependencies, envs, etc). Any time I glance at the community there seems to be a new way of doing things.
What caused python to go through these issues? Is there any fundamental design flaw ?
It's mostly about age. Python has been around for 35 years now. The first version of a Python package directory was the cheeseshop (Monty Python reference) in 2003. The earliest version of a pip-like tool was "easy_install" which - I kid you not - worked by scraping the HTML listing page of the cheeseshop and downloading zip files linked from that!
More recent languages like Node.js and Rust and Go all got to create their packaging ecosystems learning from the experiences of Perl and Python before them.
There is one part of Python that I consider a design flaw when it comes to packaging: the sys.modules global dictionary means it's not at all easy in Python to install two versions of the same package at the same time. This makes it really tricky if you have dependency A and dependency B both of which themselves require different versions of dependency C.
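The sys.modules constraint described above can be seen directly from the interpreter; here is a minimal sketch using a stdlib module (`json`) as a stand-in for a dependency:

```python
import sys

# Python caches every imported module in the process-wide sys.modules dict,
# keyed by name alone, so a second import returns the same cached object.
# One slot per name means one version of a package per process.
import json
first = sys.modules["json"]
import json  # no-op: the cached module is reused
second = sys.modules["json"]

print(first is second)  # True
```

Because the key is just the bare name, there is no place to record "dependency A wants C 1.x, dependency B wants C 2.x" within one process.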
Python 2.0 was released in October 2000. The Python ecosystem has witnessed several significant shifts in expectations for how software is built and delivered, from Slackware-style source builds to vendor packages to containers to uv just downloading a standalone binary archive. And the deadsnakes PPA and venvs, plus the ongoing awkwardness about whether pip should be writing stuff into /usr/local or ~/.local or somewhere else.
All of this alongside the rise of GitHub and free CI builders, it being trivial to depend on lots of other packages of unknown provenance, stdlib packages being completely sidelined by stuff like requests.
It’s really only in the last ten years or so that there’s been the clarity of what is a build backend vs frontend, what a lock file is and how workspace management fits into the whole picture. Distutils and setuptools are in there too.
Basically, Python’s packaging has been a mess for a long time, but uv getting almost everything right all of a sudden isn’t an accident; it’s an abrupt gelling of ideas that have been in progress for two decades.
If you read the initial bbs post by Guido introducing Python he describes it mostly as an alternative to bash. Basically a really nice scripting language with a decent standard library. I don’t think it was designed from the start to end up where it has. He created a genius syntax that people love.
1. Age; there are absurd amounts of legacy cruft. Every time you have a better idea about how to do things, you have to agonize over whether you'll be allowed to remove the old way. And then using the old ways ends up indirectly causing problems for people using the new ways.
2. There is tons of code in the Python ecosystem not written in Python. One of the most popular packages, NumPy, depends on dozens of megabytes of statically compiled C and Fortran code.
3. Age again; things were designed in an era before the modern conception of a "software ecosystem", so there was nobody imagining that one day you'd be automatically fetching all the transitive dependencies and trying to build them locally, perhaps using build systems that you'd also fetch automatically.
4. GvR didn't seem to appreciate the problem fully in the early 2010s, which is where Conda came from.
5. Age again. Old designs overlooked some security issues and bootstrapping issues (this ties into all the previous points); in particular, it was (and still is) accepted that because you can include code in any language and all sorts of weird build processes, the "build the package locally" machinery needs to run arbitrary code. But that same system was then considered acceptable for pure-Python packages for many years, and the arbitrary code was even used to define metadata. And in that code, you were expected to be able to use some functionality provided by a build system written in Python, e.g. in order to locate and operate a compiler. Which then caused bootstrapping problems, because you couldn't assume that your users had a compatible version of the main build system (Setuptools) installed, and it had to be installed in the same environment as the target for package installation. So you also didn't get build isolation, etc. It was a giant mess.
5a. So they invented a system (using pyproject.toml) that would address all those problems, and also allow for competition from other build back-ends. But the other build back-end authors mostly wanted to make all-in-one tools (like Poetry, and now, er, uv); and meanwhile it was important to keep compatibility, so a bunch of defaults were chosen that enabled legacy behaviour — and ended up giving old packages little to no reason to fix anything. Oh, and also they released the specification for the "choose the build back-end system" and "here's how installers and build back-ends communicate" years before the specification for "human-friendly input for the package metadata system".
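For readers who haven't seen it, the pyproject.toml mechanism described in 5a boils down to two tables; this is a minimal sketch (the package name is hypothetical):

```toml
# PEP 518's [build-system] table names the build back-end up front, so
# installers can bootstrap it in an isolated environment instead of assuming
# a compatible Setuptools is already installed next to the target packages.
[build-system]
requires = ["setuptools>=68"]
build-backend = "setuptools.build_meta"

# PEP 621's [project] table arrived years later, finally making core metadata
# declarative instead of arbitrary setup.py code.
[project]
name = "example-package"   # hypothetical
version = "0.1.0"
```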
Dependency management has always felt complicated. However, environment management I think is actually way simpler than people realize. Python basically just walks up directories trying to find its packages dir. A python "env" is just a copy of the python binary in its own directory. That's pretty much it. Basically all difficulties I've ever had with Python environments have been straightened out by going back to that basic understanding. I feel like the narrative about virtualenvs has always seemed scary but the reality really isn't.
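That mental model is easy to check from inside the interpreter; a small sketch using only the standard `sys` module:

```python
import sys

# Inside a virtual environment, sys.prefix points at the env directory while
# sys.base_prefix still points at the underlying installation; outside any
# venv the two are identical.
def in_virtualenv() -> bool:
    return sys.prefix != sys.base_prefix

print(in_virtualenv())
```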
BDFL left a long time ago. It’s not opinionated anymore. The language went from being small enough to fit in that guy’s head to a language controlled by committee that’s trying to please everyone.
I'm at the point where I don't touch Python without uv at all, if possible. The only downside is, now I want to use uv to install Go and Java and Debian packages too ... :(
The ability to get a random GitHub project working without messing with the system is finally making Python not scary to use.
1. Write code that crosses a certain complexity threshold. Let's say you also need compiled wheels for a performance-critical section of a library that was written in Rust, and have some non-public dependencies on a company-internal git server.
2. Try deploying said code on a fleet of servers whose exact operating system versions (and Python versions!) are totally out of your control. Bonus points for when your users need to install it themselves.
3. Wait for the people to contact you
4. Now do monthly updates on their servers while updating dependencies for your python program
If that was never your situation, congrats on your luck, but that just means you really weren't in a situation where the strengths of uv come into play. I had to wrestle with this for years.
This is where uv shines. Install uv, run with uv. Everything else just works, including getting the correct python binary, downloading the correct wheel, downloading dependencies from the non-public git repo (provided the access has been given), ensuring the updates go fine, etc.
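As a rough illustration of the non-public-git-dependency case (the server URL, package name, and tag below are placeholders, not anything from the comment above), uv lets pyproject.toml point a dependency at a private git server:

```toml
[project]
name = "batch-jobs"            # hypothetical project
version = "0.1.0"
dependencies = ["internal-lib"]

# uv resolves this name from the internal git server instead of PyPI,
# assuming the machine has access credentials configured.
[tool.uv.sources]
internal-lib = { git = "https://git.example.internal/team/internal-lib", tag = "v1.2.0" }
```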
TBH I feel the same. And for development on my laptop, that seems fine. For the Python package I'm working on now, a single run of pytest takes less than five seconds.
Where things get annoying is when I push to GitHub and Tox runs through GitHub Actions. I've set up parallel runs for each Python version, but the "Prepare Tox" step (which is where Python packages are downloaded & installed) can take up to 3 minutes, whereas the "Run Tox" step (which is where pytest runs) takes 1½ minutes.
GitHub Actions has a much better network connection than me, but the free worker VMs are much slower. That is where I would look at making a change, continuing to use pip locally but using uv in GitHub Actions.
Running a program should never require more than a single, simple run command.
If your project requires creating an env and switching to it before running, it’s a bad program and you should feel bad.
Quite frankly the fact that Python requires explaining and understanding a virtual environment is an embarrassing failure.
uv run foo.py
I never ever want running any python program to ever require more than that. And it better work first time 100%. No missing dependencies errors are ever permitted.
Also, Conda can fucking die in a fire. I will never ever install conda or mini-conda onto my system again. Keep those abominations away.
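For what it's worth, the `uv run foo.py` experience described above is what inline script metadata (PEP 723) enables; here is a minimal sketch with an empty dependency list so it stays self-contained (a real script would list its third-party dependencies there, and uv would fetch them on the fly):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []   # a real script would list e.g. third-party packages here
# ///
# With this header, `uv run foo.py` can build a matching environment on the
# fly: no pre-made venv, no missing-dependency errors.
import platform

print(f"running on Python {platform.python_version()}")
```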
If it sold itself on its merits I don’t think we’d see all these fawning posts about it. It’s a Rust fan thing. You can see how any criticism gets treated. I’m sure it works for some people and obviously if it does, then great. But it’s got this same weird cult following and pretend talk of speed that lots of Rust stuff has. It’s getting a little tiring. If you like it, use it, evangelizing is obnoxious.
It sounds like there are many Python users who have acclimated to the situation of needing three or more tools to work with Python and do not see the benefit or value of being able to do this all with one potentially faster tool.
While I understand that some have acclimated well to the prior situation and see no need to change their methods, is there really no self-awareness that having one fast tool instead of many may be objectively better?
I feel like the new terminology matches what it's doing better, though. You don't install things anymore, uv just makes the state of the world match what you asked for.
It’s my first attempt at reporting on CI downloads specifically. Interpreting this is more of an art than a science; I’d love to hear if others have ideas on what to do with this data!
I literally stopped writing Python for scripting a year ago - the distribution story was too painful. With LLMs, there's not much a dynamic language offers over something like Go even for quick scripting.
Also, on a new machine, I could never remember how to install the latest version of Python without fiddling for a while. uv solves the problem of both installation and distribution. So executing `uv run script.py` is kind of delightful now.
uv and ruff are near feature-complete and open source. It's very likely they'll survive in one form or another, and they're already better than the tools they're meant to replace.
uv still has some issues: it cannot pull from global installations the way pip can, so on Termux something like tree-sitter cannot be installed, because tree-sitter is provided by apt/pkg.
uv is super fast and great for environment management; however, it's not at all well suited to a containerised environment, unless I'm missing something fundamental (unless you like using an env in your container, that is).
uv works great in a container: you can tell it to skip creating a venv and use the system's version of Python (in this case, let's say Python 3.14 from the official Python image on Docker Hub).
The biggest wins are speed and a dependable lock file. Dependencies get installed ~10x faster than with pip, at least on my machine.
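A sketch of the venv-free container pattern discussed here, assuming the official `python:3.14-slim` image and uv's documented Docker distribution (file names like `app.py` are placeholders):

```dockerfile
FROM python:3.14-slim

# Grab the uv binary from the official image (one documented install route).
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

WORKDIR /app
COPY pyproject.toml uv.lock ./

# Install the locked dependencies straight into the image's Python instead
# of a per-project venv.
ENV UV_PROJECT_ENVIRONMENT=/usr/local
RUN uv sync --locked --no-dev --no-install-project

COPY . .
CMD ["python", "app.py"]
```

Copying the lock file before the source keeps the dependency layer cached across code-only rebuilds.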
> unless you like using an env in your container that is
A virtual environment, minimally, is a folder hierarchy and a pyvenv.cfg file with a few lines of plain text. (Generally they also contain a few dozen kilobytes of activation scripts that aren't really necessary here.) If you're willing to incur the overhead of using a container image in the first place, plus the ~35 megabyte compiled uv executable, what does a venv matter?
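The "a venv is just a folder hierarchy plus pyvenv.cfg" claim can be verified with the standard library alone:

```python
import tempfile
import venv
from pathlib import Path

# Build a bare venv (no pip) and peek inside: the whole mechanism is a small
# directory tree plus a plain-text pyvenv.cfg file.
target = Path(tempfile.mkdtemp()) / "env"
venv.EnvBuilder(with_pip=False).create(target)

cfg = (target / "pyvenv.cfg").read_text()
print(sorted(p.name for p in target.iterdir()))  # e.g. bin, include, lib, pyvenv.cfg
print(cfg.strip().splitlines()[0])               # a "home = ..." line (path varies)
```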
I haven’t really had this issue. uv’s recommendation is to mount the uv.lock and install those managed package versions into the container’s global pip environment. We haven’t had much issue at my work, where we use this to auto-manage Python developers’ execution environments at scale.
bognition|4 months ago
Python dependency management and environments have been a pain for 15 years. Poetry was nice but slow and sometimes difficult.
Uv is lightning fast and damn easy to use. It’s so functional and simple.
anitil|4 months ago
ziml77|4 months ago
hk1337|4 months ago
ThibWeb|4 months ago
walkabout|4 months ago
andy99|4 months ago
atoav|4 months ago
Absolute no-brainer.
pjmlp|4 months ago
I've been using Python since version 1.6, mainly for OS scripting, because I'd rather use something with JIT/AOT in the box for application software.
Still, having a little setup script to change environment variables for PYTHONPATH, PATH and a few other things, always did the trick.
Never got to spend hours tracking down problems caused by the multiple solutions that are supposed to solve Python's problems.
WhyNotHugo|4 months ago
kace91|4 months ago
simonw|4 months ago
mikepurvis|4 months ago
WD-42|4 months ago
zahlman|4 months ago
davesque|4 months ago
morshu9001|4 months ago
Funny thing is that decision was for modularity, but uv didn't even reuse pip.
pansa2|4 months ago
Given that, plus the breadth and complexity of its ecosystem, it makes sense that its tooling is also complex.
cgearhart|4 months ago
nomel|4 months ago
easy_install never even made it to 1.0
Still, not bad for a bunch of mostly unpaid volunteers.
lvl155|4 months ago
dgfitz|4 months ago
112233|4 months ago
icar|4 months ago
mise use -g go@1.24
mise use -g java@latest
mise use -g github:BurntSushi/ripgrep
[0]: https://mise.jdx.dev/
droelf|4 months ago
It gives you cross-platform binary packages, quickly (also written in Rust).
Alir3z4|4 months ago
Rarely do I need a different version of Python; in case I do, either I let the IDE take care of it or I just use pyenv.
I know there's the argument of being fast with uv, but most of the time, the actual downloading is the slowest part.
I'm not sure how big a project should be, before I feel pip is slow for me.
Currently, I have a project with around 50 direct dependencies and everything is installed in less than a minute with a fresh venv and without the pip cache.
Also, if I ever needed lock-file stuff, I use pipx. I've never needed package hashes the way package-lock.json records them.
Maybe, I'm just not the target audience of uv.
gooodvibes|4 months ago
Even if you only change your commands to 'uv venv ...' and 'uv pip install ...' and keep the rest of your workflow, you'll get
1. Much faster installs.
2. The option to specify the python version in the venv creation instead of having to manage multiple Python versions in some other way.
No pyproject.toml, no new commands to learn. It still seems like a win to me.
atoav|4 months ago
CaliforniaKarl|4 months ago
unknown|4 months ago
[deleted]
morshu9001|4 months ago
forrestthewoods|4 months ago
babl-yc|4 months ago
andy99|4 months ago
markkitti|4 months ago
ai-christianson|4 months ago
JoBrad|4 months ago
`uv install` = `uv sync`
`uv install rich` = `uv add rich`
saagarjha|4 months ago
drcongo|4 months ago
ThibWeb|4 months ago
rednafi|4 months ago
adfm|4 months ago
drcongo|4 months ago
0xpgm|4 months ago
Before that, I wouldn't want to be too dependent on it.
mixmastamyk|4 months ago
drcongo|4 months ago
make3|4 months ago
NSPG911|4 months ago
wishitwerentso|4 months ago
gatvol|4 months ago
nickjj|4 months ago
Both of my Docker Compose starter app examples for https://github.com/nickjj/docker-flask-example and https://github.com/nickjj/docker-django-example use uv.
I also wrote about making the switch here: https://nickjanetakis.com/blog/switching-pip-to-uv-in-a-dock...
zahlman|4 months ago
scuff3d|4 months ago
bognition|4 months ago
In my Dockerfiles I use `uv sync` to install deps vs `pip install -r requirements.txt`
And then set my command to `uv run my_command.py` vs calling Python directly.
amingilani|4 months ago
Could you elaborate?
__float|4 months ago
coeneedell|4 months ago
nomel|4 months ago
And lack of non-local venv support [2].
[1] https://github.com/astral-sh/uv/issues/10203
[2] https://github.com/astral-sh/uv/issues/1495
diath|4 months ago
What's the problem with that?
You just make your script's entry point be something like this:
nicwolff|4 months ago
whalesalad|4 months ago
scuff3d|4 months ago
hansonkd|4 months ago
aaronbrethorst|4 months ago
bkettle|4 months ago
unknown|4 months ago
[deleted]
sieabahlpark|4 months ago
[deleted]