I recently switched to uv, and I cannot praise it enough. With uv, the Python ecosystem finally feels mature and polished rather than like a collection of brittle hacks.
Kudos to the uv developers for creating such an amazing piece of software!
Yeah, I switched to writing Python professionally ~4 years ago, and I've been low-key hating the ecosystem. Coming from a Java and JavaScript background, it was mostly npm/mvn install and it "just works". With Python, there's always someone being onboarded who can't get it to work. So many small issues. You have to have the correct version per project, then get the venv running. And then installing dependencies needs to build stuff because there's no wheel, so you have to set up a complete C++ and Rust toolchain etc., just to pull a small project and run it.
uv doesn't solve all this, but it's reduced the amount of ways things can go wrong by a lot. And it being fast means that the feedback-loop is much quicker.
Not a surprise. I said it before and I'll say it again: all the competing projects should just shut up shop for the good of Python. uv is so much better, it's like pushing penny-farthings after the safety bicycle has been invented.
That's rough for all the creators of poetry, pdm, pipenv, etc. to hear. They put in a ton of great work over the last decade, but I fear you may be right.
I've read so much positive feedback about uv, that I'd really like to use it, but I'm unsure if it fits my needs.
I was heavily invested into virtualenv until I had to upgrade OS versions, which upgraded the Python versions and therefore broke the venvs.
I tried to solve this by using pyenv, but the need to recompile Python on every patch release wasn't something I would accept, especially on boards like Raspberry Pis.
Then I tried miniconda which I initially only liked because of the precompiled Python binaries, and ultimately ended up using pyenv-managed miniforge so that I could run multiple "instances" of miniforge and therefore upgrade miniforge gradually.
Pyenv also has a plugin that allows adding suffixes to environments, which lets me have multiple miniforges of the same version in different locations, like miniforge-home and miniforge-media: -home keeps all files in the home dir, while -media keeps everything on a mounted NVMe. The NVMe is where I put projects with huge dependencies like CUDA, so they don't clutter home, which is contained in a VM image.
It works really well: Jupyter and VS Code can use them as kernels/interpreters, and it is fully independent of the OS's Python, so OS upgrades (22.04 -> 24.04) are no longer an issue.
But I'm reading about all these benefits of uv and wish I could use it, but somehow my setup seems to have tied my hands. I think I can't use uv in my projects.
Any recommendations?
Edit: Many of my projects share the same environment, this is absolutely normal for me. I only create a new environment if I know that it will be so complex that it might break things in existing environments.
I’m a bit confused why uv is not an option for you. You don’t need to compile Python, it manages virtualenvs for you, you can use them with Jupyter and vscode. What are you missing?
Have you checked out https://github.com/prefix-dev/pixi? It's built by the folks who developed Mamba (a faster Conda implementation). It supports PyPI dependencies using UV, offers first-class support for multi-envs and lockfiles, and can be used to manage other system dependencies like CUDA. Their CLI also embraces much of the UX of UV and other modern dependency management tools in general.
I keep reading praise about uv, and every single time I never really understand what problems it addresses.
I've got a couple of quite big Django projects for which I've used venv for years, and not once have I had any significant issues with it. Speed at times could have been better and I would have liked a full dependency lock file, but that never caused me issues.
The only thing that comes to mind is those random build failures of C/C++ dependencies. Does uv address this? I've always seen people rave about other benefits.
The benefit that uv adds is it's a one-stop-shop that's also wicked fast.
If you use venv then you have extra steps because you have to explicitly create the venv, then explicitly install the deps there with pip. If your project is designed for a specific python version then developers have to manage that separately (usually pyenv these days).
For people building apps, uv replaces venv, pip and pyenv, while being way faster at all three (you can completely rebuild the virtualenv and install the dependencies from scratch, usually in under a second, because uv is faster at creating a virtualenv than venv and is very quick at relinking dependencies from its package cache).
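As a rough sketch of that rebuild loop (assuming a project with a pyproject.toml and lockfile, and a warm uv cache):

```shell
# Throw the environment away entirely.
rm -rf .venv
# uv recreates .venv, fetching the pinned Python version if it's
# missing, and relinks all locked dependencies from its local cache.
uv sync
```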
What makes it so great for me is the effortlessness.
I often use Python for quick one off scripts. With UV I can just do `uv init`, `uv add` to add dependencies, and `uv run` whatever script I am working on. I am up and running in under a minute. I also feel confident that the setup isn't going to randomly break in a few weeks.
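For a throwaway project, that workflow looks roughly like this (the package name is just an example; recent uv versions also scaffold a sample main.py on init):

```shell
mkdir demo && cd demo
uv init                # writes pyproject.toml and pins a Python version
uv add requests        # resolves, locks and installs into a managed .venv
uv run main.py         # runs inside that environment, no activation needed
```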
With most other solutions I have tried in the Python ecosystem, it always seemed significantly more brittle. It felt more like a collection of hacks than anything else.
I'm in the same boat. Sure it's nice and better, but I haven't felt so much annoyance with the python ecosystem that I desperately need something better. I use VS Code and it takes care of venv automatically, so I am biased by that.
As an aside, I can't praise the Wagtail CMS highly enough. It sets a high bar for usability and accessibility of the auto-generated content management UI.
The developer experience is top notch, with excellent documentation and many common concerns already handled by Wagtail or Django. A significant amount of Wagtail-specific code is declarative, essentially describing the data model, relationships, and UI fields. The parts that you don't need stay out of the way. It's also agnostic about the type of front-end you want, with full and automatic support for headless mode with a JavaScript client, traditional Django template SSR, or a dynamic approach like HTMX.
ty! We have no plans to rewrite Wagtail in Rust, but I hope there are ways in which we can make the developer experience better, particularly around dependency management.
PyCharm also added uv support in their latest versions.
We recently switched to PDM in our company because it worked very well in our tests of different package/dependency managers. Now I'm rethinking whether we should switch to uv, since PDM usage is still not very widespread in our company. But PDM works very well, so I'm not sure whether to keep using it.
With the caveat that I only have the package installer usage data for Wagtail downloads – pdm usage has fallen off a cliff, from 0.2% of downloads in January 2024 to 0.01% in January 2025. That roughly matches the uptake of uv.
Doesn’t make pdm bad in itself but that means there’ll be fewer pdm users around to report bugs, potentially fewer contributors to it too, fewer resources, etc.
Back when PDM was still pushing __pypackages__ for standardisation I think PDM made sense, but honestly I don't think it adds anything over uv and is just going to be slower for the most part.
As much as I am glad that it looks like one solution is being more and more accepted as the gold standard, I'm a little disappointed that PDM [0] -- which has been offering pretty much everything uv does for quite some time now -- has been completely overlooked. :(
- uv is aware of your dependencies: you can add/remove development dependencies, create groups of development dependencies (test, lint, dev, etc.), and add or remove those, and only those, at will. You can add dependencies and optional dependencies for a project as well, think my_app[cli,standard]. You don't need a different requirements.txt for each case, nor do you need to remove things by hand as with pip, which doesn't remove a package's dependencies when you uninstall it. As a result, you can remove {conda,poetry,...} from your workflows.
- uv can install python and a virtualenv for you. Any command you run with `uv run` from the root of a repo will be aware of its environment, you don't even need to activate a virtualenv anymore. This replaces {pyenv, pyenv-virtualenv, virtualenvwrapper,...}.
- uv follows the PEPs for project config (dependencies, optional dependencies, tool configs) in the pyproject.toml, so in case uv dies, it's possible to migrate away since the features are defined in the PEPs. Which is not the case for, say, poetry.
- uv has a lock file and it's possible to make deps platform specific (Windows, Linux, MacOS, etc). This is in compliance with a PEP but not supported by all tools.
- uv supports custom indexes for packages so you can prefer a certain index, for example your company package index or pytorch's own index (for ML work).
- very fast, which makes local dev seamless and is really helpful in CI/CD, where you might set up and tear down Python envs a lot.
Also, the team is responsive on Github so it's easy to get help.
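A sketch of the dependency-group workflow described above (the group and package names are examples, not from the original comment):

```shell
uv add fastapi              # regular project dependency
uv add --dev pytest         # the default "dev" dependency group
uv add --group lint ruff    # a custom named group
uv sync --group lint        # sync the project plus that group
uv remove fastapi           # also drops its now-unneeded transitive deps
```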
Not only is it faster, it also provides a lock file, `uvx tool_name` just like `npx`, and a comprehensive set of tools to manage your Python version, your venv and your project.
You don't need `pyenv`, `poetry` and `pipx` anymore, `uv` does all of that for you.
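For example, `uvx` runs a published tool in a cached throwaway environment, npx-style (the tool names here are just examples):

```shell
uvx ruff check .       # lint without adding ruff to your project
uvx pycowsay "hello"   # any PyPI package that exposes a CLI works
```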
It's a much more complete tool than pip. If you've used poetry, or (in other languages) cargo, bundler, maven, then it's like that (and faster than poetry).
If you haven't, in addition to installing dependencies it will manage and lock their versions (no requirements.txt, and much more robust), look after the environment (no venv step), hold your hand creating projects, and probably other things.
Edit to add: the one thing it won't do is replace conda et al, nor is it intended to.
The problems start as soon as your scripts should run on more than your own computer.
If you pip install something, you install it on the system python (the python binary located at sys.executable). This can break systems if the wrong combination of dependencies comes together. This is why you should never install things via pip for other people, unless you asked them first.
Now how else would you install them? There is a thing called virtual environments, which basically allows you to install pip dependencies in such a way that they are only there within the context of the virtual environment. This is what you should do when you distribute Python programs.
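The manual, stdlib-only version of what this describes, on Linux/macOS (the directory name is arbitrary):

```shell
# Create an isolated environment; its interpreter and site-packages
# live entirely under ./demo_env, leaving the system Python untouched.
python3 -m venv demo_env
demo_env/bin/python -c "import sys; print(sys.prefix)"
# Installing with this interpreter's pip touches only the venv:
#   demo_env/bin/pip install <package>
```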
Now the problem is: how do you ensure that this install into the virtual environment uses specific versions? What happens when one library depends on package A at version 1.0 and another library depends on package A at version 2.0? And what happens if you deploy that to an old Debian with an older Python version? Before uv I had to spend literal days resolving such conflicts.
uv solves most of these problems in one unified place, is extremely performant, just works, and when it doesn't, it tells you precisely why.
It brings way more to the table than just being fast, as people are commenting. E.g. it manages Python for your projects: if you say you want Python 3.12 in your project and then do `uv run python myscript.py`, it will fetch and run the version of Python you specified, which pip can't do. It also creates lock files, so you know the exact set of Python package dependencies that worked, while you specify them more loosely. Plus a bunch of other stuff.
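A sketch of that pinning flow (the project name is hypothetical):

```shell
uv init myproj --python 3.12   # records requires-python in pyproject.toml
cd myproj
uv run python --version        # fetches a managed CPython 3.12 if missing
uv lock                        # writes uv.lock with the exact resolved set
```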
The only advantage over pip is it's faster. But the downside is it's not written in Python.
The real point of uv is to be more than pip, though. It can manage projects, so basically CLI commands to edit your `pyproject.toml`, update a lockfile, and your venv all in one go. Unlike earlier tools it implements a pretty natural workflow on top of existing standards where possible, but for some things there are no standards, the most obvious being lockfiles. Earlier tools used "requirements.txt" for this which was quite lacking. uv's lockfile is cross-platform, although, admittedly does produce noisier diffs than requirements.txt, which is a shame.
I just taught a week long course, Advanced Python for Data Scientists. The first day we discussed how to use uv. The feedback was "this UV content is worth the price of the whole course".
Using uv is an easy sell to anyone who has worked with Python.
I feel for me, at least one nice thing about poetry over uv is, that if I have an issue or feature extension, I can just write my own plugin in pure Python. With uv, I'd need to learn Rust in addition to python/c/c++/etc.
I wonder what it would take to get poetry on par with uv for those who are already switching to it? Poetry is definitely very slow downloading multiple versions of packages to determine dependencies (not sure how uv works around this?). Does uv have a better dependency checker algorithm?
In this day and age you don't usually have to download the packages to resolve dependencies, as PyPI can usually expose the metadata (unless you need to install from an sdist, which is less common these days).
Dependency resolution is slow because it's computationally very expensive. Because uv is written in Rust, the resolution is just much, much faster. IIRC they use a Rust implementation of the PubGrub resolution algorithm.
I wonder what it would take to get poetry on par with uv for those who are already switching to it?
Poetry and uv have quite different philosophies. Poetry is incredibly opinionated about how you should do things, and trying to make poetry fit an existing project, or combining poetry with other tools, is quite likely to break. uv on the other hand is far more flexible and easier to fit into your current workflow. For me that was the main reason I gave up poetry, and in that respect poetry will probably never be 'on par' with uv, since these aren't technical differences but differences of philosophy.
Well, that seems like 100% what's going to happen (for the majority of Wagtail users at least) if the current trend continues. I'm not sure that's a good thing, to be frank. But we'll have to adjust regardless.
Guess people here don't talk much about cargo. I wouldn't be surprised to learn that cargo inspired uv. Rust with cargo showed for the first time that tooling _can_ be good, even for systems programming languages.
With good reason honestly. They take all the best practices from existing tooling we had, discard the bad, and make it run blazingly fast.
Ruff for me meant I could turn 4 pre-commit hooks (which you have to configure to be compatible with each other, too) into just 1, and I no longer dread the "run Pylint and take a coffee break" moment.
I jumped ship to uv recently. Though I was skeptical at first, I don't regret it. It makes dependency management less of a chore, just something I can quickly do now. Switching from Poetry was easy for me too; the only package I had issues with was pytorch, but that just required some different toml syntax.
I'm extremely satisfied with Pixi. It fixes almost all the issues I had with conda and mamba. It supports both conda and pypi (via uv) packages. I don't know if uv fixes pip's dependency management hell. I settled on conda packages because pip was such a mess.
I recently checked out UV, and it's impressively fast. However, one challenge that keeps coming up is handling anything related to CUDA and Torch.
Last week, I started developing directly in PyTorch containers using just pip and Docker. With GPU forwarding on Windows no longer being such a hassle, I'm really enjoying the setup. Still, I can’t shake the feeling that I might be overlooking something critical.
I’d love to hear what the HN crowd thinks about this type of env.
If the platform (OS) solution works for you, that's probably the easiest. It doesn't for me because I work on multiple Linux boxes with differing GPUs/CUDAs. So I've used the optional-dependencies solution, and it's mostly workable, but with an annoyance: uv sync forgets the --extras that have been applied in the venv, so if you "uv add" something it will uninstall the installed torch and install the wrong one until I re-run uv sync with the correct --extra again (uv add with --extra does something different). Honestly, I appreciate not having hidden venv state, but it is a bit grating.
There are some ways to setup machine/user specific overrides with machine and user uv.toml configuration files.
That feels like it might help, but I haven't figured out how to get it to pick/hint the correct torch flavor for each machine. Similar issues with paddlepaddle.
Honestly I just want an extras.lock at this point but that feels like too much of a hack for uv maintainers to support.
I have been pondering whether nesting uv projects might help so that I don't actually build venvs of the main code directly and the wrapper depends specifically on certain extras of the wrapped projects. But I haven't looked into this yet. I'll try that after giving up on uv.toml attempts.
Been using Python for 20 years and tried just about every tool related to packaging over the years. The only ones that worked well (IMO) were pip, pip-tools and venv. uv finally replaces all of them.
But being written in Rust means I'm having to also finally get somewhat proficient in Rust. Can any Rust developers comment on the quality of the uv codebase? I find it surprisingly procedural in style, with stuff like hundreds of `if dry_run` type things scattered throughout the codebase, for example. Is this normal Rust style?
I switched from Poetry to uv last year. I like the speed and how it stores virtual envs in a .venv directory inside the project by default, whereas Poetry stores them in a separate directory in your home directory by default, which makes it hard to work with tools that only discover virtual envs in the project root.
uv tool is also a great replacement for pipx.
I think it's the way to go for Python dependency management in 2025.
Poetry and uv both offer better dependency management and project isolation out of the box. If you work in a team, or on more than one Python project, then it's worth spending a day to install and adopt either one of these systems.
I don't understand the chart: does it say Wagtail suddenly had a lot more uv traffic, but pip and poetry did not drop much? What does that mean? Did a new batch of users emerge using uv? Did the behavior of a new uv version disrupt the chart?
I’ve been primarily a Python developer since 2012 and recently switched to uv. The ability to manage dependencies, venv, and multiple Python versions makes it best-in-class now. It really is a fantastic tool.
uv, ruff.... Astral doesn't miss. Excited to see what else they can bring to the Python world.
Had an issue running something on the latest Python version installed (3.13) but only needed to 'downgrade' to 3.11 to run that particular script. Just a simple:
`uv run --python 3.11 --with openai-whisper transcribe.py`
The biggest issue I have is not all the dependency hell that is Python with its unversioned libraries, but supply-chain attacks. Also the regressions introduced by new versions all the time.
That is why for projects I resolve everything by hand, add all coarsely audited 3rd party libraries to ./lib/, and the main entry file then does this:
#!/usr/bin/env -S /bin/sh -c "_top_dir=\"\$(dirname \"\$(realpath -s \"\$0\")\")\"; cd \"\$_top_dir\"; exec \"\$_top_dir/python/install/bin/python3\" -W once::DeprecationWarning -X dev \"\$0\" \"\$@\""
I like the excellent standalone CPython by indygreg, now under astral-sh's github organization. Unpack as is into ./python/ and done. Because Arch Linux would just roll forward to whatever version is latest, introducing new warnings every month or two and havoc on any new major version.
Project is fully portable, copy anywhere that has glibc, run.
Nah, other than uv it's just been poetry, pdm, and pipenv over the last decade, and uv is so dominant I don't think anyone else will try making another one for a while.
uv sounds great! For those still using Python v2, how well does it work? pip used to be a pain when having to manage both Python v2 and v3 projects and tools.
Unfortunately, I don't think many things nowadays are tested with a 15-year-old version of a language.
I was one of the last holdouts, preferring to keep 2.7 support if it wasn't too much hassle, but we have to move on at some point. Fifteen years is long enough support.
ffsm8|11 months ago
Python is going through package managers like JS goes through trends like classes-everywhere, hooks, signals, etc.
amelius|11 months ago
And what if there are no binaries yet for my architecture, will it compile them, including all the dependencies written in C?
datadeft|11 months ago
- uv init new-py-env
- cd new-py-env
- uv add jupyter
- uv build
These execute super fast. Not sure if this could help your situation, but it is worth being aware of these.
brylie|11 months ago
Kudos to the Wagtail team!
BerislavLopac|11 months ago
[0] https://pdm-project.org
porridgeraisin|11 months ago
https://pdm-project.org/en/latest/usage/uv/
TOMDM|11 months ago
I've been working with pip for so long now that I barely notice it unless something goes very wrong.
BiteCode_dev|11 months ago
The tl;dr is that it has a lot fewer modes of failure.
__mharrison__|11 months ago
Great work Charlie and team.
wiseowise|11 months ago
Different laws of physics, to start with.
zoobab|11 months ago
UV does not solve all the hard problems.
Maybe switch to Pixi?
fluidcruft|11 months ago
https://docs.astral.sh/uv/guides/integration/pytorch/
If the platform (OS) solution works for you, that's probably the easiest. It doesn't for me, because I work on multiple Linux boxes with differing GPUs/CUDAs. So I use the optional-dependencies solution, and it's mostly workable, but with an annoyance: uv sync forgets the --extra flags that were applied in the venv, so if you "uv add" something it will uninstall the installed torch and install the wrong one until I re-run uv sync with the correct --extra again. (uv add with --extra does something different.) And honestly I appreciate not having hidden venv state, but it is a bit grating.
There are some ways to set up machine/user-specific overrides with machine- and user-level uv.toml configuration files.
https://docs.astral.sh/uv/configuration/files/
That feels like it might help, but I haven't figured out how to get it to pick (or hint at) the correct torch flavor for each machine. Similar issues with PaddlePaddle.
Honestly, I just want an extras.lock at this point, but that feels like too much of a hack for the uv maintainers to support.
I have been pondering whether nesting uv projects might help, so that I don't build venvs of the main code directly and the wrapper depends on specific extras of the wrapped projects. But I haven't looked into this yet; I'll try it if the uv.toml attempts don't pan out.
jerrygenser|11 months ago
I also use it in docker to build the container.
mbeex|11 months ago
https://docs.astral.sh/uv/guides/integration/pytorch/#instal...
sireat|11 months ago
First impressions of uv are quite nice, but how does one change Python versions once a project is already set up?
I installed 3.13 with `uv python install python3.13`.
I see a bunch of Python versions now with `uv python list` (uv even found my old Anaconda 3.9 install from way back).
But how would I switch to 3.13?
An LLM hallucinated `uv venv use 3.13`, but that command does not exist.
I see from https://docs.astral.sh/uv/concepts/projects/config/#python-v... that one can specify the version in pyproject.toml, but shouldn't there also be a way to switch from the command line?
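(For anyone landing here later: as far as I can tell from the docs, pinning is the command-line answer; a sketch, assuming a reasonably recent uv:)

```shell
# Pin the project to 3.13 (writes a .python-version file next to pyproject.toml)
uv python pin 3.13

# Recreate the environment against the pinned interpreter
uv sync
```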
est|11 months ago
XorNot|11 months ago
Took a huge chunk of complexity out of bootstrapping projects which I was otherwise handling myself.
globular-toast|11 months ago
But uv being written in Rust means I'm also finally having to get somewhat proficient in Rust. Can any Rust developers comment on the quality of the uv codebase? I find it surprisingly procedural in style, with hundreds of `if dry_run`-type checks scattered throughout the codebase, for example. Is this normal Rust style?
brokegrammer|11 months ago
uv tool is also a great replacement for pipx.
I think it's the way to go for Python dependency management in 2025.
aosaigh|11 months ago
drexlspivey|11 months ago
OtherShrezzing|11 months ago
stavros|11 months ago
oguz-ismail|11 months ago
[deleted]
jimmydoe|11 months ago
zachwill|11 months ago
yu3zhou4|11 months ago
rob|11 months ago
Had an issue running something on the latest Python version installed (3.13) but only needed to 'downgrade' to 3.11 to run that particular script. Just a simple:
`uv run --python 3.11 --with openai-whisper transcribe.py`
And no need to mess with anything else.
jessekv|11 months ago
type checking
bsdice|11 months ago
That is why, for my projects, I resolve everything by hand, add all coarsely audited third-party libraries to ./lib/, and have the main entry file do this:
#!/usr/bin/env -S /bin/sh -c "_top_dir=\"\$(dirname \"\$(realpath -s \"\$0\")\")\"; cd \"\$_top_dir\"; exec \"\$_top_dir/python/install/bin/python3\" -W once::DeprecationWarning -X dev \"\$0\" \"\$@\""
import os
import sys
# Insert ./lib/ in front of search path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "lib"))
...
I like the excellent standalone CPython builds by indygreg, now under astral-sh's GitHub organization. Unpack as-is into ./python/ and you're done. I do this because Arch Linux would just roll forward to whatever version is latest, introducing new warnings every month or two and wreaking havoc on any major version bump.
Project is fully portable, copy anywhere that has glibc, run.
toenail|11 months ago
rubenvanwyk|11 months ago
user9999999999|11 months ago
nikisweeting|11 months ago
jessekv|11 months ago
calmoo|11 months ago
technopol|11 months ago
rglullis|11 months ago
stavros|11 months ago
I was one of the last holdouts, preferring to keep 2.7 support if it wasn't too much hassle, but we have to move on at some point. Fifteen years is long enough support.
Tewboo|11 months ago
bootsmann|11 months ago
unknown|11 months ago
[deleted]
karel-3d|11 months ago