My pet peeve (in general, not specific to UV, which I genuinely appreciate) is using comment sections for controlling code execution.
Using comments for linters and developer notes is perfectly acceptable. However, for configuration or execution-related data, a far superior pattern would be something like:
You’re using a magic constant that doesn’t do anything at runtime. It’s only there to be parsed by static analysis. In your case that’s uv doing the parsing, but another tool might delete it as unused code. Since it’s one thing pretending to be another, to me it’s in the same category as a magic comment.
Instead, why not make a call to uv telling it what to do?:
    import uv

    uv.exec(
        dependencies=["clown"],
        python=">=3.10",
    )

    from clown import nose
The first call can be with any old python runtime capable of locating this hypothetical uv package. The uv package sets up the venv and python runtime and re-exec(3)s with some kind of flag, say, an environment variable.
In the second runtime uv.exec is a noop because it detects the flag.
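A minimal sketch of that flag mechanism (everything here is hypothetical: uv ships no such Python API, and `uv_exec` / `UV_EXEC_ACTIVE` are made-up names for illustration):

```python
import os
import sys

_FLAG = "UV_EXEC_ACTIVE"  # hypothetical marker set before re-exec

def uv_exec(dependencies, python=None):
    # Second runtime: the flag is set, so this is a no-op and the
    # rest of the script runs inside the prepared environment.
    if os.environ.get(_FLAG) == "1":
        return
    # First runtime: a real implementation would create a venv,
    # resolve `dependencies`, pick an interpreter matching `python`,
    # then re-exec(3) the script with the flag set.
    env = dict(os.environ, **{_FLAG: "1"})
    os.execve(sys.executable, [sys.executable, *sys.argv], env)
```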
That’s fair. I agree with other replies though that parsing and evaluating imperative code is a lot tougher and less flexible than adhering to the principle of least power and making it declarative data.
It’s also worth noting that using comments is exactly how the shebang line works in the first place. It’s just so well-ingrained after 45 years that people don’t notice that it’s a shell comment.
> if you remove all comments from your code, it should still execute identically
It still -does- execute identically. Provided you install the same dependencies.
I don't see this as changing the semantics of the code itself, rather just changing the environment in which the code runs. In that respect it is no different from a `#!/bin/bash` comment at the top of a shell script.
I completely agree. Hope something like this is eventually standardized.
Problem is that uv probably does not want to execute anything to find out dependencies, so it would have to be a very restrictive subset of python syntax.
The fact that it is needed at all of course highlights a weakness in the language. The import statements themselves should be able to convey all the information about dependencies.
I like uv and all, but I take exception to the "self-contained" claim in two regards:
1) The script requires uv to already be installed. Arguably you could make it a shell script that checks if uv is already installed and then installs it via curlpipe if not... but that's quite a bit of extra boilerplate and the curlpipe pattern is already pretty gross on its own.
2) Auto-creating a venv somewhere in your home directory is not really self-contained. If you run the script as a one-off and then delete it, that venv is still there, taking up space. I can't find any assertion in the uv docs that these temporary virtual environments are ever automatically cleaned up.
Right, you need to have uv installed, and if you don't, you'll probably have to install it manually or through `curl | sh`.
I think this is a valid complaint.
Something to consider is that it will become less of an issue as package managers include uv in their repositories.
For example, uv is already available in Alpine Linux and Homebrew: https://repology.org/project/uv/versions.
I tried to hack together a shebang with docker+uv to solve this kind of problem, and it sort of does, since Docker is maybe more common than uv on a random dev machine (especially since tfa says it’s a gong project).
This works but doesn’t cache anything, so the download on every run is awkward. This can probably be fixed with a volume though?
You usually have to install something before you can run a program on your computer, so installing uv doesn't seem that bad to me. I still wouldn't call this self-contained because when you run the program it downloads who knows what from the internet!
To me, fully self-contained is something more like an AppImage
Agree 100%. Using something like py2exe creates a self-contained "python script". This comes with a lot of problems for the developer but minimal problems for the user.
Yup, and you can apply the same technique to any language. The obvious example is bash with all the dependencies specified, but I’ve also hacked up quick single file rust scripts using nix shebangs.
As mentioned in other comments, the "self-contained" claim depends on `uv` being installed.
For those who want a really self-contained Python script, I'd like to point out the Nuitka compiler [0]. I've been using it in production for my gRPC services with no issues whatsoever - just "nuitka --onefile run.py" and that's it. It Just Werks. And since it's a compiler, the resulting binary is even faster than the original Python program would be if it were bundled via Pyinstaller.
The author's GitHub page [1] contains the following text:
> Other than software development, my passion would be no other. It's my life mission to create the best Python Compiler I can possibly do or die trying, ... of old age.
I really like this pattern, but unfortunately I haven't been able to get it to work with my LSP (pyright, in Helix), even when running my editor via uv (`uv run hx script.py`).
I could always do `uv run --with whatever-it-is-i-need hx script.py`, but that's starting to get redundant.
This looks quite useful! Is uv a safer choice for deploying a Python-based project long term? I’m referring to the Anaconda rug pull: I was using it for managing dependencies about 5 years ago, but then they changed some rules so that any of my clients that are organizations with over 200 employees are no longer free to use Anaconda. They must pay for a commercial license.
They can always stop developing or fork to a different license, and all future work belongs under that license, but you can't backdate licenses, so what already exists is guaranteed Open Source. If you're super worried, you can create a fork and just keep it in sync.
But this is essentially true about any other OSS project so I wouldn't be concerned. As far as I'm aware, conda was never open sourced and had always distributed binaries.
I think anaconda's rug pull was on the repository (you can still use packages from conda-forge for free).
uv just uses pypi, so it would be just a question of changing from uv to pip, poetry or whatever, all packages would still be coming from the same place.
As I understand it, relicensing is possible when a project has a Contributor Licensing Agreement (CLA) which says that you're signing over your copyright to your contribution to the project's owners. (Who will eventually be bought out by the worst rich person you can think of - Yes, him.)
I peeked in uv's contributing guide and issues and didn't see any CLA. In PyTorch the CLA was mentioned at the top of the contributing guide.
Has anyone gotten this to work on Windows? I wanted to use this trick for some tooling for a game mod I'm working on but couldn't get the shebang trick to work.
I learned to love uv because of this use case, but I still find it against the Zen of Python that an official (and, dare I say, extremely useful!) PEP is not supported by the official Python tools.
This is the first time that Python didn't come with "batteries included" from my POV.
Now I also have two Python dependency managers in my system. I know there are volumes to talk about Python dependency management but all these years, as long as a project had a requirements.txt, I managed to stick to vanilla pip+venv.
That's been a bit of a trend for the Python build specs. Pretty sure pyproject.toml predates the tomllib library, so for a few versions you had to specify your module in a language that Python couldn't read natively.
Which is worse than just having a default way of including metadata that's not used. That's what makes it metadata, after all; otherwise it would just be Python syntax.
So how does this guarantee that it will never raise some libc error, or similar? Unfortunately I have become sceptical about "self contained" distribution methods.
This isn't self contained in that sense, it's deferring dependency management to runtime, with uv apparently doing that reliably enough for the use case.
Hey I have done the same for Swift scripts! (Well I have rewritten what Homebrew’s creator did some time ago, but does not maintain anymore, to be precise.)
When Python has something akin to Tcl's starkits, then it'll be cooking with gas -- I might even use it again. Py2exe came close, but was not cross-platform.
That code is bad for several reasons including not catching+handling exceptions (and possibly retrying), and accessing the JSON properties w/o get()
The overhead of re-installing stuff and setting up a header seems very unnecessary just to run a simple script.
If this is about sending some Python for somebody else to run easily - the recipient should always check the code. You should never run arbitrary code. For example, there have been hacks performed using YAML loader (crypto exchange).
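The YAML-loader incident is one instance of a general rule: deserializing untrusted input can run code. The same failure mode exists in the stdlib's pickle (shown here instead of PyYAML so the example stays dependency-free):

```python
import pickle

class Payload:
    # __reduce__ tells pickle how to rebuild the object; a malicious
    # document can put any callable and arguments here.
    def __reduce__(self):
        return (print, ("arbitrary code ran during deserialization",))

blob = pickle.dumps(Payload())
result = pickle.loads(blob)  # triggers print(...) as a side effect
```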
[+] [-] stared|1 year ago|reply
This approach has clear advantages:
- It's valid Python syntax.
- It utilizes standard, easily-parsable data structures rather than ad-hoc comment parsing. It makes creation and validation smooth.
- Crucially, it adheres to a core principle: if you remove all comments from your code, it should still execute identically.
[+] [-] gorgoiler|1 year ago|reply
[+] [-] sorenjan|1 year ago|reply
https://peps.python.org/pep-0723/
[+] [-] JimDabell|1 year ago|reply
[+] [-] wavemode|1 year ago|reply
[+] [-] petters|1 year ago|reply
[+] [-] Svoka|1 year ago|reply
[+] [-] bityard|1 year ago|reply
https://news.ycombinator.com/item?id=43500124
https://news.ycombinator.com/item?id=42463975
[+] [-] networked|1 year ago|reply
Another thing is that inline script metadata is a Python standard. When there is no uv on the system and uv isn't packaged but you have the right version of Python for the script, you can run the script with pipx: https://pipx.pypa.io/stable/examples/#pipx-run-examples. pipx is much more widely packaged: https://repology.org/project/pipx/versions.
[+] [-] photonthug|1 year ago|reply
Something like this: https://hugojosefson.github.io/docker-shebang/#python
[+] [-] krupan|1 year ago|reply
[+] [-] dazzawazza|1 year ago|reply
[+] [-] gcr|1 year ago|reply
[+] [-] benhurmarcel|1 year ago|reply
That’s a good point. I wonder if at least they are reused when you run the script several times.
[+] [-] samstave|1 year ago|reply
[+] [-] kissgyorgy|1 year ago|reply
[+] [-] skowalak|1 year ago|reply
[+] [-] falcor84|1 year ago|reply
Note that this is exactly the case in TFA - uv takes care of installing Python ad-hoc.
[+] [-] skavi|1 year ago|reply
https://nixos.wiki/wiki/Nix-shell_shebang
[+] [-] Tractor8626|1 year ago|reply
[+] [-] execat|1 year ago|reply
[+] [-] bheadmaster|1 year ago|reply
[0] https://nuitka.net/
[1] https://github.com/kayhayen
[+] [-] tiltowait|1 year ago|reply
[+] [-] mayli|1 year ago|reply
[+] [-] icameron|1 year ago|reply
[+] [-] godelski|1 year ago|reply
[0] https://github.com/astral-sh/uv?tab=readme-ov-file#license
[+] [-] scarlehoff|1 year ago|reply
[+] [-] 01HNNWZ0MV43FF|1 year ago|reply
Although, there should have been a community fork of the last FOSS version of Anaconda. That's what happened with Redis, and Redis uses a CLA: https://github.com/redis/redis/blob/unstable/CONTRIBUTING.md...
Don't ever sign a CLA, kids. Hell, only contribute to copyleft projects. We get paid too much to work for free.
[+] [-] holysoles|1 year ago|reply
[+] [-] mos_6502|1 year ago|reply
[1] https://bundler.io/guides/bundler_in_a_single_file_ruby_scri...
[+] [-] dharmab|1 year ago|reply
[+] [-] quickslowdown|1 year ago|reply
$> uv init --script <script_name>.py
$> uv add --script <script_name>.py <pkg1> <pkg2> ...
$> uv add --script <script_name>.py --dev <dev_pkg1> <dev_pkg2> ...
$> uv run <script_name>.py
Hope this helps :)
Source: https://docs.astral.sh/uv/guides/scripts/
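After those commands, uv writes a PEP 723 block into the top of the file, roughly like this (the version pin and package names here are placeholders, not uv's exact output):

```python
# /// script
# requires-python = ">=3.13"
# dependencies = [
#     "pkg1",
#     "pkg2",
# ]
# ///
```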
[+] [-] sorenjan|1 year ago|reply
This was covered in a blog post about this same topic that was posted here a few days ago. According to that you have to omit the -S: https://thisdavej.com/share-python-scripts-like-a-pro-uv-and...
https://news.ycombinator.com/item?id=43500124
I haven't tried it myself, I simply changed the file association so all .py files are opened with uv run as standard.
https://docs.python.org/3/using/windows.html#python-launcher...
https://peps.python.org/pep-0397/
[+] [-] seabrookmx|1 year ago|reply
[+] [-] _mlbt|1 year ago|reply
https://pyinstaller.org
[+] [-] gus_massa|1 year ago|reply
[+] [-] skeledrew|1 year ago|reply
[+] [-] yallpendantools|1 year ago|reply
[+] [-] shiandow|1 year ago|reply
[+] [-] egeres|1 year ago|reply
- Uv's killer feature is making ad-hoc environments easy (valatka.dev): https://news.ycombinator.com/item?id=42676432
- Using uv as your shebang line (akrabat.com): https://news.ycombinator.com/item?id=42855258
[+] [-] krupan|1 year ago|reply
I also wondered why virtual environments were invented for Python when general environment managers (like Modules) already existed.
These packaging and environment problems have never been specific to Python
https://0install.net/
https://modules.sourceforge.net/
[+] [-] amelius|1 year ago|reply
[+] [-] maxerickson|1 year ago|reply
[+] [-] unknown|1 year ago|reply
[+] [-] frizlab|1 year ago|reply
https://github.com/xcode-actions/swift-sh
[+] [-] northisup|1 year ago|reply
[+] [-] intellectronica|1 year ago|reply
[+] [-] bitwize|1 year ago|reply
[+] [-] redsky880|1 year ago|reply
For dependencies, use the standard pyproject.toml
[+] [-] Szpadel|1 year ago|reply
my approach is to use Python's built-in venv: https://gist.github.com/Szpadel/43794d606d9924e7fea3e63fb800
that way you can run scripts with external packages with only a basic Python installation
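The linked gist isn't reproduced here, but the general shape of that technique is a script that bootstraps its own venv with the stdlib and re-executes itself (a sketch under that assumption, not the gist's actual code; the dependency list and venv name are placeholders):

```python
import os
import subprocess
import sys
from pathlib import Path

DEPS = ["requests"]  # placeholder: whatever the script needs
HERE = Path(globals().get("__file__", sys.argv[0] or "script.py")).resolve().parent
VENV = HERE / ".script-venv"

def ensure_venv():
    # Interpreter path differs between POSIX and Windows layouts.
    py = VENV / ("Scripts" if os.name == "nt" else "bin") / "python"
    if sys.prefix != sys.base_prefix:
        return  # already running inside a venv
    if not py.exists():
        import venv
        venv.create(VENV, with_pip=True)  # stdlib-only bootstrap
        subprocess.check_call([str(py), "-m", "pip", "install", *DEPS])
    os.execv(str(py), [str(py), *sys.argv])  # re-run inside the venv

# In the real script this runs before any third-party imports:
# ensure_venv()
```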