Alir3z4|1 year ago
I create a venv, pip install, and keep my direct deps in requirements.txt.
That's it. I've never understood all the drama around Python dependency management.
Recently, I started using pyproject.toml as well which makes the whole thing more compact.
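For illustration, a minimal `pyproject.toml` along these lines covers the direct-deps role of a requirements.txt (project name and dependencies here are made up):

```toml
# Hypothetical minimal pyproject.toml; names and versions are examples only
[project]
name = "myapp"         # placeholder project name
version = "0.1.0"
dependencies = [
    "requests>=2.31",  # direct deps only, loosely pinned
]
```

Then `pip install -e .` inside the venv pulls in the listed dependencies.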
I make lots of Python packages too. Either I go with setup.py, or sometimes I use flit for no particular reason.
I've never felt the need for something like uv. I'm good with pip.
Borealid|1 year ago
To really pin everything you'd need to use something like asdf, on top of poetry or a manual virtualenv.
Otherwise you get your colleagues complaining that pip install failed with mysterious errors.
nyrikki|1 year ago
Even in huge monorepos you can use something like a Makefile to produce a local venv via a .PHONY target, and add it to the clean target too.
This is actually how I test old versions of Python, with versioned build targets, Cython vs ...
You can set up almost any IDE to activate them automatically too.
The way to get your coworkers to quit complaining is to automate the build env setup, not to fight dependency hell, which is a battle you will never win.
It really is one of the most expensive types of coupling.
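A minimal sketch of that Makefile approach (venv path and Python version are assumptions; recipe lines must be indented with tabs):

```make
# Hypothetical Makefile: builds a local venv on demand, removable via clean
VENV := .venv
PYTHON := python3.11        # swap per versioned build target as needed

$(VENV)/bin/activate: requirements.txt
	$(PYTHON) -m venv $(VENV)
	$(VENV)/bin/pip install -r requirements.txt

.PHONY: venv clean
venv: $(VENV)/bin/activate

clean:
	rm -rf $(VENV)
```

Because the venv target depends on requirements.txt, `make venv` rebuilds the environment only when the deps file changes.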
zahlman|1 year ago
No matter what tooling you have, that kind of test is really the only way to be sure anyway.
If something doesn't work, I can play around with dependency versions and/or do the appropriate research to figure out what's required for a given Python version, then give the necessary hints in my `pyproject.toml` (https://packaging.python.org/en/latest/specifications/pyproj...) as environment markers on my dependency strings (https://peps.python.org/pep-0508/#environment-markers).
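As a sketch of those environment markers (the second package and its pin are placeholders), dependency strings in `pyproject.toml` might read:

```toml
# Hypothetical dependency list using PEP 508 environment markers
[project]
dependencies = [
    "tomli>=1.1; python_version < '3.11'",   # stdlib tomllib exists from 3.11
    "somepkg<2.0; sys_platform == 'win32'",  # placeholder: Windows-only pin
]
```

The installer evaluates each marker against the target environment and skips dependencies whose markers don't match.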
"Mysterious errors" in this area are usually only mysterious to end users.
marky1991|1 year ago
(Of course, I don't distribute most of my projects, so I just dump them all in the global install and don't worry about it)
thingification|1 year ago
People end up committing either the one or the other, not both. But you need both:
- You need the source code, else your project is hard to update ("why did they pick these versions exactly?" - the answer is the source code).
- You need the compiled pinned versions in the lock file, else if dependencies are complicated or fast-moving or a project goes unmaintained, installing it becomes a huge mindless boring timesink (hello machine learning, all three counts).
Whenever I see people complaining about Python dependencies, most of the time it seems that somebody lacked this concept, didn't know how to do it with Python, or was put off by too many choices. That, plus that ML projects move quickly and may have heavy "system" dependencies (CUDA).
thingification|1 year ago
In the source code - e.g. requirements.in (in the case of pip-tools or uv's clone of that: uv pip compile + uv pip sync), one lists the names of the projects one's application depends on, with a few version constraints explained with comments (`someproject <= 5.3 # right now spamalyzer doesn't seem to work with 5.4`).
In the compiled output - i.e. the lock files (pip-tools and uv pip compile/sync use requirements.txt for this) - one makes sure every version is pinned to one specific version, forming a set of versions that work together. A tool (like uv pip compile) generates the lock files from the source, picking versions that are declared (in PyPI metadata) to work together.
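Concretely (package names, versions, and the "via" annotations are invented for illustration, in the style of pip-compile output):

```
# requirements.in -- the human-edited source
requests
someproject <= 5.3  # right now spamalyzer doesn't seem to work with 5.4

# requirements.txt -- generated by `uv pip compile` (or pip-compile):
# every version pinned, transitive deps included and attributed
certifi==2024.2.2
    # via requests
requests==2.31.0
someproject==5.3
```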
My advice: pip-tools (pip-compile + pip-sync) does this very nicely - even better, uv's clone of pip-tools (uv pip compile + uv pip sync), which runs faster. Goes nicely with:
- pyproject.toml (project config / metadata)
- plain old setuptools (works fine, doesn't change: great)
- requirements.in: the source for pip-tools (that's all pip-tools does: great! uv has a faster clone)
- pyenv to install python versions for you (that's all it does: great! again uv has a faster clone)
- virtualenv to make separate sandboxed sets of installed python libraries (that's all it does: great! again uv has a faster clone)
- maybe a few tiny bash scripts, maybe a Makefile or similar just as a way to list out some canned commands
- actually write down the commands you run in your README
PS: the point of `uv pip sync` over `uv pip install -r requirements.txt` is that the former will uninstall packages that aren't explicitly listed in requirements.txt.
uv also has a poetry-like do-everything 'managed' everything-is-glued-together framework (OK you can see my bias). Personally I don't understand the benefits of that over its nice re-implementations of existing unix-y tools, except I guess for popularizing python lockfiles - but can't we just market the idea "lock your versions"? The idea is the good part!
mrbungie|1 year ago
Never had a problem making reproducible builds doing so.
Alir3z4|1 year ago
TBH, I've seen tutorials, and even some companies, simply do `pip freeze > requirements.txt` :shrug: which is a mess.
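For contrast (output invented for illustration), `pip freeze` dumps every installed package, direct and transitive alike, with no record of which is which:

```
# pip freeze > requirements.txt -- illustrative output
certifi==2024.2.2           # transitive (via requests)
charset-normalizer==3.3.2   # transitive
idna==3.6                   # transitive
requests==2.31.0            # the only dep actually asked for
urllib3==2.2.1              # transitive
```

A curated requirements.txt would list just `requests` with whatever constraint you actually care about.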
Alir3z4|1 year ago
`pyproject.toml` lets you set the minimum Python version. If it's not met, the package won't install.
Regardless, the majority of the time, deployment is done via Docker.
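A sketch of that gate (the version floor here is just an example):

```toml
# In pyproject.toml: installers refuse to install on older interpreters
[project]
requires-python = ">=3.9"
```

pip will also skip releases whose `requires-python` excludes the running interpreter when resolving versions.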