
Alir3z4 | 1 year ago

Almost two decades of working with Python.

I create a venv, pip install, and keep my direct deps in requirements.txt.

That's it. Never understood all this Python dependency management drama.

Recently, I started using pyproject.toml as well which makes the whole thing more compact.
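For instance, a minimal pyproject.toml carrying the same direct deps might look like this (the project name and pins are made up):

```toml
[project]
name = "myapp"            # hypothetical project name
version = "0.1.0"
dependencies = [
    "requests>=2.31",     # direct deps only, loosely constrained
    "click",
]
```

With this in place, `pip install .` resolves and installs the listed deps; if no `[build-system]` table is given, pip falls back to setuptools as the build backend.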

I make lots of Python packages too. Either I go with setup.py, or sometimes I use flit for no specific reason.

I haven't ever felt the need for something like uv. I'm good with pip.

Borealid|1 year ago

That doesn't work well (enough) if you have one project that requires Python <3.10 and another that requires Python >=3.10.

To really pin everything you'd need to use something like asdf, on top of poetry or a manual virtualenv.

Otherwise you get your colleagues complaining that pip install failed with mysterious errors.

nyrikki|1 year ago

venvs are namespace isolation; they are like containers.

Even in huge monorepos you can just use something like a Makefile to produce a local venv via a .PHONY target, and add it to the clean target too.
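A sketch of that Makefile pattern (target and file names are made up; the commenter's exact setup may differ):

```make
# Hypothetical sketch: the venv as a build artifact of the repo.
.PHONY: clean

.venv: requirements.txt
	python3 -m venv .venv
	.venv/bin/pip install -r requirements.txt
	touch .venv   # refresh mtime so make knows it's up to date

clean:
	rm -rf .venv
```

Here `.venv` is a real directory target that rebuilds whenever requirements.txt changes; versioned variants (e.g. a `.venv-3.9` target) are one way to test older interpreters.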

This is how I actually test old versions of Python, with versioned build targets, Cython vs ...

You can set up almost any IDE to activate them automatically too.

The way to get your coworkers to quit complaining is to automate the build env setup, not to fight dependency hell, which is a battle you will never win.

It really is one of the most expensive types of coupling.

alkh|1 year ago

I'd recommend installing pyenv [1]. It was very useful when my team had to update a lot of projects from <=3.10 to 3.11.

[1] https://github.com/pyenv/pyenv

zahlman|1 year ago

I have multiple versions of Python built from source. If I want to test what my code will do on a given version, I spin up a new venv (near instantaneous using `--without-pip`, which I achieve via a small Bash wrapper) and try installing it (using the `--python` option to Pip, through another wrapper, allowing me to reuse a single global copy).
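The fast-venv trick can be sketched like this (the directory name is made up; on Windows the `bin/` paths differ):

```shell
# "--without-pip" skips bootstrapping pip into the venv, which is what
# makes creation near-instantaneous.
python3 -m venv --without-pip .venv-demo

# A single, globally installed pip (22.3+) can then install into it
# via its "--python" option:
# pip --python .venv-demo/bin/python install somepackage   # needs network
```

The resulting venv has its own interpreter but no per-venv copy of pip, so there is nothing to bootstrap and nothing to keep upgraded in each environment.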

No matter what tooling you have, that kind of test is really the only way to be sure anyway.

If something doesn't work, I can play around with dependency versions and/or do the appropriate research to figure out what's required for a given Python version, then give the necessary hints in my `pyproject.toml` (https://packaging.python.org/en/latest/specifications/pyproj...) as environment markers on my dependency strings (https://peps.python.org/pep-0508/#environment-markers).
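For example, environment markers on dependency strings look like this (the tomli/tomllib split is a real case; "somelib" is made up):

```toml
[project]
dependencies = [
    # tomllib entered the stdlib in 3.11, so only pull the backport before that:
    "tomli>=2.0; python_version < '3.11'",
    # hypothetical per-interpreter version hints, as described above:
    "somelib>=2.0; python_version >= '3.10'",
    "somelib<2.0; python_version < '3.10'",
]
```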

"Mysterious errors" in this area are usually only mysterious to end users.

marky1991|1 year ago

I don't get it; you ought to be building a different venv per project anyway.

(Of course, I don't distribute most of my projects, so I just dump them all in the global install and don't worry about it)

merb|1 year ago

pyproject.toml solves this nowadays

rcxdude|1 year ago

This is more or less my experience, but I think in part it took a while for pip to actually get into a usable position, hence some of the proliferation of other options.

thingification|1 year ago

That might be fine in your context. People's problems are real, though. What they're almost always missing is separating the source code from the compiled output ("lock files"). Pick a tool to help with that, commit both files to your ("one's") project, problem solved.

People end up committing either one or the other, not both, but:

- You need the source code, else your project is hard to update ("why did they pick these versions exactly?" - the answer is the source code).

- You need the compiled pinned versions in the lock file, else if dependencies are complicated or fast-moving or a project goes unmaintained, installing it becomes a huge mindless boring timesink (hello machine learning, all three counts).

Whenever I see people complaining about Python dependencies, most of the time it seems that somebody lacked this concept, didn't know how to do it with Python, or was put off by too many choices. That, plus that ML projects are moving quickly and may have heavy "system" dependencies (CUDA).

thingification|1 year ago

To be more concrete:

In the source code - e.g. requirements.in (in the case of pip-tools or uv's clone of that: uv pip compile + uv pip sync), one lists the names of the projects one's application depends on, with a few version constraints explained with comments (`someproject <= 5.3 # right now spamalyzer doesn't seem to work with 5.4`).

In the compiled output - i.e. the lock files (pip-tools and uv pip sync/compile use requirements.txt for this) - one makes sure every version is pinned to one specific version, to form a set of versions that work together. A tool (like uv pip compile) will generate the lock files from the source code, picking versions that are declared (in PyPI metadata) to work together.
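Concretely, the source/compiled split might look like this (package names and versions are illustrative, reusing the thread's made-up examples):

```text
# requirements.in -- the source: direct deps only, loose constraints, reasons
someproject <= 5.3   # right now spamalyzer doesn't seem to work with 5.4
requests

# requirements.txt -- the compiled lock file, generated with
# `uv pip compile requirements.in -o requirements.txt` (or pip-compile).
# Every version, including transitive deps, is pinned:
#
#   someproject==5.3
#   requests==2.31.0
#   certifi==2024.2.2    # pulled in by requests
#   ...
```

Both files get committed; updating a pin means editing the .in file and recompiling, so the "why these versions?" answer is always recorded.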

My advice: pip-tools (pip-compile + pip-sync) does this very nicely - even better, uv's clone of pip-tools (uv pip compile + uv pip sync), which runs faster. Goes nicely with:

- pyproject.toml (project config / metadata)

- plain old setuptools (works fine, doesn't change: great)

- requirements.in: the source for pip-tools (that's all pip-tools does: great! uv has a faster clone)

- pyenv to install python versions for you (that's all it does: great! again uv has a faster clone)

- virtualenv to make separate sandboxed sets of installed python libraries (that's all it does: great! again uv has a faster clone)

- maybe a few tiny bash scripts, maybe a Makefile or similar just as a way to list out some canned commands

- actually write down the commands you run in your README

PS: the point of `uv pip sync` over `uv pip install -r requirements.txt` is that the former will uninstall packages that aren't explicitly listed in requirements.txt.

uv also has a poetry-like do-everything 'managed' everything-is-glued-together framework (OK you can see my bias). Personally I don't understand the benefits of that over its nice re-implementations of existing unix-y tools, except I guess for popularizing python lockfiles - but can't we just market the idea "lock your versions"? The idea is the good part!

gkhartman|1 year ago

That's been my experience too. The main complaint I hear about this workflow is that venvs can't be moved without breaking. I just rebuild my venv in each new location, but that rebuild time can add up for projects with many large scientific packages. uv solved that pain point for me, since it provides a "pip install" implementation that runs in a fraction of the time.

mrbungie|1 year ago

This. Also, always pin the versions of your requirements, and that's it.

Never had a problem making reproducible builds doing so.

Fethbita|1 year ago

I had issues with exactly this method. One of my dependencies moved to a paid model, so my project no longer worked.

Alir3z4|1 year ago

Yeah, I assume pinning the version is something everyone does? Or probably many just don't, and that's where the "Python dependency management is a mess" drama comes from.

TBH, I've seen tutorials, and even some companies, simply do `pip freeze > requirements.txt` :shrug: which is a mess.

atoav|1 year ago

Then you deploy to an old Debian and everything falls apart.

Alir3z4|1 year ago

Not really.

`pyproject.toml` lets you set the minimum Python version. If it's not met, the package won't install.

Regardless, the majority of the time, deployment is done via Docker.
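That version gate is one line in pyproject.toml (the project name is hypothetical):

```toml
[project]
name = "myapp"               # hypothetical
version = "0.1.0"
requires-python = ">=3.10"   # installers refuse to install on older interpreters
```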