top | item 43097744

thisgoodlife | 1 year ago

I create a .venv directory for each project (even for test projects named pytest or djangotest), and each project has its own requirements file. Personally, Python packaging has never been a problem.
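A minimal sketch of that per-project workflow (the project name is a placeholder, and the requirements file here starts out empty for illustration):

```shell
# One isolated environment per project, living inside the project directory.
mkdir -p myproject && cd myproject
touch requirements.txt            # normally this already lists your dependencies

python3 -m venv .venv             # create the project's own environment
. .venv/bin/activate              # on Windows: .venv\Scripts\activate

pip install -r requirements.txt   # installs into this venv only
pip freeze > requirements.txt     # pin exactly what is installed right now

deactivate
```

Because the environment and the pin file live next to the code, deleting `.venv` and recreating it from `requirements.txt` reproduces the same setup.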

lmm | 1 year ago

What do you do when you accidentally run pip install -r requirements.txt with the wrong .venv activated?

If your answer is "delete the venv and recreate it", what do you do when your code now has a bunch of errors it didn't have before?

If your answer is "ignore it", what do you do when you try to run the project on a new system and find half the imports are missing?

None of these problems are insurmountable of course. But they're niggling irritations. And of course they become a lot harder when you try to work with someone else's project, or come back to a project from a couple of years ago and find it doesn't work.

zahlman | 1 year ago

>What do you do when you accidentally run pip install -r requirements.txt with the wrong .venv activated?

As someone with a similar approach (not using requirements.txt, but using all the basic tools and not using any kind of workflow tool or sophisticated package manager), I don't understand the question. I just have a workflow where this isn't feasible.

Why would the wrong venv be activated?

I activate a venv according to the project I'm currently working on. If the venv for my current code isn't active, it's because nothing is active. And I use my one global Pip through a wrapper, which (politely and tersely) bonks me if I don't have a virtual environment active. (Other users could rely on the distro bonking them, assuming Python>=3.11. But my global Pip is actually the Pipx-vendored one, so I protect myself from installing into its environment.)
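The commenter's actual wrapper isn't shown, but a hypothetical sketch of one that "bonks" you outside a venv could be a shell function that shadows pip:

```shell
# Hypothetical pip wrapper: refuses to run unless a virtual environment is
# active. Define it in your shell rc file so it shadows the real pip.
pip() {
    if [ -z "${VIRTUAL_ENV:-}" ]; then
        echo "pip: no virtual environment active; refusing to touch the global install" >&2
        return 1
    fi
    command pip "$@"   # a venv is active, so pass through to the real pip
}
```

With this in place, `pip install -r requirements.txt` can only ever target whichever venv is currently activated, never the system Python.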

You might as well be asking Poetry or uv users: "what do you do when you 'accidentally' manually copy another project's pyproject.toml over the current one and then try to update?" I'm pretty sure they won't be able to protect you from that.

>If your answer is "delete the venv and recreate it", what do you do when your code now has a bunch of errors it didn't have before?

If it did somehow happen, that would be the approach - but the code simply wouldn't have those errors. Because that venv has its own up-to-date listing of requirements; so when I recreated the venv, it would naturally just contain what it needs to. If the listing were somehow out of date, I would have to fix that anyway, and this would be a prompt to do so. Do tools like Poetry and uv scan my source code and somehow figure out what dependencies (and versions) I need? If not, I'm not any further behind here.

>And of course they become a lot harder when you try to work with someone else's project, or come back to a project from a couple of years ago and find it doesn't work.

I spent this morning exploring ways to install Pip 0.2 in a Python 2.7 virtual environment, "cleanly" (i.e. without directly editing/moving/copying stuff) starting from scratch with system Python 3.12. (It can't be done directly, for a variety of reasons; the simplest approach is to let a specific version of `virtualenv` make the environment with an "up-to-date" 20.3.4 Pip bootstrap, and then have that Pip downgrade itself.)
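The described procedure might look roughly like this (hedged: it assumes `python2.7` is on PATH and that a pre-20.22 `virtualenv` still seeds Python 2 environments with the 20.3.4 pip mentioned above; the step is skipped entirely when Python 2.7 is absent):

```shell
# Sketch of the described bootstrap; only attempted if python2.7 exists.
if command -v python2.7 >/dev/null 2>&1; then
    # virtualenv releases before 20.22 can still create Python 2.7
    # environments, seeded with a pip 20.3.4 bootstrap.
    python3 -m pip install --user 'virtualenv<20.22'
    python3 -m virtualenv -p python2.7 py27env
    # Then that pip downgrades itself to the ancient release.
    py27env/bin/pip install 'pip==0.2'
else
    echo "python2.7 not available; skipping" >&2
fi
```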

I can deal with someone else's (or past me's) requirements.txt being a little wonky.

NewJazz | 1 year ago

uv basically does that + python version handling + conveniences like auto-activating venv and installing dependencies
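For comparison, the rough uv equivalents of the venv+pip workflow (assuming uv is installed; the sketch is a no-op otherwise, and the commented commands assume a project with a requirements file):

```shell
# Rough uv equivalents of the manual venv+pip workflow.
if command -v uv >/dev/null 2>&1; then
    uv venv .venv                   # create the project venv; uv can also
                                    # download the requested Python version
    # uv pip install -r requirements.txt   # resolve and install dependencies
    # uv run pytest                        # run a command inside the venv
else
    echo "uv not installed; skipping" >&2
fi
```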

kyawzazaw | 1 year ago

It was a massive problem at our company's hackathon. Just so many hours wasted.

andrewflnr | 1 year ago

Yeah, this is where I've been for a while. Maybe it helps that I don't do any ML work with lots of C or Fortran libraries that depend on exact versions of Python or whatever. But for just writing an application in Python, venv and pip are fine. I'll probably still try uv eventually if everyone really decides they're adopting it, but I won't rush.