
lazka | 3 years ago

The next Debian/Ubuntu releases will no longer allow `pip install` outside of a venv: https://discuss.python.org/t/pep-668-marking-python-base-env...

You can still force it via `pip install --break-system-packages ...` if needed.
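PEP 668 works by having the distro drop an `EXTERNALLY-MANAGED` marker file into the interpreter's stdlib directory; pip refuses to install when it sees that file unless you pass `--break-system-packages`. A minimal sketch to check whether your interpreter is marked:

```python
import os
import sysconfig

# PEP 668: a distro marks an environment as externally managed by placing
# an EXTERNALLY-MANAGED file in the stdlib sysconfig path. pip checks for
# this file and refuses system-wide installs when it exists.
marker = os.path.join(sysconfig.get_path("stdlib"), "EXTERNALLY-MANAGED")
print("externally managed:", os.path.exists(marker))
```

On the Debian/Ubuntu releases discussed here this prints `True` outside a venv; inside a venv the stdlib path belongs to the venv's base layout and the check doesn't block installs.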

discuss

order

unxdfa|3 years ago

This makes me so happy. Back when we had Jenkins slaves, one of our devops guys set up a pipeline that pip installed different versions over the top of system packages, causing weird intermittent failures everywhere. Different pipelines would install from different requirements files. I revoked sudo privileges for Jenkins immediately (I didn't add them in the first place) and reprovisioned the whole build cluster, after which pipelines failed consistently exactly where they should have been failing: when trying to do stupid stuff.

Personally I only ever use the system python packages on Linux if I can get away with it. Saves a whole world of problems.

KronisLV|3 years ago

> Back when we had Jenkins slaves, one of our devops guys set a pipeline up that pip installed different versions over the top of system packages causing weird intermittent failures everywhere.

Not everyone likes containers, but using them for CI seems like a good way to avoid situations like this, at least when viable (e.g. web development). You get to choose which container images your build needs, do whatever is necessary inside them, and they're essentially thrown away once you're done, cache aside. They also have no significant impact on, or dependency from, the system configuration, as long as you have some sort of supported container runtime.

hnfong|3 years ago

> I revoked sudo privs immediately for Jenkins (I didn't add them in the first place)

If you allowed sudo in your Jenkins jobs, you're morally barred from blaming Python for screwing up the system.

wheelerof4te|3 years ago

The way it's meant to be.

On Linux, you either use the system packages via "apt install", or you use venvs.

EDIT: For context, I meant "managed" distros like Debian and Ubuntu.

qbasic_forever|3 years ago

Nowhere in the official Python documentation (where 99% of new Python users are going to go) does it warn about, or even mention, Debian-specific issues like only using apt-packaged versions of dependencies. It wasn't until recent years that pip even hinted that something might break in those setups. The situation with Python on Debian has been pretty bad IMHO, with a cloistered group of people insisting the status quo is just fine because it works for them.

Groxx|3 years ago

This is fantastic to hear. Hopefully this will be the beginning of a wave of other OSes doing the same.

For anyone on other systems who wants this kind of protection right now, pip has had this available for a few years at least:

    pip config set global.require-virtualenv True
I absolutely recommend doing it. Immediately.
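With `require-virtualenv` set, pip bails out unless it detects a virtual environment. The detection boils down to comparing `sys.prefix` with `sys.base_prefix` — a sketch of that same check, usable in your own scripts:

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv, sys.prefix points at the venv while sys.base_prefix
    # still points at the base interpreter; outside one, they are equal.
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

if not in_virtualenv():
    print("refusing to install outside a virtual environment")
```

(This is a simplified sketch; pip's real check also handles the legacy `virtualenv` tool's `sys.real_prefix` convention.)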

ziml77|3 years ago

Mixing pip with another package manager has always seemed weird to me. You're just asking for things to conflict and break.

I noticed with Homebrew that there was no way to untangle packages installed through pip from ones installed through Homebrew. After dealing with that mess once, I now make sure to use `pip install --user`. Things can still break, but if they do, it's at least easy to nuke the packages installed to my home directory.
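`pip install --user` puts everything under the per-user site-packages directory, which the stdlib can report — so "nuking" those packages means deleting exactly one directory, without touching system or Homebrew files. A quick way to find it:

```python
import site

# Packages installed with `pip install --user` land in the per-user
# site directory, e.g. ~/.local/lib/pythonX.Y/site-packages on Linux.
# Deleting this directory removes them and nothing else.
user_site = site.getusersitepackages()
print(user_site)
```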

TheRealPomax|3 years ago

Good. Now we just need to get pip itself updated so it refuses to run outside of a venv, and refuses to run unless invoked with "python -m pip" and we'll finally have something at least half decent.

And don't even get me started on how much better npm is at publishing packages, versus pip's refusal to add the same user-friendliness.

meitham|3 years ago

Hopefully that’s not going to be the case inside a container!

forgotpwd16|3 years ago

Why? You can override it, and even if you couldn't, making a new venv is just a `python -m venv venv` away.
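That one-liner is also available programmatically: the stdlib `venv` module is what `python -m venv` drives. A sketch that creates a throwaway environment in a temp directory:

```python
import tempfile
import venv
from pathlib import Path

# venv.create() is the programmatic equivalent of `python -m venv venv`.
target = Path(tempfile.mkdtemp()) / "venv"
venv.create(target, with_pip=False)  # with_pip=False skips bootstrapping pip
print((target / "pyvenv.cfg").exists())
```

`with_pip=True` would also bootstrap pip into the new environment, at the cost of a slower create.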