fer | 4 years ago

This is my decision flow and I rarely have an issue:

    Are you the end user of the Python code?
        Yes -> Is it available in your distro?
            Yes -> Use the package manager
            No -> Use pip install in user mode
        No -> Create a virtualenv with the Python version you want (including PyPy!) and do your pip thing there
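For the last branch, a minimal sketch using only the standard library's `venv` module (the env directory name is made up for the example):

```python
import os
import subprocess
import sys
import tempfile

# Hypothetical location for the project's env; pick whatever layout you like.
envdir = os.path.join(tempfile.mkdtemp(), "myproject-env")

# Equivalent to running `python -m venv <dir>` on the command line.
subprocess.run([sys.executable, "-m", "venv", envdir], check=True)

# The venv has its own interpreter and pip; "do your pip thing" through them,
# e.g. subprocess.run([venv_python, "-m", "pip", "install", "somepackage"]).
bindir = "Scripts" if os.name == "nt" else "bin"
venv_python = os.path.join(envdir, bindir, "python")
print(os.path.exists(venv_python) or os.path.exists(venv_python + ".exe"))  # → True
```

To use a different interpreter (PyPy, another CPython version), run that interpreter's own `-m venv` instead of `sys.executable`.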

Some extreme use cases may benefit from Anaconda, but personally I've never needed it. My only pain point is dealing with legacy code that relies on PYTHONPATH. Nothing good ever comes from setting PYTHONPATH.

nerdponx | 4 years ago

> Create virtualenv with the Python version you want (including pypy!) and do your pip thing there

You might benefit from using Pipx in this case: https://pypa.github.io/pipx/

Pipx is good for the case of "I want to run a standalone Python application that is available through Pip, but not my system's package repo." This is a more common case than you might think.

It's a sensible alternative to `pip install --user`, and having self-contained deps for each tool is a bit like `npm install --global` or even `volta install`.
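Under the hood, the pipx idea roughly amounts to "one venv per app, with only the app's entry point exposed on your PATH." A hedged sketch of that idea using the standard library (the directories and tool name here are stand-ins, not pipx's actual layout or code):

```python
import os
import subprocess
import sys
import tempfile

# Stand-ins for pipx's real directories (~/.local/pipx/venvs, ~/.local/bin).
pipx_home = tempfile.mkdtemp()
venv = os.path.join(pipx_home, "venvs", "sometool")  # hypothetical app name

# Step 1: a dedicated venv, so the app's deps can't conflict with anything else.
subprocess.run([sys.executable, "-m", "venv", venv], check=True)

# Step 2 (sketched only, needs network): install the app into *its own* venv:
#   subprocess.run([os.path.join(venv, "bin", "pip"), "install", "sometool"])
# Step 3 (sketched only): link just the entry-point script, not the libraries:
#   os.symlink(os.path.join(venv, "bin", "sometool"),
#              os.path.expanduser("~/.local/bin/sometool"))
```

The payoff is that each tool's dependency tree is invisible to every other tool, which is exactly what `pip install --user` can't give you.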

WhyNotHugo | 4 years ago

Yup, this kinda works.

It doesn't address the greater issue, though: it's getting harder and harder for distributions to package things right and provide packages for their users (evidenced by the fact that you need a second package manager just for Python stuff).

WhyNotHugo | 4 years ago

> Use pip install in user mode

This is a great recipe for disaster. Whatever you install in user mode will shadow anything installed system-wide, so when you try to run some system-wide project, it may now fail. I'm also not a fan of how it drops scripts into `~/.local/bin`, since that's where I keep my own scripts, and that directory is version controlled.
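The shadowing comes from import precedence: the user site-packages directory sits ahead of the system site-packages on `sys.path`, so a `--user` copy of a package wins over the distro's copy. You can inspect the relevant paths without installing anything:

```python
import site

# Where `pip install --user` puts packages. This directory is searched
# before the distro's system-wide site-packages, which is why a
# user-installed package shadows the system one.
print(site.getusersitepackages())

# Whether user site-packages is enabled for this interpreter at all
# (it's disabled inside virtualenvs, which is part of their appeal).
print(site.ENABLE_USER_SITE)
```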

The installation will also be frozen and never get updated -- unless you remember to do it manually.

Finally, and worst of all, this leaves you in a dead end if your packages have conflicting dependencies, which is too often the case in Python-land.

pfranz | 4 years ago

So you're suggesting always using virtualenv?

I used to just use pip to install to the system. Months or years later I would try to untangle the mess: packages I was just playing with, what the OS wanted or needed, the conflicting dependencies you mention, etc. I usually ended up reinstalling the OS. At the time I may not have been as knowledgeable about where the OS package manager keeps packages vs. where pip does, but the whole thing wasn't very user-friendly either.

For years I've been installing into the user site, knowing I can just blow it away. I've dabbled with virtualenv, but it's such a pain to set up and activate. If I have a few projects with similar libraries, it's more of a pain to set them all up and switch around. If I end up using a script for something important, I just spend the extra time at that point to "package" it.

isolli | 4 years ago

Hm, why? I'm a happy user of PYTHONPATH!

doubleunplussed | 4 years ago

It's completely global, shared by all Python interpreters of all versions.

I set PYTHONPATH, but the code in that directory is solely small debugging utils of mine that I want available in every Python interpreter, and I make sure not to put anything more complex in there.
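That global behavior is easy to demonstrate: every directory in PYTHONPATH is inserted near the front of `sys.path` in every interpreter launched with that environment, whether or not the directory even exists (the directory name below is made up):

```python
import os
import subprocess
import sys

# Hypothetical debug-utils directory; it doesn't need to exist to show up.
env = dict(os.environ, PYTHONPATH="/tmp/my_debug_utils")

# Any child interpreter started with this environment picks up the entry.
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.path)"],
    env=env, capture_output=True, text=True,
).stdout

print("/tmp/my_debug_utils" in out)  # → True
```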

Uberphallus | 4 years ago

The only justified situation I can find is when you are working on two (or more) independent components at the same time.

My pain point in particular with PYTHONPATH (or playing with sys.path) is that people tend to use it with the only purpose of making import lines shorter, which brings naming collisions of all sorts when you aren't creative enough.
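The collision fails silently, which is what makes it nasty. Once two projects both expose, say, a top-level `utils` (project names and files below are fabricated for the demo), whichever directory comes first on `sys.path` wins:

```python
import os
import sys
import tempfile

# Two hypothetical checkouts that each ship a top-level utils.py.
proj_a, proj_b = tempfile.mkdtemp(), tempfile.mkdtemp()
with open(os.path.join(proj_a, "utils.py"), "w") as f:
    f.write("NAME = 'project_a'\n")
with open(os.path.join(proj_b, "utils.py"), "w") as f:
    f.write("NAME = 'project_b'\n")

# Both projects shortened their imports to plain `import utils`...
sys.path[:0] = [proj_a, proj_b]
import utils

print(utils.NAME)  # → project_a  (project_b's utils is silently shadowed)
```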

cozzyd | 4 years ago

Yeah, how else do you git clone some random package and immediately use it without "installing" it?

PYTHONPATH is simple and obvious to use, and is similar to using LD_LIBRARY_PATH and friends.
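The same effect is available per-process via `sys.path`, which is exactly what PYTHONPATH feeds into. A self-contained simulation of the clone-and-import workflow (the module here is fabricated to stand in for a cloned package):

```python
import os
import sys
import tempfile

# Simulate "git clone some random package" with a throwaway module.
repo = tempfile.mkdtemp()
with open(os.path.join(repo, "quickhack.py"), "w") as f:
    f.write("ANSWER = 42\n")

# Prepending the checkout to sys.path (or exporting it in PYTHONPATH)
# makes it importable without any `pip install` step.
sys.path.insert(0, repo)
import quickhack

print(quickhack.ANSWER)  # → 42
```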