Man it's frustrating to see this comic and read the comments here, because it's all so true. Python, in its noble quest for backwards compatibility, has accumulated so many different ways of packaging, distributing and installing libraries and apps that it's rivaling Google's chat apps.
Pipenv is the most promising solution today but is still very, very new. It's modeled after yarn and has been officially blessed as "The One True Way" of installing stuff by the Python documentation. It has a way to go still to be as good as yarn (especially in terms of speed). The Python ecosystem has never had proper declarative packages like package.json (setup.cfg can be used to have fully declarative package metadata, but I seem to be the only one using it that way), which is a problem for package managers.
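For anyone curious what fully declarative metadata in setup.cfg looks like, here is a minimal sketch (the project name, version, and dependencies are made up):

```ini
; setup.cfg -- all package metadata lives here declaratively
[metadata]
name = myproject
version = 0.1.0
description = An example package

[options]
packages = find:
install_requires =
    requests
    aiohttp
```

The setup.py then shrinks to a two-line stub, `from setuptools import setup; setup()` (declarative config needs a reasonably recent setuptools, 30.3 or later).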
To those suggesting it, Docker is great but you're still dealing with a package manager inside Docker, so that's a moot point. It avoids the need for virtualenv, kind of, but so does pipenv and it does so more reliably and reproducibly (pipenv implements lockfiles).
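For the curious, the Pipfile that pipenv locks from is roughly this (the package names here are just examples); `pipenv lock` then pins exact versions and hashes into Pipfile.lock:

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"
flask = ">=1.0"

[dev-packages]
pytest = "*"

[requires]
python_version = "3.6"
```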
Pipenv solves a very real problem, but it has a lot of problems of its own:
- It's very slow: re-locking after updating 1 dependency often takes me ~1 minute.
- It has lots of bugs. To name a few in 11.6.9: clobbers comments in the Pipfile, inconsistently includes dependencies for other build environments in Pipfile.lock, stores the wrong index in Pipfile.lock for packages not on PyPI.
- They release multiple times per day, often breaking things in patch releases.
- Kenneth Reitz is quite unpleasant to deal with in GitHub issues, which I often have to because of the previous 2 issues.
From what I've heard, Pipenv "has been officially blessed" only insofar as its maintainer got commit access to PyPA's documentation and inserted a recommendation.
Sharing your awesome Python program is its Achilles heel. I have 5 computers that I work on in a given week. I stopped using Python and Haskell due to all of this.
> Python, in its noble quest for backwards compatibility, has accumulated so many different ways of packaging, distributing and installing libraries and apps that it's rivaling Google's chat apps.
I'm not sure it's because of backwards compatibility. Perl has amazing backwards compatibility, but doesn't seem to suffer from this.
That's not said to pump up Perl or degrade Python, but because if you misidentify the problem, your proposed solution has a much lower likelihood of fixing it.
(Then again, maybe you mean something different by "backwards compatibility" than what I thought you meant)
As a Python dev it looks like I need to do some reading, as I am still using virtualenv (it usually works fine, but I do encounter the occasional issue).
Can anyone give me a rundown of the benefits of other systems over virtualenv?
On a side note, I hope Python isn't going to turn into a shambles like JavaScript - it's certainly starting to look like it as far as package managers go.
> To those suggesting it, Docker is great but you're still dealing with a package manager inside Docker, so that's a moot point.
A package manager/dependency manager is a must for any sort of mature mass-production coding environment (e.g. Cargo for Rust, Maven for Java, etc.). Docker doesn't solve all of that, but at least you do it only once and can replicate it easily and predictably. And a lot of the time you can reuse Docker images built by someone else.
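A sketch of the "do it only once" idea: a minimal Dockerfile that bakes the pip install into a reusable image (the file layout and the module name are illustrative):

```dockerfile
# Build once, then run the same image everywhere
FROM python:3.6
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependency layer is cached between builds
COPY . .
CMD ["python", "-m", "myapp"]         # "myapp" is a placeholder module name
```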
Please forgive me for venting my frustration here, but my reaction to this is basically: oh great, yet another blinking tool to add to the list of misery that is the Python packaging experience. May as well check it out now before I have to do it in a hurry...
First of all it's not in Debian unstable. Ok, maybe it's just super new... so I'll install it with pip. A python3 -m pip install pipenv later and let's try it out!
$ cd ~/src/nexsan-exporter/nexsan-exporter
$ pipenv install
Creating a virtualenv for this project…
Using /usr/bin/python3.6m (3.6.5) to create virtualenv…
⠋Running virtualenv with interpreter /usr/bin/python3.6m
Using base prefix '/usr'
New python executable in /home/yrro/.local/share/virtualenvs/nexsan-exporter-Eq5p1XVG/bin/python3.6m
Also creating executable in /home/yrro/.local/share/virtualenvs/nexsan-exporter-Eq5p1XVG/bin/python
Installing setuptools, pip, wheel...done.
Virtualenv location: /home/yrro/.local/share/virtualenvs/nexsan-exporter-Eq5p1XVG
Installing dependencies from Pipfile.lock (ca72e7)…
0/0 — 00:00:00
To activate this project's virtualenv, run the following:
$ pipenv shell
Cute emoji and nice colours but... ~/.local/share? For a directory that will end up containing arch-specific libraries? Ok, I guess no one gives a shit about this in the modern world, oh well. Wait... "python3.6m"? That isn't the Python interpreter I asked for... but it seems to be a hardlink to the same file as /usr/bin/python3.6 so I guess maybe this is intentional? Anyway let's check out the venv...
$ ls -l ~/.local/share/virtualenvs/nexsan-exporter-Eq5p1XVG/bin/python3*
lrwxrwxrwx 1 yrro yrro 10 Apr 30 17:06 /home/yrro/.local/share/virtualenvs/nexsan-exporter-Eq5p1XVG/bin/python3 -> python3.6m
lrwxrwxrwx 1 yrro yrro 10 Apr 30 17:06 /home/yrro/.local/share/virtualenvs/nexsan-exporter-Eq5p1XVG/bin/python3.6 -> python3.6m
-rwxr-xr-x 1 yrro yrro 4576440 Apr 30 17:06 /home/yrro/.local/share/virtualenvs/nexsan-exporter-Eq5p1XVG/bin/python3.6m
So this is the Python folks' what, fourteenth attempt to get this right, and they are still copying the python executable into the virtual environment instead of symlinking it in? This seems to be a regression from venv, which seemed to get this right!
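For comparison, the stdlib venv module does symlink the interpreter on platforms that support it; a quick way to check for yourself:

```shell
# Create a venv with the stdlib module and inspect the interpreter it contains
python3 -m venv /tmp/check-venv
ls -l /tmp/check-venv/bin/python*   # on Linux these are symlinks, not copies
```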
Right, time to install my dependencies... according to 'pipenv' the right command for this is 'pipenv install -e .' Hm, I wonder exactly what the -e option does?
$ pipenv install --help
* no mention of -e in the output *
:sadface:
Oh well, let's just run it blind!
$ pipenv install -e .
Installing -e .…
⠏
Error: An error occurred while installing -e .!
Directory '.' is not installable. File 'setup.py' not found.
Maybe I screwed up and ran this from the wrong directory?
$ ls setup.py
setup.py
Weird, what's going on here?
$ strace -f pipenv install -e . 2>&1 | grep setup\\.py
stat("/home/yrro/src/nexsan-exporter/nexsan-exporter/setup.py", {st_mode=S_IFREG|0644, st_size=1769, ...}) = 0
stat("/home/yrro/src/nexsan-exporter/setup.py", 0x7ffc013a0200) = -1 ENOENT (No such file or directory)
[pid 7365] stat("./setup.py", 0x7fff84176e30) = -1 ENOENT (No such file or directory)
stat("/home/yrro/src/nexsan-exporter/setup.py", 0x7ffc013a0300) = -1 ENOENT (No such file or directory)
write(2, "Directory '.' is not installable"..., 62Directory '.' is not installable. File 'setup.py' not found.
Oh FFS, I give up. I think I'll let pipenv pass me by for now.
There was Zope Buildout for years: it had a declarative packaging format, used the Cheese Shop (not doing so is the biggest issue with conda), and isn't slow as molasses, but it was and still is largely ignored by the wider community.
I just started using pipenv for a small personal project, and I like it. The usage is similar to git which I prefer over the weird bash activate mangling that a normal virtualenv does.
Python environment headaches have been largely solved for me thanks to Miniconda[1]. Not only do the environments isolate dependencies, they can easily use different versions of Python, and can include arbitrary binary packages too. It helps that for my work in biology an extensive number of packages are available from the bioconda[2] channel (with many non-bio packages from conda-forge[3]). Environments can be described via environment files, allowing them to be transferred to collaborators, managed with source control, or included with publications to support reproducible science. If you like virtualenvs, you may want to give Miniconda a try. One current limitation is that environment files cannot specify a source channel for specific packages--they're installed from channels based on the global channel precedence.
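For reference, such an environment file is plain YAML along these lines (the package names are illustrative); PyPI-only dependencies can even be listed under a nested pip: entry:

```yaml
# environment.yml: recreate with `conda env create -f environment.yml`
name: myproject
channels:
  - bioconda
  - conda-forge
  - defaults
dependencies:
  - python=3.6
  - numpy
  - samtools          # binary, non-Python package from bioconda
  - pip
  - pip:
      - some-pypi-only-package   # falls through to pip for PyPI-only deps
```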
I used Conda recently and it was OK, but it's annoying that it can't install stuff directly from PyPI. Unless you're lucky, you still have to use Pip inside the Conda environment to install some packages.
But it's annoying how many packages like qiime make it basically a requirement to use Anaconda/miniconda. I really don't want to pollute my computer with multiple installations of even things like R just to run a package.
Homebrew installing Python and installing virtualenv and virtualenvwrapper makes life easy.
All of these methods people are listing that require me to do more than manage one file (requirements.txt) are sort of missing the point.
I get that a lot of Node developers are coming to Python to do things, and that's great. But the environments don't need the exact same tools. pipenv seems to fix a problem I never encounter with pip/virtualenv during daily development work.
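For context, the one-file workflow being defended here is just two commands: snapshot the environment, then replay it elsewhere.

```shell
# The entire requirements.txt workflow
pip freeze > requirements.txt       # pin everything currently installed
pip install -r requirements.txt     # recreate the environment on another machine
```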
When I first got a Mac, I spent the whole evening trying to install python3+pipenv and get it working properly. I ended up nuking the system's Python 2.7 and installing it through Homebrew.
I think the comic should be more aptly be titled "Life before pipenv".
Pipenv is the first (popular) sane tool to make Python accessible to programmers coming from other languages. Sure you can also just use bare virtualenv, but then you have to figure out how it works together with everything displayed in the picture.
All of which work fine, the real problem is installing Python with homebrew. If you just install Python from the binary on the website then any of those solutions will work with minimal drama.
It's funny because within the comment section there are already 2 different opinions on why this is not funny because $xyz is what every sane python dev should use.
There was a time where I would have strongly related to this. Thankfully, now I've started using pipenv whenever possible and it basically just works(TM).
Image alt text is "The Python environmental protection agency wants to seal it in a cement chamber, with pictorial messages to future civilizations warning them about the danger of using sudo to install random Python packages."
To add on to the mess, Poetry seems like a strong contender for a well-designed Python dependency management solution: https://github.com/sdispater/poetry
One thing that amazes me about the modern PHP ecosystem is that when it comes to distributable PHP packages, there is only one package manager (composer) and one central package repo (packagist).
When it comes to binaries it's still a bit of a mess (homebrew? pecl? yum?) but the native PHP code story is clearer than any other language ecosystem I've seen.
Am I the only one using Nix (the package manager) for this? It's lovely and works for other languages and binary package management as well.
I encourage people to try it out; interactive usage looks something like:
nix-shell -p python36Packages.pyaml -p python36Packages.aiohttp --run "python some-script-with-dependencies.py"
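The same environment can also be pinned in a shell.nix so that a plain `nix-shell` picks it up. A sketch, assuming the same python36Packages attribute names as above:

```nix
# shell.nix: running `nix-shell` in this directory enters the environment
with import <nixpkgs> {};
mkShell {
  buildInputs = [
    (python36.withPackages (ps: [ ps.pyaml ps.aiohttp ]))
  ];
}
```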
I've never understood the purpose of virtualenvwrapper. I only use three commands for administering virtual environments: "virtualenv -p python3 ~/envs/foo" to make one, ". ~/envs/foo/bin/activate" to activate it, and "rm -r ~/envs/foo" to delete it. That is really not something that needs further simplification. What am I missing?
Legacy setups from previous decades, laziness, automated installers written by others (so they play by different rules), semi-broken packages with messed-up dependency graphs that require manual treatment. The comic is, of course, exaggerating - or at least I hope no one actually has things that bad.
> every sane programmer uses ... pip in a virtualenvwrapper environment
Check out Pipenv <https://docs.pipenv.org/> - you may like it. It aims to make things saner than bare pip+virtualenv{,wrapper}, and IMHO it really does.
I guess what Python needs to do is crown one solution as the official one, and then start requiring PyPI packages to convert before a deadline or be excluded. Then they can move the non-compliant packages to a legacy archive, and build on the one solution that the community chose, say in a poll. Granted, I've only ever interacted with pip and virtualenv and thought it was simple; but looking at this, things don't have to be this complex.
The heart of the problem is that different people are doing different things with Python, and nobody wants to pay cognitive load for what they don't want to use.
The packagers and package tool makers of the world inherit ALL of the technical debt from ALL of the upstream software devs. They're either the liver or the colon. An upstream dev decided to not document which compiler they use during dev time? That's now your problem. A compiler maker (GNU, MSFT, etc.) decides to change how they distribute the C runtime? Congrats, now it's your packaging system's job to know how to differentiate between Windows 7 and Windows 10 running particular versions of Visual Studio.
Nobody wants a "single simple unified solution" for packaging more than the packaging tool makers and distro vendors. Believe me. But it's not going to happen until software devs take more responsibility for what they build upstream, and how they build it.
I use Docker and like it over virtualenvs, but even without it, if you're just playing around/sandboxing, pip3 install --user is your friend. Don't use pip for system packages, ever! Let your distro's package manager handle that.
Maybe I just never do anything complicated with Python, but I never have any of the problems captured in this comic.
I install all my Python modules into the system's Python using my OS's package manager. I have both Python 2 and Python 3 installed but beyond having the same module installed in each of those locations, I have never found the need for multiple versions of the same module installed at the same time.
On the rare occasion where my OS doesn't provide a package I need, I use pip install --user. If it turns out to be something I'll want for the long term, I just knock together a quick package that I can install/uninstall using standard OS packaging tools.
One of the reasons I try to use Docker for everything is because it's a cross-language solution. It does add some complexity and has its drawbacks, but for me, it's worth it. I'd rather have one global dependency (Docker) than several (virtualenv for Python, nvm for Node, rbenv for Ruby, etc.), which all have their own idiosyncrasies.
I also like using Docker because it handles non-language resources like databases. Even installing one version of Postgres locally was painful. It's hard to imagine trying to deal with multiple instances and versions on the same machine (without something like Docker or Vagrant) if one project uses 9.x and another uses 10.x.
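That multiple-Postgres scenario is exactly what Compose makes trivial; a sketch (the service names and host ports are arbitrary):

```yaml
# docker-compose.yml: two Postgres majors side by side, nothing installed on the host
version: "3"
services:
  db-9:
    image: postgres:9.6
    ports:
      - "5432:5432"   # project A connects here
  db-10:
    image: postgres:10
    ports:
      - "5433:5432"   # project B connects here
```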
Maybe this is obvious, but to informally install executables supplied by python packages, use a separate virtualenv for each one and place a symlink to the venv/bin/theexecutable in a location like ~/bin that you have on your $PATH.
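Spelled out, with `black` standing in for whichever tool you're installing (and assuming ~/bin is on your $PATH):

```shell
# One isolated venv per tool; only the entry point is exposed on $PATH
python3 -m venv ~/.venvs/black
~/.venvs/black/bin/pip install black        # its deps can't clash with anything else
mkdir -p ~/bin
ln -s ~/.venvs/black/bin/black ~/bin/black  # scripts find their venv via their shebang
```

This works because the installed script's shebang hardcodes the venv's interpreter path, so the symlink can live anywhere.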
It's almost - but not quite - this bad when using a "simple" local development environment. As others have said it's not nearly this bad when you're deploying to sane infrastructure, be it containerized or not.
One more thing to add to the mess: PyBOMBS [1] A while back GNU Radio went and reinvented the whole package manager thing on their own. Since then I constantly have problems integrating anything GNU Radio-related with the rest of the Python ecosystem. Random Python 2 modules being imported by Python 3? You bet it's some random gr module in the PYTHONPATH somewhere.
Racket's exes are so simple and quick to share: https://docs.racket-lang.org/raco/exe.html
> has been officially blessed as "The One True Way" of installing stuff by the Python documentation
Oh? I don't see any mention of pipenv in the cpython git repo.
I am teaching adult beginners and it would be best to avoid the messy parts in the beginning.
I thought I could get by with just pip and briefly touching on virtualenv.
My own sin is avoiding all the mess by cloning a full bare-bones Lubuntu VM whenever I need a clean project.
Theoretically Docker should suffice, but it is yet another layer of complexity.
1. https://conda.io/miniconda.html
2. https://bioconda.github.io
3. https://conda-forge.org
macOS comes with a Python runtime, which is different from the runtime you can download from python.org. You can then also install it from Homebrew or from MacPorts. And I'm probably still missing something.
https://news.ycombinator.com/item?id=11851871 https://archive.is/MVVU1
It’s so tempting to just apt install everything until it’s too late.
[1] https://www.gnuradio.org/blog/pybombs-the-what-the-how-and-t...