top | item 7802005

chrishenn | 11 years ago

One pain point I've really felt recently with Python is in the deploy step. pip installing dependencies with a requirements.txt file seems to be the recommended way, but it's far from easy. Many dependencies (such as PyTables) don't install their own dependencies automatically, so you are left with a fragile one-by-one process for getting library code in place.

It was all fine once I got it worked out, but it would be so much nicer to provide a requirements.txt file and have pip figure out the ordering and dependency resolution. That, and being able to install binary packages in a simple way from pip, would make me much happier with Python (no more waiting 10 minutes for numpy or whatever to compile).
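For illustration, this is the kind of manually ordered requirements.txt that workaround ends up looking like — the package names are real, but the pinned versions are era-appropriate guesses, not a recommendation:

```text
# requirements.txt -- with pip's one-pass installer, order matters:
numpy==1.8.1      # must be in place before anything that builds against it
numexpr==2.4      # PyTables build requirement
cython==0.20.2    # PyTables build requirement
tables==3.1.1     # PyTables itself, last
```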

As far as actual language features go, however, I still find Python a joy to work in.

yaph|11 years ago

If a package doesn't get all its dependencies installed via pip, it's because of missing information in the package itself. That's a flaw of neither pip nor Python, and it will cause problems for any package manager.

I find the combination of virtual environments and pip very convenient to work with. When I run into trouble with missing dependencies I often find the project on GitHub and can send a pull request.

Regarding Numpy and the scientific Python stack, check out Anaconda https://store.continuum.io/cshop/anaconda/ it makes managing environments where you need these packages a lot less painful.
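As a sketch of what that "missing information" looks like: dependencies belong in the package's own metadata (at the time, install_requires in setup.py), so pip can fetch them for you. The package name and requirements below are made up:

```python
# setup.py -- minimal sketch; "mypkg" and its requirements are hypothetical
from setuptools import setup

setup(
    name="mypkg",
    version="0.1.0",
    py_modules=["mypkg"],
    # This is the metadata pip reads to install dependencies automatically.
    # Leave it out, and users are back to installing things one by one.
    install_requires=[
        "numpy>=1.8",
        "tables>=3.1",
    ],
)
```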

andreasvc|11 years ago

Not necessarily. pip cannot resolve all dependencies. For example, if a package specifies both numpy and pandas as requirements, installation will fail. This is because pandas in turn requires numpy, and pip does not resolve the dependencies in a single step; you need to install numpy first and then go on with pandas.
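What the parent describes is the lack of full dependency-graph resolution: conceptually, a resolver topologically sorts the graph so every dependency is installed before its dependents. A toy sketch — the graph here is hand-written for illustration, not read from real package metadata:

```python
# Toy resolver sketch: order packages so each one comes after everything
# it requires -- the step pip's one-pass installer skipped at the time.

def install_order(deps):
    """deps maps package -> list of packages it requires.
    Returns an install order via depth-first topological sort.
    (No cycle detection -- this is only a sketch.)"""
    order = []
    seen = set()

    def visit(pkg):
        if pkg in seen:
            return
        seen.add(pkg)
        for req in deps.get(pkg, ()):
            visit(req)
        order.append(pkg)

    for pkg in deps:
        visit(pkg)
    return order

deps = {"myapp": ["numpy", "pandas"], "pandas": ["numpy"], "numpy": []}
print(install_order(deps))  # ['numpy', 'pandas', 'myapp']
```

With this ordering, numpy is in place before pandas tries to build against it.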

Derbasti|11 years ago

I recently tried to deploy a desktop app written in Python. It was a nightmare. I recently taught scientific Python to prospective switchers. The installation step was a nightmare.

We really need pip wheels or conda to become mainstream. Pip alone doesn't cut it on platforms without compilers (Windows / OSX). Standalone installers are fine, but they don't resolve dependencies and they are only available for Windows.

I totally agree that the installation and deployment story should be a top priority. Once done, it would be another very compelling point on Python 3's feature list.

pekk|11 years ago

Wheels are pretty mainstream and are getting better fast. Forking the package management ecosystem again, just after we got over the last round of headaches there, is only going to make things worse.

L_Rahman|11 years ago

One of the reasons I've come to do most of my early-stage prototyping in Node is that managing npm packages has thus far proven much easier than finagling with pip or gem.

Ruby installs are especially difficult to manage. Despite the numerous tutorials out there, I still don't know what, if there even is one, the canonically best way to install ruby and necessary gems is. RVM? Install through Brew? Add path to ~/.bashrc?

I say this as someone who likes using the command line so much that I've written Caskfiles to automate my deployment to fresh OS X machines.

pak|11 years ago

> Despite the numerous tutorials out there, I still don't know what, if there even is one, the canonically best way to install ruby and necessary gems is.

Wait, what? On production, install the exact ruby you need from your favorite package manager. On your dev box, install any rubies you need through rbenv [1]. Put all your gem dependencies into a Gemfile [2]. On either end, bundle install [--deployment] and call it a day.

[1]: https://github.com/sstephenson/rbenv

[2]: http://bundler.io/v1.6/gemfile.html
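A minimal Gemfile of the kind [2] describes — the gems and version constraints here are placeholders, not a recommendation:

```ruby
# Gemfile -- hypothetical app; run `bundle install` (or
# `bundle install --deployment` on a production box) against it.
source 'https://rubygems.org'

gem 'sinatra', '~> 1.4'   # example web framework
gem 'pg',      '~> 0.17'  # example database driver
```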

pmontra|11 years ago

I install rvm on my production boxes as explained at https://rvm.io/ and then put .ruby-version and .ruby-gemset files in the application directory to select the interpreter and the gemset (I might need different applications running, especially on staging machines). Finally I use Bundler and a Gemfile. It's pretty easy. You got another answer suggesting rbenv, which is also fine.
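For concreteness, those two selector files are just one-line text files; the contents below are examples, not the commenter's actual values:

```text
# .ruby-version
2.1.2

# .ruby-gemset
myapp_production
```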

thinkpad20|11 years ago

At my company we build with Jenkins, tarball the result, and deploy from that. I haven't tried it another way, but I think it ends up being more efficient and less error-prone than doing actual pip installs during deployment.

dagw|11 years ago

That's basically what the company I used to work for did as well with our large Python application. Build using a build script, pulling in dependencies from a local build server; test; package up the result; and deploy. Pip was used for pulling libraries onto the build machine and was never used during deployment.
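A sketch of that pipeline's packaging step, with made-up paths and names (the real build script, Jenkins job, and local package index are not shown):

```shell
set -e

# Stand-in for the build step: assume the build script has already placed
# the application plus its pip-installed dependencies under build/myapp/.
mkdir -p build/myapp/vendor
echo 'print("hello")' > build/myapp/app.py

# Package the result; this tarball is the only artifact that reaches
# production -- the deploy side never runs pip, it just untars and runs.
tar czf myapp.tar.gz -C build myapp
tar tzf myapp.tar.gz
```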

marcosdumay|11 years ago

You should avoid using normal requirements.txt files in production. If you want to use pip, it's better to "pip freeze" your test environment, and use that requirements file to specify the production environment.

Otherwise you are just asking for nasty surprises when packages upgrade.
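The point of `pip freeze` is that every line in the resulting file is an exact `name==version` pin. A quick sanity check for that property, as a sketch (the helper name is made up):

```python
def all_pinned(requirements_text):
    """Return True if every non-blank, non-comment line of a
    requirements file pins an exact version with '=='."""
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if "==" not in line:
            return False
    return True

print(all_pinned("numpy==1.8.1\npandas==0.14.0\n"))  # True
print(all_pinned("numpy\npandas>=0.14\n"))           # False
```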

erichurkman|11 years ago

Or just specify versions in your requirements.txt file to begin with.

If you want to keep up to date with security and bug fixes (but aren't yet ready for the next big feature/backwards incompatible release), you can specify the lines as 'package>=1.1,<1.2' to get 1.1.x fix releases.

`pip list --outdated` is helpful, too.
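In requirements.txt form, that looks like the following (package names are placeholders):

```text
# take any 1.1.x bug-fix release, but never 1.2
somepackage>=1.1,<1.2
# exact pin for something you never want moving
criticaldep==2.0.3
```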

andreabedini|11 years ago

(Dormant) PyTables developer here. We do have a requirements.txt file, but I understand our setup.py needs an update. Please open an issue on GitHub and tell us about your experience and how we can improve it.

chrishenn|11 years ago

I will soon, thanks!

ermintrude|11 years ago

Check out devpi (http://doc.devpi.net/latest/). We build our projects as wheels with pip using devpi as a custom index, i.e.:

    BUILD_DIR="/tmp/wheelhouse/$PROJECT"
    pip wheel --wheel-dir=$BUILD_DIR \
        --find-links=$BUILD_DIR \
        --index=http://$DEVPI_HOST:$DEVPI_PORT/${DEVPI_USER}/${DEVPI_INDEX} \
        --extra-index-url=https://pypi.python.org/simple/ \
        ../$PROJECT

This will try to get packages from your devpi index, but will fall back to pypi if you don't have them.

Then upload this to devpi. It can later be installed by pip with:

    pip install --use-wheel \
        --index=http://$DEVPI_HOST:$DEVPI_PORT/${DEVPI_USER}/${DEVPI_INDEX} \
        $PROJECT

This makes deployments super fast because you're deploying pre-built wheels instead of downloading and compiling from pypi. It also gives you resilience by storing copies of the dependencies you need in devpi, so if they vanish from pypi (or it's unavailable) you can still deploy your software with all the dependencies you developed against.

SoftwareMaven|11 years ago

There is work going on in the python packaging SIG to make installing better. One of the big additions for pip is the ability to install pre-built wheels for those "hard to build" packages. There is also work happening on a new package metadata standard.

Unfortunately, that's all I've learned from lurking on the mailing list for a couple weeks.

Alphasite_|11 years ago

In my experience, you can bundle the dependencies and then add the path to your search path.

Not the best approach, but it does work.
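A sketch of that approach — the vendor/ directory name is an assumption, and real code would usually compute the path from `__file__` rather than the working directory:

```python
import os
import sys

# Bundle third-party packages into an app-local "vendor" directory and
# put it at the front of the import search path before importing them.
VENDOR_DIR = os.path.join(os.getcwd(), "vendor")
sys.path.insert(0, VENDOR_DIR)

print(sys.path[0] == VENDOR_DIR)  # True
```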

maaku|11 years ago

`pip freeze`