Does anyone else think this reflects badly on Python? The fact that the author has to use a bunch of different tools to manage Python versions/projects is intimidating.
I don't say this out of negativity for the sake of negativity. Earlier today, I was trying to resurrect an old Python project that was using pipenv.
"pipenv install" gave me an error about accepting 1 argument, but 3 were provided.
Then I switched to Poetry. Poetry kept on detecting my Python 2 installation, but not my Python 3 installation. It seemed like I had to use pyenv, which I didn't want to use, since that's another tool to use and set up on different machines.
I gave up and started rewriting the project (web scraper) in Node.js with Puppeteer.
Granted, I'm just a scientific programmer, but my workplace has a full-blown software team maintaining a multi-million-line codebase. That codebase is rebuilt every night, and as I understand it, you're not allowed to submit a change that breaks anything. And they have people whose job is to keep their tools working.
What people casually think of as "Python" is really a huge dynamic ecosystem of packages. Imagine that there are 60k packages, each with 1k lines of code... that's a 60 million line codebase, and it can't be checked for breaking changes. Short of continually testing your code against the latest versions of packages, you're going to hit some bumps if you haul old code out of the vault and try to fire it up on a new system.
I don't know how Javascript developers handle this.
I handle it by running Python inside an isolated environment -- WinPython does this for me -- and occasionally having to fix something if a new version of a package causes a breaking change.
The drawback of my method is deployment -- there is no small nugget of code that I can confidently share with someone. They have to install their own environment for running my stuff, or take their chances, which usually ends badly.
I love Python. Throughout my life I tried learning many languages, and Python is the only one that really stuck, and was able to do useful things. Learning Python changed my life, in 5 years my salary more than doubled, and for the last 5 years I've been a full time developer. A coworker likes to say that I think in Python.
That said, I 100% agree. I don't have the answer, except that I wish there was one official answer, for development and deployment, that was easy to explain to beginners.
For what it's worth, I've been using pipenv for over a year and it works well enough. I think npm's approach is better, but not perfect. I've heard good things about yarn. I know CPAN is the grandfather of all of them. I've barely used gems, but they seem like magic, and Go, get your repo URI out of my code, please and thank you. :-) All fun aside, what languages have it right? And is there maybe a way to come up with a unified language dependency manager?
This is why I manage every nontrivial project I do nowadays with Nix (Haskell, Go, Python, C & C++, Bash... anything)
Everything is pinned to exact source revisions. You can be relatively sure to be able to git clone and nix-shell and be off to the races.
You can even go the extra mile and provide working editor integration in your nix-shell (especially easy with emacs). So you can enable anyone to git clone, nix-shell, and open a working editor.
The biggest downside is becoming proficient in Nix isn't easy or even straightforward.
The big gap is management of the full dependency tree. With yarn I can get a yarn.lock which pretty well ensures I'll have the exact same version of everything, with no unexpected changes, every time I run yarn install. I get the same thing in the Rust world with Cargo.
In Python it's a mess. Some packages specify their deps in setup.py; some in a requirements file, which may or may not be read in by their setup.py. It's not rare to have to have multiple 'pip install' commands to get a working environment, especially when installing fairly boutique extensions to frameworks like Django.
There just isn't a holistic, opinionated approach to specifying dependencies, especially at a project level. Which leaves unexpected upgrades of dependencies (occasionally leading to regressions) as a reality for Python devs.
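A lockfile-style workflow is possible in Python, but you have to assemble it yourself. A minimal sketch (file and environment names are illustrative) using plain venv and `pip freeze`:

```shell
# DIY "lockfile": freeze the exact versions installed in a virtualenv.
# A fresh venv freezes to an empty file, but after `pip install`-ing your
# dependencies, this captures every resolved version, pinned.
python3 -m venv demo-env
./demo-env/bin/pip freeze > requirements-lock.txt
ls requirements-lock.txt
```

Unlike yarn.lock or Cargo.lock, nothing regenerates or verifies this file automatically; that gap is what pip-tools, Pipenv, and Poetry each try to fill in their own way.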
Yes. As someone who has never delved deep into Python, but has had some contact with it: the package manager ecosystem is the #1 thing keeping me away from it.
npm sucks and all, but at least it just works and doesn't get in my way as much.
Just spent ~1 hour trying to set up a working python environment .... so yes. It's in the classic phase where it has an ecosystem with a bunch of problems that are small enough that they aren't being tackled comprehensively by the core language, but large enough that n different solutions are being created in parallel by different groups. The result is an explosion in complexity for anybody just trying to get their job done and ... it's actually very unpythonic!
The alternative is an opinionated build system defined by the language developers.
Like any dictatorship, that can be fine if those in charge are benevolent and competent. For programming languages, the first is almost always true, but the second is far from guaranteed. Skill at programming language development has no bearing on skill at developing a build, packaging, and distribution system.
Go is a prime example of this. The language is pleasant to use with few ugly surprises, but their build system has been awful for a decade, only now reaching a semi-decent state with modules (which are still pretty damn ugly).
With python, on the other hand, there's competition in this space, and as a result the tools are pretty nice, albeit fragmented.
But then there's rust, which has a nice language AND a nice build system. You take a big risk when building both the language and build system; sometimes it works, sometimes it doesn't. And you risk fragmentation if you don't. It's a tough choice.
For me the biggest problem is C-based python modules that can't just be installed in your virtual environment but want to be part of the global installation.
Tkinter and mpi4py are the most recent ones I've had this problem with. I expect someone will tell me "it's trivial to install these in a venv, just do X", but X is not obvious to me.
Compared to other languages and ecosystems, it really is lagging behind. Dependency and version management were afterthoughts in Python. I dread having to maintain our Python projects.
> The fact that the author has to use a bunch of different tools to manage Python versions/projects is intimidating.
It just shows that Python is used for a lot of purposes and there is no single tool that handles all use cases.
> I gave up and started rewriting the project (web scraper) in Node.js with Puppeteer.
And when you get to the point where you need to work with different Node projects, you will also need several tools to manage Node versions and different environments, so that doesn't help at all.
I'm a Java developer and I make fun of Maven and Gradle as much as anyone, but overall it seems like I am better off than I would be in the Python ecosystem for dependency management as well as managing the version of the language I compile and run with.
To be fair, he says in the article that his requirements are somewhat different from most Python developers, to wit:
* "I need to develop against multiple Python versions - various Python 3 versions (3.6, 3.7, 3.8, mostly), PyPy, and occasionally Python 2.7 (less and less often, thankfully)."
* "I work on many projects simultaneously, each with different sets of dependencies, so some sort of virtual environment or isolation is critical."
I think it really comes down to Python not having a chosen way to handle package management as well as Python being dependent on the underlying C libraries and compilers for the given platform.
Since Python did not prescribe a way to handle it the community has invented multiple competing ways to solve the problem, most of which have shortcomings in one way or another.
To further add to the confusion, most Unix-like operating systems (Linux, macOS, etc.) have their own system Python, which can easily get fouled up if one is not careful.
This is one place where Java's use of a virtual machine REALLY shines. You can build an uberjar of your application and throw it at a JVM and (barring JNI or non-standard database drivers) it just works. There is also usually no "system Java", so there is nothing to break along those lines.
I see the same problems with the python ecosystem.
There are a lot of tools, and confusion between versions, especially because of the breaking changes between v2 and v3 (try explaining to someone why they have both installed, and why typing python gives them v2.7 by default).
I love the elegance and simplicity of the language and many tools written with it, but this is a point I'd very much like to see improved.
Because of that, I sometimes just rewrite something rather than fix it under 2.7. It's perhaps a bit more work sometimes, but not as frustrating as trying to get something running that is deprecated in parts.
It occurs to me that, with respect to version dependency, you can think of Python and Java programming as similar to Smalltalk programming: you program and alter the environment.
In Smalltalk you change parts of the environment. In Python, Java (and Ruby?), you change the entire environment, as described in TFA.
I use mostly JavaScript for my day-to-day web stuff. I've been really turned off from using Python for more things because of these issues. My experience with managing dependencies in JS has been much easier than with Python--I'm really astonished that such a popular language has done such a bad job at this for so long.
I read it as a symptom of a very active project. It’s being taken to new places daily and independently. It might be somewhat like all the various Linux distributions. Somewhat overwhelming from the outside but in practice not so much.
Yes. As a non-Python programmer, I sometimes had to make software or dependencies written in Python work. It was always a long series of steps: installing some package manager, setting up a virtual environment, running another package manager, etc. And of course, it would fail at some point before the whole thing was working.
On the contrary, I seldom had these kinds of issues with projects coded in C#, C or C++: most of the time the few steps to compile the project succeeded and produced a usable binary.
It's admittedly bad, any Python dev who says otherwise isn't being honest.
That said, once you get it down, you're not burdened by it much/at all. You can start a new project and begin work in seconds, which feels important for a language that prides itself on ease of use.
I have never understood the need for all the different tools surrounding Python packaging, development environments, or things like Pipenv. For years, I have used Virtualenv and a script to create a virtual environment in my project folder. It's as simple as a node_modules folder; the confusion around it is puzzling to me.
Nowadays, using setuptools to create packages is really easy too; there's a great tutorial on the main Python site. It's not as easy as Node.js, sure, but there are tools like Cookiecutter to remove the boilerplate from new packages.
requirements.txt files aren't very elegant, but they work well enough.
And with Docker, all this is even easier. The Python + Docker story is really nice.
Honestly, I just love these basic tools and how they let me do my job without worrying about whether they're the latest and greatest. My Python setup has been stable for years and I am so productive with it.
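The "script to create a virtual environment in my project folder" approach can be sketched roughly like this (the directory and environment names are my guesses, not the commenter's actual script):

```shell
# Keep the environment inside the project folder, node_modules-style.
PROJECT_DIR="myproject"                 # illustrative project name
mkdir -p "$PROJECT_DIR"
python3 -m venv "$PROJECT_DIR/.venv"
# Everything installed via this pip stays inside myproject/.venv,
# leaving the system Python untouched:
"$PROJECT_DIR/.venv/bin/python" -c "import sys; print(sys.prefix)"
```

Deleting the project folder deletes the environment with it, which is most of what the heavier tools are doing under the hood anyway.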
I'm firmly set on virtualenv with virtualenvwrapper for some convenience functions. Need a new space for a project? mkvirtualenv -p /path/to/python projectname (-p only if I'm not using the default configured in the virtualenv config file, which is rare)
From there it's just "workon projectname" and just "deactivate" when I'm done (or "workon otherprojectname")
It has been stable and working for ages now. I just don't see any strong incentive to change.
Isn't this just a variant of what the original comment is critiquing, though? In order to sanely use Python and external dependencies you need some highly stateful, magical tool that does 'environments' for you. The conceptual load of this is quite high. Adding docker only makes it higher - now you're not merely 'virtualizing' a single language runtime but an entire OS as well - just to deal with the fact your language runtime has trouble handling dependencies.
Last time I looked, it was very tedious to set up `pip` to be secure and pin your dependencies to hashes. Without this, a compromise of a library's PyPI account would allow the attacker to execute arbitrary code on your system, assuming you didn't notice the change.
You can use `pip-tools` to get something like a Gemfile/package.json, but there are a few restrictions that are suboptimal.
So Pipenv/Poetry are the current best ways to get something like the package management story that other languages like Ruby and JS have had for a long time (and that Go's been polishing recently too).
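For reference, the hash-pinning workflow with pip-tools looks roughly like this (the resolve/install commands need pip-tools installed and network access to PyPI, so they're shown as comments rather than run):

```shell
# Declare only your top-level dependencies in requirements.in...
echo "requests" > requirements.in
# ...then resolve and pin the full tree, including a --hash for every file:
#   pip-compile --generate-hashes requirements.in   # writes requirements.txt
# ...and have pip refuse anything whose hash doesn't match:
#   pip install --require-hashes -r requirements.txt
cat requirements.in
```

This closes the compromised-account hole described above, at the cost of having to re-run pip-compile whenever you want to change a version.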
You’re absolutely right. I use pipenv almost entirely because it can activate a local environment automatically when entering a folder in the terminal. That, and the fact that my virtual folder lives in ~/.local, lets me work directly in Dropbox. Nothing I couldn’t live without.
Seriously. The author is bending over backwards to accommodate Poetry from every direction, from the stupidest installation instructions I've heard, to "can't transfer from requirements.txt", to "it doesn't work well with Docker but it's doable". Like, what exactly does it add that's worth all this complexity? Make you a mai tai every hour?
For the small Python projects I did, I used venv and pip. Learned my lesson after wasting a couple of hours fighting through dependency issues.
Coming from a Java shop for a long time: if I have to switch between different versions of Java, all I do is change JAVA_HOME to point to the correct version, go to the base project directory, and "mvn clean install" does the job. :)
Eeeehhh I think I will be downvoted to hell and back for this but after I read the article I had the feeling of "why are you making this feel more complex than it needs to be?"
I mean, compared to Java and C# I have a MUCH easier time setting up my development environment. Installing Python, if I am on a Windows box I mean, is enough to satisfy a lot of the requirements. I then clone the repo of the project and
> "why are you making this feel more complex than it needs to be?"
Because it's more complex if you have projects on multiple Python versions and if you want to lock your Python packages to specific versions. (Pip can bite you when different packages have different requirements for the same lib).
> Although Docker meets all these requirements, I don't really like using it. I find it slow, frustrating, and overkill for my purposes.
How so? I've been using Docker for development for years now and haven't experienced this, EXCEPT with some slowness I experienced with Docker Compose upon upgrading to macOS Catalina (which turned out to be a bug with PyInstaller, not Docker or Docker Compose). This is on a Mac, btw; I hear that Docker on Linux is blazing fast.
I personally would absolutely leverage Docker for the setup being described here: multiple versions with lots of environmental differences between each other. That's what Docker was made for!
I would love to read a blog post covering how to do this!
My experience has been that it's significantly more effort to meet my requirements with Docker, and that I spend a _lot_ of time waiting on Docker builds, or trying to debug finicky issues in my Dockerfile/docker-compose setup.
I'm sure all of these things have fixes -- of course they do! But I find the toolset challenging, and the documentation difficult to follow. I'd love to learn what I'm missing, but I also need to balance that against Getting Shit Done.
The Dockerfile that's provided looks like it would be very slow to build. I always try to make Dockerfiles that install deps and then install my python package (usually just copy in the code and set PYTHONPATH) to fully take advantage of the docker build cache. When you have lots of services it really reduces the time it takes to iterate with `docker-compose up -d --build`-like setups.
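A cache-friendly layout along the lines the comment describes might look roughly like this (the base image tag and module name are placeholders, not from the article):

```dockerfile
FROM python:3.8-slim
WORKDIR /app
# Copy only the dependency manifest first, so this layer (and the slow
# pip install below) stays cached until requirements.txt itself changes.
COPY requirements.txt .
RUN pip install -r requirements.txt
# Copy the source last: day-to-day code edits only invalidate these
# cheap final layers, not the dependency install above.
COPY . .
ENV PYTHONPATH=/app
CMD ["python", "-m", "myapp"]
```

The ordering is the whole trick: Docker invalidates a layer (and everything after it) when its inputs change, so the expensive step should depend on the file that changes least often.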
In addition to the popular conda, it's worth checking out WinPython for scientific use. Each WinPython installation is an isolated environment that resides in a folder. To move an installation to another computer, just copy the folder. To completely remove it from your system, delete the folder.
I find it useful to keep a WinPython installation on a flash drive in my pocket. I can plug it into somebody's computer and run my own stuff, without worrying that I'm going to bollix up their system.
I've used both and recommend Poetry. It's got a larger feature set (it can be used to manage packages _and_ publish packages), it's faster, and it's more actively developed (measured by releases). Pipenv's last release was 2018-11-26, and Poetry has been publishing pre-releases as recently as three days ago to prepare for v1.0.0.
I did a quick comparison here[0], and I'm planning to do an update with the latest version of Poetry.
Similar to the OP, I found pipenv to be quite unstable. At the time (about a year ago) it was really more interesting beta software than the production-quality software it was advertised as. It was also quite a bit slower than pip.
But what really pushed me away is that installing or upgrading any single package upgraded all dependencies, with no way to disable this behavior. (I believe you can now.) A package manager should help me manage change (and thereby risk), not prevent me from doing so.
Poetry is probably the best of the all-in-one solutions. It does its job well but I've found the documentation lacking.
In the end, I've settled on pyenv-virtualenv to manage my environments and pip-tools to manage dependencies. It's simple and meets my needs.
Pipenv has been nearly completely broken for a year without a release. Installing from scratch rarely works without providing a full path to the python you want to reference.
Now that poetry also manages virtual environments it’s far and away the better choice.
Caveat - Heroku doesn’t understand pyproject files yet, so no native poetry integration. Heroku are working on this.
I switched from pipenv to poetry over 1 year ago. I love it!
The main reasoning was so that I could easily build and publish packages to a private repository and then easily import packages from both pypi and the private repository.
Poetry is amazing, if only for the ability to separate dev and build dependencies. Maybe pipenv does this, but I couldn't get it working well enough to find out.
I've given up on trying to manage Python versions on my machine and just develop inside Docker containers now (thanks to VS Code), using tightly versioned Python base images.
I chose not to use either after trying both. Neither solves understanding `setup.py` (they are just indirections on top of it). Of the two, Poetry seemed more comprehensive and stable across releases. There’s a small cognitive load of knowing a couple of Twine commands if you don’t use either.
Switched to poetry and couldn't be happier. From my experience, poetry wins hands down. It managed to replace flit, remove duplicate dependencies, and maintain stability across machines. All while using the standard pyproject.toml configuration file.
> On Linux, the system Python is used by the OS itself, so if you hose your Python you can hose your system.
I never managed to hose the OS Python on Linux, by sticking to a very simple rule: DON'T BE ROOT. Don't work as root, don't run `sudo`.
On Linux, I use the system python + virtualenv. Good enough.
When I need a different python version, I use docker (or podman, which is an awesome docker replacement in context of development environments) + virtualenv in the container. (Why virtualenv in the container? Because I use them outside the container as well, and IMHO it can't hurt to be consistent).
I love Python's syntax, but I still haven't found a sufficiently popular way to deploy my code with the same set of settings as my dev box (other than literally shipping a VM).
So setting up a dev env is one problem, but deploying it so that the prod env is the same and works the same is another.
This article is great; those are viable solutions for sure. One of the alternatives is conda: it's common among data scientists, but many of its features (isolation between environments, the ability to keep a private repository off the internet) meet enterprise needs.
I would generally reach for conda instead of this, but they seem quite comparable in aggregate.
And, given that I've been trying NixOS lately and had loads of trouble and failing to get Conda to work, I will definitely give this setup a try.
(I haven't quite embraced the nix-shell everything solution. It still has trouble with some things. My current workaround is a Dockerfile and a requirements.txt file, which does work...)
I like Python as a language, but when I see how clean the tools of other, similar languages (for example Ruby) are, compared to the clusterfk of the Python ecosystem, it just makes me want to close the terminal. I'm always wondering how it became the #1 language on StackOverflow.
Seconded. Just to be clear asdf manages interpreters, not project dependencies. It actually uses pyenv under the hood to manage Python versions. I use it to manage Elixir and Python versions and it works rather well. I also find its CLI interface well designed and consistent.
There are two things that I find a bit elusive with Python:
1. Highlight to run
2. Remoting into a kernel
Both features are somewhat related. I want to be able to fire up a Python kernel on a remote server. I want to be able to connect to it easily (not having to SSH tunnel over 6 different ports). I want to connect my IDE to it and easily send commands and view data objects remotely. Spyder does all this, but it's not great: you have to run a custom kernel to be able to view variables locally.
Finally, I want to be able to connect to a Nameko or Flask instance as I would any remote kernel and hot-swap code out as needed.
In my experience, conda breaks quite often. Most recently, conda has changed the location where it stores DLLs (e.g. for PyQt), which broke pyinstaller-based workflows.
In principle, it's a good idea; in practice, I'm not satisfied. On Windows, it's an easy solution, especially for packages that depend on non-python dependencies (e.g. hdf5).
Start with `docker` and learn the basic concepts: the difference between an image and a container, layers, etc. Copy a Python `Dockerfile` and see that it works. After a while you'll get the hang of it and will be able to know what to google and how to navigate the Docker manual. Pythonspeed.com has some good pro tips once you understand the basics.
You'll get confident and from there learning `docker-compose` is a breeze.
I use pip-tools. It fits in nicely as an additional component to the standard toolset (pip and virtualenv). But most people probably do not need to freeze environments so it's great to be able to not use it for most projects.
Hi kovek! You might like https://www.codementor.io/ . I admit I'm a mentor there and I've only made about twenty bucks, but anyway, it was super fun helping people. :)
With pipx when you install things they go into isolated environments. With pip you're just installing things globally.
This difference is important due to dependencies: if you have two different CLI tools you want to install but they have conflicting dependencies, then pip is going to put at least one of them into an unusable state, while pipx will allow them both to coexist on the same system.
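The isolation pipx provides can be sketched with plain venvs: two tools, two environments, two independent site-packages (paths and names are illustrative, not pipx's actual layout):

```shell
mkdir -p pipx-demo
python3 -m venv pipx-demo/tool-a     # stand-in for `pipx install tool-a`
python3 -m venv pipx-demo/tool-b     # stand-in for `pipx install tool-b`
# Each interpreter sees only its own environment, so tool-a could depend
# on libfoo 1.x while tool-b depends on libfoo 2.x without any conflict:
pipx-demo/tool-a/bin/python -c "import sys; print(sys.prefix)"
pipx-demo/tool-b/bin/python -c "import sys; print(sys.prefix)"
```

pipx then adds the convenience layer on top: it symlinks each tool's entry point onto your PATH so you never activate these environments by hand.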
> Governance: the lead of Pipenv was someone with a history of not treating his collaborators well. That gave me some serious concerns about the future of the project, and of my ability to get bugs fixed.
Doesn't seem fair. You're not abandoning requests, are you?
Is this comment just trolling? Load the main page of his site.
> I'm a software developer, co-creator of Django, and an experienced engineering leader. I previously ran teams at 18F and Heroku. I'm currently taking new clients through my consultancy, REVSYS.
Please don't break HN's guidelines by being snarky or putting down others' work in a shallow way. If you know more or have a different perspective to offer, try sharing some of what you know so we can all learn something!
TheChaplain|6 years ago
python3 -m venv /tmp/foo
/tmp/foo/bin/pip install -U pip wheel
/tmp/foo/bin/pip install -r requirements.txt
I understand some might not like it, but really, it's simple and it works.
tanilama|6 years ago
This post reads to me mostly as fad. It is an opinion; you would most likely bail out.
Rotareti|6 years ago
poetry == npm
pyenv == nvm
pipx == npx
No big difference, IMO.
a3n|6 years ago
https://www.quora.com/What-is-the-essence-of-Smalltalk
coleifer|6 years ago
behnamoh|6 years ago
tvanantwerp|6 years ago
ianai|6 years ago
unknown|6 years ago
[deleted]
tasogare|6 years ago
On the contrary, I seldom had these kinds of issues with projects coded in C#, C, or C++: most of the time the few steps to compile the project succeeded and produced a usable binary.
diminoten|6 years ago
That said, once you get it down, you're not burdened by it much/at all. You can start and begin work on a new project in seconds, which feels important for a language that prides itself on ease of use.
But yeah, not a great look for newcomers.
madelyn|6 years ago
Nowadays, using setuptools to create packages is really easy too, there's a great tutorial on the main Python site. It's not as easy as node.js, sure, but there's tools like Cookiecutter to remove the boilerplate from new packages.
requirements.txt files aren't very elegant, but they work well enough.
And with Docker, all of this is even easier. The Python + Docker story is really nice.
Honestly I just love these basic tools and how they let me do my job without worrying about are they the latest and greatest. My python setup has been stable for years and I am so productive with it.
Twirrim|6 years ago
From there it's just "workon projectname", and "deactivate" when I'm done (or "workon otherprojectname").
It has been stable and working for ages now. I just don't see any strong incentive to change.
pvg|6 years ago
Isn't this just a variant of what the original comment is critiquing, though? In order to sanely use Python and external dependencies you need some highly stateful, magical tool that does 'environments' for you. The conceptual load of this is quite high. Adding docker only makes it higher - now you're not merely 'virtualizing' a single language runtime but an entire OS as well - just to deal with the fact your language runtime has trouble handling dependencies.
theptip|6 years ago
You can use `pip-tools` to get something like a Gemfile/package.json, but there are a few restrictions that are suboptimal.
So Pipenv/Poetry are the current best ways to get something like the package management story that other languages like Ruby and JS have had for a long time (and that Go's been polishing recently too).
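The pip-tools workflow being alluded to is roughly: list only your direct dependencies in a requirements.in, then compile a fully pinned requirements.txt with pip-compile. A sketch (package names and versions here are purely illustrative):

```
# requirements.in — direct dependencies only, loosely pinned
django>=2.2,<3.0
requests

# $ pip-compile requirements.in
#   writes requirements.txt with every transitive dependency pinned,
#   annotated with where each pin came from
# $ pip-sync
#   makes the current environment match the lock file exactly
```

This gives you the Gemfile/Gemfile.lock split: humans edit the .in file, machines consume the compiled lock file.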
heyoni|6 years ago
j88439h84|6 years ago
Evidlo|6 years ago
ramraj07|6 years ago
ph2082|6 years ago
Having been at a Java shop for a long time: if I have to switch between different versions of Java, all I do is change JAVA_HOME to point to the correct version, go to the base project directory, and "mvn clean install" does the job. :)
Grue3|6 years ago
globular-toast|6 years ago
Need to have multiple versions of python installed and easily accessible? Use pyenv.
Need to run tests across multiple versions of python? Use tox.
Need to freeze environments for deployment purposes? Use pip-tools.
Need to freeze the entire operating system? Use docker or vagrant.
Don't use tools you don't need. That would be silly.
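The tox step above, for example, is just one small config file per project. A minimal sketch (the interpreter list and test command are assumptions):

```ini
# tox.ini — run the test suite against several Python versions
[tox]
envlist = py36,py37,py38

[testenv]
deps = pytest
commands = pytest
```

Running `tox` then builds one isolated environment per entry in envlist and runs the tests in each.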
thesuperbigfrog|6 years ago
However, you can run into issues if you are using different versions of Python, or Python on different operating systems.
ropans808|6 years ago
j88439h84|6 years ago
aequitas|6 years ago
f4stjack|6 years ago
I mean, compared to Java and C# I have a MUCH EASIER time setting up my development environment. Installing Python (if I am on a Windows box, I mean) is enough to satisfy a lot of the requirements. I then clone the repo of the project, and
source venv/bin/activate
pip install -r requirements.txt
is enough to get me started coding.
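The snippet above assumes the virtualenv already exists; from a fresh clone you first create it with the stdlib venv module. A runnable sketch (project layout and requirements.txt are assumptions):

```shell
# Create the environment once, in the project directory
python3 -m venv venv

# Activate it for this shell session; the interpreter now lives in ./venv
. venv/bin/activate
python -c 'import sys; print(sys.prefix)'

# With the venv active, you would install the pinned dependencies:
# pip install -r requirements.txt

# Leave the environment when done
deactivate
```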
slig|6 years ago
Because it's more complex if you have projects on multiple Python versions and if you want to lock your Python packages to specific versions. (Pip can bite you when different packages have different requirements for the same lib).
wp381640|6 years ago
https://github.com/sdispater/poetry/issues/571
the OP himself has a fix for this in his own dotfiles repo:
https://github.com/jacobian/dotfiles/commit/e7889c5954daacfe...
jacobian|6 years ago
(Though, how'd you find that? Mildly creepy that you know more about my dotfiles than I do!)
nunez|6 years ago
How so? I've been using Docker for development for years now and haven't experienced this, EXCEPT for some slowness I experienced with Docker Compose upon upgrading to macOS Catalina (which turned out to be a bug with PyInstaller, not Docker or Docker Compose). This is on a Mac, btw; I hear that Docker on Linux is blazing fast.
I personally would absolutely leverage Docker for the setup being described here: multiple versions with lots of environmental differences between each other. That's what Docker was made for!
jacobian|6 years ago
My experience has been that it's significantly more effort to meet my requirements with Docker, and that I spend a _lot_ of time waiting on Docker builds, or trying to debug finicky issues in my Dockerfile/docker-compose setup.
I'm sure all of these things have fixes -- of course they do! But I find the toolset challenging, and the documentation difficult to follow. I'd love to learn what I'm missing, but I also need to balance that against Getting Shit Done.
jsmeaton|6 years ago
gravypod|6 years ago
analog31|6 years ago
I find it useful to keep a WinPython installation on a flash drive in my pocket. I can plug it into somebody's computer and run my own stuff, without worrying that I'm going to bollix up their system.
ninetax|6 years ago
franey|6 years ago
I did a quick comparison here[0], and I'm planning to do an update with the latest version of Poetry.
[0] https://johnfraney.ca/posts/2019/03/06/pipenv-poetry-benchma...
luhn|6 years ago
But what really pushed me away is that installing or upgrading any single package upgraded all dependencies, with no way to disable this behavior. (I believe you can now.) A package manager should help me manage change (and thereby risk), not prevent me from doing so.
Poetry is probably the best of the all-in-one solutions. It does its job well but I've found the documentation lacking.
In the end, I've settled on pyenv-virtualenv to manage my environments and pip-tools to manage dependencies. It's simple and meets my needs.
jsmeaton|6 years ago
Now that poetry also manages virtual environments it’s far and away the better choice.
Caveat - Heroku doesn’t understand pyproject files yet, so no native poetry integration. Heroku are working on this.
kndjckt|6 years ago
The main reasoning was so that I could easily build and publish packages to a private repository and then easily import packages from both pypi and the private repository.
Happy to answer more questions.
ryall|6 years ago
I've given up trying to manage Python versions on my machine and just develop inside Docker containers now (thanks to VS Code), using tightly versioned Python base images.
nerdwaller|6 years ago
timothycrosley|6 years ago
snypox|6 years ago
diminoten|6 years ago
That said, all I want is for a unified standard to emerge, this is getting a little ridiculous...
perlgeek|6 years ago
I never managed to hose the OS Python on Linux, by sticking to a very simple rule: DON'T BE ROOT. Don't work as root, don't run `sudo`.
On Linux, I use the system python + virtualenv. Good enough.
When I need a different python version, I use docker (or podman, which is an awesome docker replacement in context of development environments) + virtualenv in the container. (Why virtualenv in the container? Because I use them outside the container as well, and IMHO it can't hurt to be consistent).
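A minimal sketch of that container-plus-virtualenv layout (base image tag and paths are assumptions, not the commenter's actual setup):

```dockerfile
# Pin the Python version via the base image
FROM python:3.8-slim

# Keep the virtualenv habit inside the container, for consistency
# with how environments are used outside it
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . /app
WORKDIR /app
```

Prepending the venv's bin directory to PATH means every later `pip` and `python` call resolves inside the virtualenv, with no activation step needed.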
fock|6 years ago
[deleted]
xchaotic|6 years ago
ausjke|6 years ago
snypox|6 years ago
sjustinas|6 years ago
Lucasoato|6 years ago
eximius|6 years ago
And, given that I've been trying NixOS lately and have had loads of trouble, failing to get Conda to work, I will definitely give this setup a try.
(I haven't quite embraced the nix-shell everything solution. It still has trouble with some things. My current workaround is a Dockerfile and a requirements.txt file, which does work...)
rullopat|6 years ago
notus|6 years ago
pritambaral|6 years ago
rhizome31|6 years ago
globular-toast|6 years ago
anonu|6 years ago
1. Highlight-to-run
2. Remoting into a kernel
Both features are somewhat related. I want to be able to fire up a Python kernel on a remote server. I want to be able to connect to it easily (not having to SSH-tunnel over 6 different ports). I want to connect my IDE to it and easily send commands and view data objects remotely. Spyder does all this, but it's not great: you have to run a custom kernel to be able to view variables locally.
Finally, I want to be able to connect to a Nameko or Flask instance as I would any remote kernel and hot-swap code out as needed.
luord|6 years ago
I gotta give poetry a try, though.
eivarv|6 years ago
cosmic_quanta|6 years ago
In principle, it's a good idea; in practice, I'm not satisfied. On Windows, it's an easy solution, especially for packages that depend on non-python dependencies (e.g. hdf5).
MasterScrat|6 years ago
nsomaru|6 years ago
Can anyone comment on the Docker learning & troubleshooting story for python?
maksimum|6 years ago
slig|6 years ago
You'll get confident and from there learning `docker-compose` is a breeze.
unknown|6 years ago
[deleted]
snorkasaurus|6 years ago
globular-toast|6 years ago
frou_dh|6 years ago
It feels pretty comfy to effectively be on an island and far away from the hustle and bustle of the industrial Python tooling.
dang|6 years ago
schainks|6 years ago
kovek|6 years ago
abcininin|6 years ago
1. Install Anaconda to your home user directory.
2. Create an environment: conda create --name myenv python=3.6
3. Switch to the environment: conda activate myenv
4. Use conda install mypackage, then pip install mypackage, in that priority order.
5. Export the environment: conda env export > conda_env.yaml
6. The environment can be recreated on another system: conda env create -f conda_env.yaml
Anaconda: https://www.anaconda.com/distribution/#download-section
Dockerized Anaconda: https://docs.anaconda.com/anaconda/user-guide/tasks/docker/
paulproteus|6 years ago
slig|6 years ago
SpaceL10n|6 years ago
mrfusion|6 years ago
tedivm|6 years ago
This difference is important due to dependencies: if you have two different CLI tools you want to install, but they have conflicting dependencies, then pip is going to put at least one of them into an unusable state, while pipx will allow them both to coexist on the same system.
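The isolation pipx provides can be sketched with plain venvs: each tool gets its own private environment, so their dependency sets never meet. Directory and tool names below are hypothetical:

```shell
# One private environment per CLI tool — this is what pipx automates
python3 -m venv ./tools/toolA
python3 -m venv ./tools/toolB

# Each environment would then `pip install` its own tool plus dependencies;
# the two interpreters are fully independent of each other:
./tools/toolA/bin/python -c 'import sys; print(sys.prefix)'
./tools/toolB/bin/python -c 'import sys; print(sys.prefix)'
```

pipx additionally symlinks each tool's entry point onto your PATH, so from the shell it feels like a normal global install.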
diminoten|6 years ago
Doesn't seem fair. You're not abandoning requests, are you?
oefrha|6 years ago
Edit: From https://www.python.org/psf/github/,
> ... we have created a GitHub organization, @psf, to support and protect projects that have outgrown ownership by their original author.
maw|6 years ago
radicallib|6 years ago
[deleted]
Klonoar|6 years ago
Is this comment just trolling? Load the main page of his site.
> I'm a software developer, co-creator of Django, and an experienced engineering leader. I previously ran teams at 18F and Heroku. I'm currently taking new clients through my consultancy, REVSYS.
packetslave|6 years ago
mlthoughts2018|6 years ago
dang|6 years ago
whalesalad|6 years ago
2. Conda is not used as much as you might think... it's really only used within the data science community.
misnome|6 years ago
unknown|6 years ago
[deleted]