wiredfool|3 years ago
Pillow has most of the issues that are listed in the article. (Oddly enough for a graphics library, the GPU part is the only part that I don't think we've stumbled over at one point or another.)
From a quality-of-life standpoint -- having the sdist install behind an opt-in flag by default for our package would be great. Unless you're a developer with a lot of -dev packages for imaging libraries already on your system, you're not going to be able to build from source. And even if the error that pops up is along the lines of
The headers or library files could not be found for {str(err)},
a required dependency when compiling Pillow from source.
Please see the install instructions at:
https://pillow.readthedocs.io/en/latest/installation.html
We still get people posting issues about Pillow failing to install.
Build farms would be nice. We've burned tons of time on it, between Travis, GH Actions, and @cgohlke single-handedly making all of the Windows builds for the entire scientific Python community.
Ultimately, something like the Debian packaging system is probably the best open-source model for this (though the splitting of the Python standard library so that virtual envs aren't in the base install is a pita). Unstable gets a reasonably current set of packages, and crucially, all of the underlying library dependencies are compiled together. It's also not _that_ hard to rebuild individual packages from source in an automated fashion. (This may be what Conda is doing, but I've never looked at their system in detail.)
I've been meaning to package a Python project recently, and the internet is annoyingly full of guides which are, I think, out of date. At the very least, they suggest quite different things.
CJefferson|3 years ago
I just have a single Python file, meant to be treated as an executable (no package at present). There are a whole bunch of tests, but that's obviously separate. Any suggestions on modern best practices welcome!
woodruffw|3 years ago
If it's pure Python, the only packaging file you need is `pyproject.toml`. You can fill that file with packaging metadata per PEP 518 and PEP 621, including using modern build tooling like flit[1] for the build backend and build[2] for the frontend.
With that, your entire package build (for all distribution types) should be reducible to `python -m build`. Here's an example of a full project doing everything with just `pyproject.toml`[3] (FD: my project).
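As a concrete sketch of that setup (the project name, version, and script entry point below are placeholders, not from any real project), a minimal flit-backed `pyproject.toml` might look like:

```toml
# Minimal sketch: flit as the PEP 517 build backend, all metadata in
# pyproject.toml per PEP 518/621. Names here are hypothetical.
[build-system]
requires = ["flit_core >=3.2,<4"]
build-backend = "flit_core.buildapi"

[project]
name = "mytool"                     # placeholder module/package name
version = "0.1.0"
description = "Example single-module tool"
requires-python = ">=3.8"

[project.scripts]
mytool = "mytool:main"              # puts a `mytool` command on PATH
```

With a `mytool.py` (or `mytool/` package) next to this file, `python -m build` should produce both an sdist and a wheel.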
ensignavenger|3 years ago
If you want to release it to PyPI as a Python package, I would personally use Poetry. But your case -- a single pure Python package -- is simple enough that you won't hit many of the problems brought up in the article, whatever tool you use.
If you want a standalone executable, I haven't found a good, single, cross-platform tool for that yet... it seems like there's a separate tool for each platform.
japanuspus|3 years ago
As detailed in the other answers, there are two parts to this: 1) creating a Python package from your project (and possibly sharing it on PyPI), and 2) making this package available as an end-user application.
For step 2 you can use nuitka or similar, but if your audience is somewhat developer-oriented, you can also suggest they use pipx: https://github.com/pypa/pipx.
This is an incredible example of organizing information well and making a case to a wide audience. It's difficult enough to shave all the yaks necessary to get a high-level view of all issues related to a problem, and to express all those problems in good writing is an additional tough challenge. These folks have done an amazing job at both.
deniska|3 years ago
With modern tooling, packaging pure Python code to be used by other Python developers is a relatively painless process.
The main problem with Python packaging is that it's often C/C++ packaging in disguise, across multiple OSes and CPU architectures, and that's far from being solved. Building such a Python wheel is essentially like building a "portable" (aka one you don't need to properly install into the system) Linux/Windows/macOS application. That comes with a variety of caveats and requires specialized knowledge one wouldn't pick up playing around with just Python alone.
krick|3 years ago
Is there any consensus on how to deal with packaging and environments in Python by now? Can you suggest a tutorial for that?
I've been out of the loop for a long time, and would like to get an update on how things are in Python in 2023, but I'm not sure if there even is a consensus — what I can find by googling seems to be several kinda competing approaches. This seems surprising, because most "modern" languages seem to have a very well defined set of practices to deal with all of that stuff. Some languages already come with their built-in stuff (Go, Rust), others simply have well-known solutions (like, technically there still exist PEAR and PECL for PHP, but everyone just knows how to use composer, which solves both packaging and dependency-management problems, and it's also pretty clear what problems it doesn't solve).
For Python there seem to be a dozen tools, and I'm not sure which are outdated and not used by anyone, which are useless fancy wrappers (also not used by anyone), and what the actual go-to tool is (if there is one) for all the common cases: dependency management, version locking, shipping an executable, environment separation for local scripts, whether I should ever use pip install globally, etc.
tlocke|3 years ago
https://packaging.python.org/en/latest/tutorials/packaging-p...
And the longer story is that this method has the flexibility to allow other implementations of packaging tools to be used, and so it fosters choice and competition in the ecosystem. In contrast, the old method of packaging was tied to a particular implementation.
idoubtit|3 years ago
From the sibling thread about packaging and deploying a single script, there was no consensus. There was disagreement on the best way to package, and doubts about the mid-term future of some suggested solutions. The following alternatives were suggested:
- package with a `pyproject.toml` file configured to use modern tooling
- package with a `pyproject.toml` file configured to use traditional `setup.py` tooling
- package with traditional `setup.py` tooling
- package with poetry
- package with whatever, deploy with nuitka or pipx
- skip the packaging and deploy with PyInstaller
- skip the packaging and deploy with nuitka
Note that, unless the Python world has radically changed while I was looking away, the packaging does not ensure a simple way to deploy the package and its single script. I remember vividly `pipenv` crashing on me, so switching to venv+pip (or was it virtualenv+pip?) then setting up a bash wrapper to call the Python script with the right venv...
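The venv-plus-wrapper pattern mentioned above can be sketched roughly like this (the function name and all paths are just examples, not from the thread): create a dedicated venv on first use, then run the script with that venv's interpreter instead of the system Python.

```shell
run_with_venv() {
    # $1 = venv directory, $2 = path to the single-file script,
    # remaining args are passed through to the script.
    venv="$1"; script="$2"; shift 2
    if [ ! -x "$venv/bin/python" ]; then
        python3 -m venv "$venv"          # first run: create the venv
        # "$venv/bin/pip" install -r requirements.txt  # deps would go here
    fi
    "$venv/bin/python" "$script" "$@"
}
```

A standalone wrapper in `~/bin` that ends in `exec "$VENV/bin/python" "$SCRIPT" "$@"` gives the same effect per script.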
saila|3 years ago
I don't think there's a consensus, but there are some good modern options. I think poetry is a good choice, and it seems to be fairly popular. I use it for all my Python projects and haven't found a compelling reason to switch to another option in the past few years.
As to the question of whether you should ever use pip to install packages globally, the answer is almost always no. For command line tools, the best option IMO is pipx. The second best option is pip install --user.
If you're developing a library or application, you should always isolate it in a virtualenv, which is something poetry will handle for you when you run `poetry install`.
Do you know of any up-to-date blogs/howtos/guides on nix+python where the Python project contains modules that need to be compiled (e.g. Cython, pybind, etc.)? I've found the basic info at https://nixos.wiki/wiki/Packaging/Python but it doesn't really go in depth for more complex use cases than having a setup.py...
groodt|3 years ago
Python packaging gets a lot of criticism. It's a meme. The thing is, it's actually improved dramatically over the years and continues to improve.
The problems it solves are very complex if one looks a little below the surface. It is solving different problems from the ecosystems it's often compared to: golang, rust, java, js.
wiredfool|3 years ago
That's ... completely missing the point of the article. There's nothing about competing standards that solves the problem of the C ABI and how to package non-Python library dependencies.
woodruffw|3 years ago
At this point, there are 10 competing tools but no longer so many competing standards: the standards for Python packaging from 2015 onwards are PEP 517 (build system standardization), PEP 518 (using pyproject.toml to configure the build system), and PEP 621 (storing project metadata, previously standardized, in pyproject.toml). These standards build on top of each other, meaning that they don't offer conflicting advice.
The TL;DR for Python packaging in 2022 is that, unless you're building CPython extensions, you can do everything through pyproject.toml with PyPA-maintained tooling.
I recently migrated a project away from Poetry to the traditional setup method. Poetry works great for a simple package, but once you start to add complexity, it falls apart, because everything is abstracted away and simplified into config files and the command line.
[1]: https://github.com/pypa/flit
[2]: https://github.com/pypa/build
[3]: https://github.com/pypa/pip-audit
tpoacher|3 years ago
e.g. https://github.com/tpapastylianou/self-contained-runnable-py...
irskep|3 years ago
Shoutout to Material for MkDocs enabling the swanky theme and Markdown extensions. https://squidfunk.github.io/mkdocs-material/
miohtama|3 years ago
Sdist is only one letter away from sadist.
egberts1|3 years ago
https://pypi.org/project/xkcd2347/
optimalsolver|3 years ago
https://python-poetry.org/