item 45466897

charliermarsh | 5 months ago

Lazy imports have been proposed before, and were rejected most recently back in 2022: https://discuss.python.org/t/pep-690-lazy-imports-again/1966.... If I recall correctly, lazy imports are a feature supported in Cinder, Meta's version of CPython, and the PEP was driven by folks that worked on Cinder. Last time, a lot of the discussion centered around questions like: Should this be opt-in or opt-out? At what level? Should it be a build-flag for CPython itself? Etc. The linked post suggests that the Steering Council ultimately rejected it because of the complexity it would introduce to have two divergent "modes" of importing.

I hope this proposal succeeds. I would love to use this feature.

BiteCode_dev|5 months ago

Especially since it is opt-in, with various levels of granularity and a global off switch. Very well-constructed spec given the constraints.

flare_blitz|5 months ago

I also hope this proposal succeeds, but I'm not optimistic. This will break tons of code and introduce a slew of footguns. Import statements fundamentally have side effects, and when and how these side effects are applied will cause mysterious breakages that will keep people up for many nights.

This is not fearmongering. There is a reason why the only flavor of Python with lazy imports comes from Meta, which is one of the most well-resourced companies in the world.

Too many people in this thread hold the view of "importing {pandas, numpy, my weird module that is more tangled than an eight-player game of Twister} takes too long and I will gladly support anything that makes them faster". I would be willing to bet a large sum of money that most people who hold this opinion are unable to describe how Python's import system works, let alone describe how to implement lazy imports.

PEP 690 describes a number of drawbacks. For example, lazy imports break code that uses decorators to add functions to a central registry. This behavior is crucial for Dash, a popular library for building frontends that has been around for more than a decade. At import-time, Dash uses decorators to bind a JavaScript-based interface to callbacks written in Python. If these imports were made lazy, Dash would break. Frontends used by thousands, if not millions of people, would immediately become unresponsive.
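The registry pattern described here can be sketched in a few lines (the names below are illustrative, not Dash's actual API): importing the module is what populates the registry, so deferring the import defers the registration.

```python
# Minimal sketch of a decorator-based registry (illustrative names,
# not Dash's actual API). Registration is an import-time side effect.
CALLBACKS = {}

def callback(name):
    def register(func):
        CALLBACKS[name] = func  # runs when the defining module is imported
        return func
    return register

@callback("update-graph")
def update_graph(data):
    return f"rendered {data}"

# With an eager import, the decorator has already run by this point.
assert "update-graph" in CALLBACKS

# If the module defining update_graph were imported lazily and no
# attribute on it was ever touched, the decorator would never execute,
# and CALLBACKS["update-graph"] would raise KeyError.
```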

You may cry, "But lazy imports are opt-in! Developers can choose to opt out of lazy imports if they don't work for them." What if these imports were transitive? What if our frontend needed to be completely initialized before starting a critical process, else it would cause a production outage? What if you were a maintainer of a library that was used by millions of people? How could you be sure that adding lazy imports wouldn't break any code downstream? Many people made this argument for type hints, which is sensible because type hints have no effect on runtime behavior*. This is not true for lazy imports; import statements exist in essentially every nontrivial Python program, and changing them to be lazy will fundamentally alter runtime behavior.

This is before we even get to the rest of the issues the PEP describes, which are even weirder and crazier than this. This is a far more difficult undertaking than many people realize.

---

* You can make a program change its behavior based on type annotations, but you'd need to explicitly call into typing APIs to do this. Discussion about this is beyond the scope of this post.

ndrezn|4 months ago

Product manager for Dash here. At Plotly we're actually pretty excited about the potential for lazy-loaded imports, as it could help out a lot with the import performance of Plotly.py.

As this comment mentions, Dash apps would not support lazy-loaded imports until the underlying Dash library changes how it loads in callbacks and component libraries (the two features that would be most impacted here), but that doesn't mean there's no path to success. We've been discussing some ways we could resolve this internally, and if this PEP is accepted we'd certainly go further to see if we can fully support lazy-loaded imports (of both the Dash library itself/Dash component libraries and of relative imports in Dash apps).

fastasucan|5 months ago

They are not entitled to hold the opinion that their imports take too long if they don't know the inner workings of Python's import system? Do you listen to yourself?

f33d5173|5 months ago

This is a new syntax, so it is opt-in. The new syntax can be conceived as syntax sugar that lets you rewrite

  def my_func():
      import my_mod
      my_mod.do_stuff()
as

  lazy import my_mod
  def my_func():
      my_mod.do_stuff()

I.e., with lazy, the import happens at the site of first use. Since this is clearly code that could already be written, it only breaks things in the sense that someone could already write broken code. And since it is opt-in, if using it breaks some code, people will notice that and choose not to rewrite that code with it.
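The stdlib already exposes a comparable deferral mechanism, following the `importlib.util.LazyLoader` recipe from the importlib documentation: the module object is created immediately, but its code only executes on the first attribute access. A sketch:

```python
import importlib.util
import sys

def lazy_import(name):
    """Import a module with execution deferred until first attribute
    access, per the LazyLoader recipe in the importlib docs."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)  # schedules, rather than performs, execution
    return module

json_mod = lazy_import("json")   # no module code has run yet
print(json_mod.dumps({"a": 1}))  # first attribute access triggers the import
```

The proposed `lazy import` syntax would make this pattern a first-class language feature rather than a hand-rolled helper.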

WD-42|5 months ago

Some of these worries make sense, but wouldn't it be relatively trivial to pass a flag to the interpreter (or something similar) to force all imports to evaluate eagerly, as in the current behavior? But to be a bit cheeky: if some of these issues cause serious production outages for you, it might be time to consider moving on from a scripting language altogether.

zahlman|5 months ago

> This will break tons of code

I don't see how. It adds a new, entirely optional syntax using a soft keyword. The semantics of existing code do not change. Yes, yes, you anticipated the objection:

> What if these imports were transitive? ... How could you be sure that adding lazy imports wouldn't break any code downstream?

I would need to see concrete examples of how this would be a realistic risk in principle. (My gut reaction is that top-level code in libraries shouldn't be doing the kinds of things that would be problematic here, in the first place. In my experience, the main thing they do at top level is just eagerly importing everything else for convenience, or to establish compatibility aliases.)

But if it were, clearly that's a breaking change, and the library bumps the major version and clients do their usual dependency version management. As you note, type hints work similarly. And "explicitly calling into typing APIs" is more common than you might think; https://pypistats.org/packages/pydantic exists pretty much to do exactly this. It didn't cause major problems.
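A toy illustration of what "explicitly calling into typing APIs" looks like (a hypothetical helper, far simpler than what pydantic actually does): annotations are read at runtime via `typing.get_type_hints` and used to coerce values.

```python
from typing import get_type_hints

def coerce_args(func, **kwargs):
    """Hypothetical helper: coerce keyword arguments using the
    function's type annotations, read at runtime via the typing API."""
    hints = get_type_hints(func)
    coerced = {k: hints[k](v) if k in hints else v for k, v in kwargs.items()}
    return func(**coerced)

def area(width: int, height: int) -> int:
    return width * height

print(coerce_args(area, width="3", height="4"))  # → 12
```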

> Import statements fundamentally have side effects, and when and how these side effects are applied will cause mysterious breakages that will keep people up for many nights.

They do have side effects that can be arbitrarily complex. But someone who opts in to changing import timing and encounters a difficult bug can just roll back the changes. It shouldn't cause extended debugging sessions unless someone really needs the benefits of the deferral. And people in that situation will have been hand-rolling their own workarounds anyway.

> Too many people in this thread hold the view of "importing {pandas, numpy, my weird module that is more tangled than an eight-player game of Twister} takes too long and I will gladly support anything that makes them faster".

I don't think they're under the impression that this necessarily makes things faster. Maybe I haven't seen the same comments you have.

Deferring imports absolutely would allow, for example, pip to do trivial tasks faster — because it could avoid importing unnecessary things at all. As things currently stand, a huge fraction of the vendored codebase will get imported pretty much no matter what. It's analogous to tree shaking, but implicitly, at runtime and without actually removing code.

Yes, this could be deferred to explicitly chosen times to get more or less the same benefit. It would also be more work.

dheera|5 months ago

Oof. I wish they could support version imports

    import torch==2.6.0+cu124
    import numpy>=1.2.6
and support having multiple simultaneous versions of any Python library installed. End this conda/virtualenv/docker/bazel/[pick your poison] mess.

zahlman|5 months ago

It's been explained many times before why this is not possible: the library doesn't actually have a version number. The distribution of source code on PyPI has a version number, but the distribution name is not connected to the name of any module or package you import in the source code (the Pillow distribution, for example, provides a package named PIL). A distribution can validly define zero or more modules (packages are a subset of modules, represented by the same module type at runtime).

You got three other responses before me all pointing at uv. They are all wrong, because uv did not introduce this functionality to the Python ecosystem. It is a standard defined by https://peps.python.org/pep-0723/, implemented by multiple other tools, notably pipx.

maxloh|5 months ago

You could do that with uv.

  # /// script
  # dependencies = [
  #   "requests<3",
  #   "rich",
  # ]
  # ///
  
  import requests
  from rich.pretty import pprint
  
  resp = requests.get("https://peps.python.org/api/peps.json")
  data = resp.json()
  pprint([(k, v["title"]) for k, v in data.items()][:10])

oivey|5 months ago

Oof. This feature request has nothing to do with lazy imports. It’s also solved far more cleanly with inline script metadata.

gigatexal|5 months ago

Really, what is the headache with virtual environments? They've been solved. Use uv or Python's built-in venv creator and you're good to go.

uv venv --seed --python=3.12 && source .venv/bin/activate && pip3 install requests && …

fouronnes3|5 months ago

The mess has ended thanks to uv.

_ZeD_|5 months ago

NO! I don't want my source code filled with this crap.

I don't want to lose multiple hours debugging why something went wrong because I was using three versions of numpy and seven of torch at the same time and there was a mixup.

lblume|5 months ago

uv is good.

tomyhsieh|5 months ago

Just curious. What changed?

From merely browsing through a few comments, people have mostly positive opinions regarding this proposal. Then why did it fail many times, but not this time? What drives the success behind this PEP?