top | item 41698995

Generate pip requirements.txt file based on imports of any project

121 points | mooreds | 1 year ago | github.com | reply

88 comments

[+] bndr|1 year ago|reply
Oh wow, my package on the front page again. Glad that it's still being used.

This was written 10 years ago when I was struggling with pulling and installing projects that didn't have any requirements.txt. It was frustrating and time-consuming to get everything up and running, so I decided to fix it; apparently many other developers had the same issue.

[Update]: Though I do think the package is already at a level where it does one thing and does it well. I'm still looking for maintainers to improve it and move it forward.

[+] joshdavham|1 year ago|reply
> This was written 10 years ago when I was struggling with pulling and installing projects that didn't have any requirements.txt

And 10 years later this is still a common problem!

[+] idoubtit|1 year ago|reply
I used this last year, with relative success. I was asked to fix Python code written by an intern who was no longer there. The code used external libraries but did not declare any. I had only a zip of the source code, without any libraries. pipreqs was able to identify the 22 required libraries. Unfortunately, there was a runtime crash because a library was at the wrong version, so I had to let a real Python developer handle the next steps.

BTW, this tool is not a dependency manager. Many sibling comments seem to misunderstand the case and promote unrelated tools.

[+] remram|1 year ago|reply
Can I have it find the requirements as of a specific date?

If I find a script/notebook/project with no requirements.txt, I usually know when it was created. Being able to get the versions that were selected back then on the author's machine would be great for reproducibility.

[+] matrss|1 year ago|reply
For the future, please pick a package manager that can give you a lock file alongside your code, so that you have a definitive record of the dependencies.

Even if you have all versions as of the time of the last modification to the code, you don't know if the dependency resolution happened at that point in time, or if the environment was set up years prior and never updated.

Nevertheless, this is what you are looking for: https://pypi.org/project/pypi-timemachine/

[+] 0cf8612b2e1e|1 year ago|reply
I have never realized until this moment that I want such a tool. Ideally, you would never find yourself in this position, but alas.
[+] sean_pedersen|1 year ago|reply
I thought import names and PyPI names are not always equal, so this can't work reliably, right?
[+] halfcat|1 year ago|reply
I don’t know if this is how pipreqs works, but I’d be concerned about a typo in an import that inadvertently installs a library.

I’ve found pip-tools [1] to be a nice middle ground between `pip freeze` and something heavier like poetry. It’s especially helpful in showing you where a library came from, when a library installs other libraries it depends upon.

One could still have a typo in their requirements.in file with pip-tools, but changes there are much less frequent than imports in any Python file in your project, which could be a daily occurrence in some codebases.

[1] https://github.com/jazzband/pip-tools

[+] thebigspacefuck|1 year ago|reply
Would this be useful for identifying packages in a requirements.txt that aren’t used? Or is there another tool for that?
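One way to approximate the reverse check, which names declared in requirements.txt are never imported, is to walk the source with `ast`. This is an illustrative sketch, not pipreqs' actual logic, and it assumes import names match distribution names, which (as noted in the thread) is not always true, so treat hits as hints:

```python
# Find requirements that no Python file in the project appears to import.
import ast
from pathlib import Path

def imported_top_level_names(source: str) -> set[str]:
    """Collect top-level module names from import statements in one file."""
    names = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            names.add(node.module.split(".")[0])
    return names

def possibly_unused(requirements: list[str], project_dir: str) -> set[str]:
    """Requirement names with no matching import anywhere under project_dir."""
    used = set()
    for path in Path(project_dir).rglob("*.py"):
        used |= imported_top_level_names(path.read_text())
    # Strip simple version pins like "requests==2.31.0" down to the bare name.
    declared = {r.split("==")[0].split(">=")[0].strip() for r in requirements}
    return declared - used

src = "import requests\nfrom flask import Flask\n"
print(sorted(imported_top_level_names(src)))  # ['flask', 'requests']
```

Dedicated tools such as deptry cover this ground more thoroughly (extras, dynamic imports, name mapping).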
[+] xuraiis|1 year ago|reply
Awesome project! Didn't realise this existed when I took a stab at implementing a similar thing a few months back: https://github.com/nmichlo/pydependence. It uses graph traversal of imports, plus naive support for lazy imports, to generate optional extras and to modify the pyproject.toml as part of pre-commit hooks.

Edit: the main goal was to generate optional extras for different entry points in a large codebase, find missing or extra imports, resolve dependencies across multiple repos, and see which files reference the mapped packages. Ideally, if you have many internal repos which are not published and you cannot correctly use dependency resolution, you can generate requirements before you pass them on to something like uv.

[+] mbank|1 year ago|reply
I was looking for this and thought I was doing something wrong when I couldn't find anything... Great job! I do think, though, that a "clean" development mode that avoids needing this would be to start a new project in a virgin virtual environment and run pip freeze on that env.
[+] notnmeyer|1 year ago|reply
looks great. similarly this is why i love ‘go mod tidy’. just use the thing and let the package manager sort it out after the fact.
[+] skeledrew|1 year ago|reply
Anything that takes away from users fully migrating to pyproject really needs to die.
[+] mikepurvis|1 year ago|reply
But pyproject itself isn't even taking a stance on the underlying question of dependency management, which can still be flit, poetry, uv, pipx, pipenv, and who knows what else.

(I'm a poetry user myself, but I can see the writing on the wall that poetry is probably not going to be the winner of who ultimately ends up in the PEP.)

[+] hiccuphippo|1 year ago|reply
This project looks like a first step to migrate something to pyproject.
[+] andrewmcwatters|1 year ago|reply
I don't spend a lot of time in Python, but my current understanding having read Python documentation and seeing some projects online is that you use pip and requirements.txt with --require-hashes, venv, and pyproject.toml to use a standard dependency management toolchain.

Is that correct?
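Roughly, yes. The standardized piece is the PEP 621 `[project]` table in pyproject.toml for declaring dependencies; pip, venv, and hash-pinned requirements.txt files layer on top of it. A minimal illustrative example (the project name, versions, and pins below are placeholders):

```toml
[project]
name = "my-app"              # placeholder name
version = "0.1.0"
requires-python = ">=3.9"
dependencies = [
    "requests>=2.31",        # placeholder pin
]

[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"
```

The choice of installer/locker on top of this (pip-tools, poetry, uv, etc.) is exactly the part that is not standardized yet, which is what the rest of this thread is arguing about.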

[+] tmslnz|1 year ago|reply
How is it different to pip-chill[1]?

[1]: https://github.com/rbanffy/pip-chill

[+] surye|1 year ago|reply
I believe pip-chill still operates on packages installed into the environment. This project seems to derive requirements from the code itself, even if no packages are installed in the current environment.
[+] Terretta|1 year ago|reply
Please consider consolidating python dependency management instead of fragmenting it:

https://docs.astral.sh/uv/pip/compile/

In other words, bring the thinking here. Whether it's new thinking, or decade old thinking, it's well time to converge. We've had decades of trying it like Heinz catsup varieties. Worth trying more wood behind fewer arrows.

[+] whalesalad|1 year ago|reply
This is not fragmenting it. The requirements.txt file has been a steady and well used API for well over a decade. This tool just helps you produce one for a project that is missing one.
[+] Spivak|1 year ago|reply
You're just picking a winner like every other Python dependency project. If it's not in a PEP do whatever the hell you want. Good ideas will get turned into future PEPs for tooling to standardize on. uv itself has two separate locking formats.
[+] imjonse|1 year ago|reply
One can start non-disruptively with uv by only using it as a pipx replacement at first (uv tool). Nice and fast.
[+] ris|1 year ago|reply
How about you stop trying to create package managers for every single language/ecosystem in existence and instead converge on trying to solve the whole problem once and for all with Nix.
[+] jghn|1 year ago|reply
> Please consider consolidating python dependency management instead of fragmenting it

You realize that this project predates uv by roughly a decade, right?

[+] jonathrg|1 year ago|reply
How is uv, another flavor-of-the-month Python packaging tool, not contributing to fragmentation?
[+] Der_Einzige|1 year ago|reply
Please consider consolidating python dependency management instead of fragmenting it:

https://github.com/mamba-org/mamba

In other words, bring the thinking here. Whether it's new thinking, or decade old thinking, it's well time to converge. We've had decades of trying it like Heinz catsup varieties. Worth trying more wood behind fewer arrows.

[+] hasante|1 year ago|reply
It doesn't work 100% of the time and can leave you with gotchas that you need to fix, from my experience with it.
[+] jcarrano|1 year ago|reply
When I first started with Python, long ago, I looked into these kinds of solutions, which didn't work so well, and wondered why the concept wasn't better developed. Later, with experience, I realized it is not a great idea, and more hassle than the benefits it brings.

I don't think it is a good idea to merrily write tens of import statements and end up with loads of dependencies.

[+] spullara|1 year ago|reply
Wrap your python program in an LLM that just keeps installing things when it fails until it works :)
[+] mooreds|1 year ago|reply
From the GH repo description:

> Looking for maintainers to move this project forward.

So not sure how maintained it is.

[+] FlipFloopDev|1 year ago|reply
Surprised I've never seen this, despite it existing for at least 9 years??
[+] 999900000999|1 year ago|reply
Can you have it work with pipfiles too ?
[+] remram|1 year ago|reply
I had forgotten about pipfiles. Like their name implies (/s), they are not for pip but for pipenv, a separate tool.

https://github.com/pypa/pipfile "This format is still under active design and development", last commit "2 years ago". I think this is dead.

[+] kstrauser|1 year ago|reply
Alternatively, can the pipenv gang support the same pyproject.toml files as everyone else?
[+] TZubiri|1 year ago|reply
let's not fuck with supply chain vulnerabilities
[+] jonathrg|1 year ago|reply
Feel free to elaborate.