top | item 45752272

hardwaregeek | 4 months ago

I gotta say, I feel pretty vindicated after hearing for years how Python’s tooling was just fine and you should just use virtualenv with pip and how JS must be worse, that when Python devs finally get a taste of npm/cargo/bundler in their ecosystem, they freaking love it. Because yes, npm has its issues but lock files and consistent installs are amazing

caconym_|4 months ago

There is nothing I dread more within the general context of software development, broadly, than trying to run other people's Python projects. Nothing. It's shocking that it has been so bad for so long.

hardwaregeek|4 months ago

Never underestimate cultural momentum I guess. NBA players shot long 2-pointers for decades before people realized 3 > 2. Doctors refused to wash their hands before doing procedures. There are so many things that seem obvious in retrospect but took a long time to become accepted.

tomaskafka|4 months ago

So many times I have come across a library or tool that would fix my problem, and then realized “oh crap, it’s in Python, I don’t want to spend a few hours building a brittle environment for it only for that env to break the next time I need it” - and went looking for a worse solution in a better language.

Multicomp|4 months ago

I agree with you wholeheartedly. Besides not preferring dynamic programming languages, I would in the past have given Python more of a look because of its low barrier to entry... but I have been repulsed by how horrific the development UX story has been, and how incredibly painful it is to then distribute the code in a portable-ish way.

UV is making me give Python a chance for the first time since a Ren'Py project I did for fun in 2015.

zelphirkalt|4 months ago

That's because many people don't pay attention to reproducibility of their developed software. If there is no lock file in a repo that nails the exact versions and checksums, then I already know it's likely gonna be a pain. That's shoddy work of course, but that doesn't stop people from not paying attention to reproducibility.

One could argue that this is one difference between npm and such, and what many people use in the Python ecosystem: npm, cargo, and so on automatically create lock files. Even people who don't understand why that is important might commit them to their repositories, while in the Python ecosystem people who don't understand it think that committing only a requirements.txt (without checksums) is OK.

However, it is wrong to claim that in the Python ecosystem we didn't have the tools to do it right. We did have them, and that well before uv. It took more care, though, which is apparently already too much for many people.
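(For what it's worth, a minimal sketch of that pre-uv workflow with pip-tools; the package names and file names here are just conventional examples:)

```shell
# requirements.in lists only your direct dependencies, e.g.:
#   requests
#   flask
# pip-compile resolves the full tree and pins exact versions with checksums:
pip-compile --generate-hashes requirements.in -o requirements.txt
# Installing with hash-checking mode refuses anything that doesn't match:
pip install --require-hashes -r requirements.txt
```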

acomjean|4 months ago

You aren’t kidding. Especially if it’s some bioinformatics software that is just hanging out there on GitHub older than a year…

lacker|4 months ago

The only thing I dreaded more was trying to run other people's C++ projects.

lynndotpy|4 months ago

I was into Python enough that I put it into my username but this is also my experience. I have had quasi-nightmares about just the bog of installing a Python project.

intalentive|4 months ago

I used to think this sentiment was exaggerated. Then I tried installing Dots OCR. What a nightmare, especially when NVIDIA drivers are involved.

the__alchemist|4 months ago

Same! And Python was my first language, and is currently my second-highest-skill one. If someone's software installation involves Python, I move on without trying. It used to be that such software would require a Python 2 interpreter.

Honorable mention: Compiling someone else's C code. Come on; C compiles to a binary; don't make the user compile.

luckydata|4 months ago

The python community was in profound denial for a very long time.

zippergz|4 months ago

I dread running my own Python projects if I haven't worked with them in a while.

jkercher|4 months ago

Couldn't agree more. I have a project at work from 2016 that builds multiple different HMIs (C++) along with 2 embedded systems (C). They all have to play nicely with each other as they share some structures and can all be updated in the field with a single file on a USB stick. So there is a bash script that builds everything from a fresh clone, makes update files, and some other niceties. Then, there is a single python script that generates a handful of tables from a json file.

Guess which part of the build I spent time fixing the other day... It wasn't the ~200,000 lines of C/C++ or the 1000+ line bash script. No. It was 100 lines of Python that were last touched two years ago. Python really doesn't work as a scripting language.

TheCondor|4 months ago

How about shipping one? Even just shipping some tools to internal users is a pain.

kristopolous|4 months ago

I really don't understand this. I find it really easy.

LtWorf|4 months ago

Just stick to what's in your linux distribution and you've got no problems.

dataflow|4 months ago

Not even trying to compile/build other people's C/C++ projects on *nix?

RobertoG|4 months ago

pfff... "other people's projects"... I was not even able to run my own projects until I started using Conda.

mk89|4 months ago

I have used

  pip freeze > requirements.txt
  pip install -r requirements.txt

way before "official" lockfiles existed.

Your requirements.txt becomes a lockfile, as long as you accept not using version ranges.

Having this in a single tool is nice, why not, but I don't understand the hype when it was basically already there.

icedchai|4 months ago

That works for simple cases. Now, update a transitive dependency used by more than one dependency. You might get lucky and it'll just work.

kstrauser|4 months ago

That works, more or less. But now you have a requirements.txt file with 300 dependencies. Which ones do you actually care about, and which are just transitive things that your top-level deps brought along for the ride? And a year later, when GitHub's Dependabot is telling you have a security vulnerability in some package you've never heard of, do you remember if you even care about that package in the first place, or if it's left over cruft from that time you experimented with aiohttp instead of httpx?

rtpg|4 months ago

As a “pip is mostly fine” person, we would direct the result to a new lock file, so you could still have your direct deps, and then pin transitives and update.

Pip's solver could still cause problems in general on changes.

uv having a better solver is nice. Being fast is also nice. Mainly, though, the fact that it feels like a tool that is maintained and can be improved upon without ripping one's hair out is a godsend.

pnt12|4 months ago

This is way less than what uv and other package managers do:

- dev dependencies (or other groups)

- distinguishing between direct and indirect dependencies (useful if you want to cut some fat from a project)

- dependencies with optional extra dependencies (if you remove the main one, it will delete the orphans when relevant)

It's not unachievable with pip and virtualenvs, but verbose and prone to human error.

Like C: if you're careful enough, it can be memory safe. But teams would rather rely on memory safe languages.
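(A short sketch of that workflow with uv; package names are illustrative, and uv records everything in pyproject.toml and uv.lock:)

```shell
uv add requests       # direct runtime dependency
uv add --dev pytest   # dev-only dependency group
uv tree               # view direct vs. transitive dependencies
uv remove requests    # transitives nothing else needs are dropped on the next sync
```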

2wrist|4 months ago

It also manages the runtime, so you can pin a specific runtime to a project. It is very useful and worth investigating.

tecoholic|4 months ago

I am in the same boat. I like uv for its speed, the other niceties it brings, and being a single tool to manage different things. But the lockfile is not that big a deal. I never got Poetry either. Tried it in a project once and the lockfile was a pain with merges. I didn't spend much time on it, so maybe I didn't understand the tool and workflow or whatever, but pip and pip-tools were just fine working with requirements.txt.

selcuka|4 months ago

The canonical way to do this with pip was using Constraints Files [1]. When you pollute your main requirements.txt it gets harder to see which package is an actual dependency of your project, and which ones are just sub-dependencies. Constraint files also let you not install a package if it's no longer a sub-dependency.

That being said, the uv experience is much nicer (also insanely fast).

[1] https://pip.pypa.io/en/stable/user_guide/#constraints-files
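(A minimal sketch of the constraints-file workflow; the file names are conventional, not mandated by pip:)

```shell
# requirements.txt: only the packages you directly depend on
# constraints.txt: exact pins for everything, sub-dependencies included
pip freeze > constraints.txt
# Constraints only pin versions; they never cause a package to be installed:
pip install -r requirements.txt -c constraints.txt
```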

12345hn6789|4 months ago

Oops, you forgot to source your venv, and now your env is messed up.

ghusto|4 months ago

I've never even understood the virtual env dogma. I can see how version conflicts _could_ happen, but they never have. Admittedly, I'm surprised I never have issues installing globally, especially since others keep telling me what a terrible idea it is and how they had nightmare-scenario-X happen to them.

FuckButtons|4 months ago

Honestly, this feels like the difference between CMake and cargo: sure, CMake does work and you can get it to do everything you need, you just need discipline, knowledge, and patience. On the other hand, you could just have a tool that does it all for you so you can get back to doing the actual work.

avidphantasm|4 months ago

I don’t get the hype either. Every time I’ve tried to use tools like pyenv or pipenv they fall down when I try to install anything that doesn’t provide wheels (GDAL), so I give up and stick to pip and virtualenv. Does uv let me install GDAL without hassle?

ifwinterco|4 months ago

It is indeed fairly simple to implement it, which is why it's so weird that it's never been implemented at a language level

epage|4 months ago

Good luck if you need cross-platform `requirements.txt` files.

chrisweekly|4 months ago

Webdev since 1998 here. Setting aside the Python vs JS/etc. debate to comment on npm per se: pnpm is better than npm in every way. Strongest possible recommendation to use it instead of npm; it's faster, more efficient, safer, and more deterministic. See https://pnpm.io/motivation

Ant59|4 months ago

I've gone all-in on Bun for many of the same reasons. Blazingly fast installs too.

https://bun.sh/

tracker1|4 months ago

Deno is pretty sweet too... shell scripts that don't need a package.json or a node_modules directory for dependencies.

nullbyte|4 months ago

I find pnpm annoying to type, that's why I don't use it

anp|4 months ago

Might be worth noting that npm didn’t have lock files for quite a long time, which is the era during which I formed my mental model of npm hell. The popularity of yarn (again importing bundler/cargo-isms) seems like maybe the main reason npm isn’t as bad as it used to be.

no_wizard|4 months ago

npm has evolved, slowly, but evolved, thanks to yarn and pnpm.

It even has some (I feel somewhat rudimentary) support for workspaces and isolated installs (what pnpm does).

WatchDog|4 months ago

Lock files are only needed because of version ranging.

Maven worked fine without semantic versioning and lock files.

Edit: Changed "semantic versioning" to "version ranging"

globular-toast|4 months ago

I've been using pip-tools for the best part of a decade. uv isn't the first time we got lock files. The main difference with uv is how it abstracts away the virtualenv and you run everything using `uv run` instead, like cargo. But you can still activate the virtualenv if you want. At that point the only difference is it's faster.
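(The day-to-day difference, sketched, assuming a standard uv project layout:)

```shell
uv sync        # create/update .venv and install the locked dependencies
uv run pytest  # run a command inside the project venv, no activation needed

# The venv is still a normal one if you'd rather activate it:
source .venv/bin/activate
```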

jrochkind1|4 months ago

Yeah, Python's tooling for dependency management was definitely not just fine; it was a disaster.

Coming from Ruby, though, I think uv has actually now surpassed bundler and the Ruby standard toolset for these things. It has definitely surpassed npm, which is also not fine. I couldn't speak for cargo.

icedchai|4 months ago

poetry gave us lock files and consistent installs for years. uv is much, much faster however.

beeb|4 months ago

I used poetry professionally for a couple of years and hit so many bugs, it was definitely not a smooth experience. Granted that was probably 3-4 years ago.

no_wizard|4 months ago

There was pipenv before that too, which also had a lockfile.

Funny how these things get forgotten by history. There's lots of prior art when it comes to replacing pip.

edit: here's an HN thread about pipenv, where many say the same things about it as they now say about uv, and said about Poetry before it: https://news.ycombinator.com/item?id=16302570

rcleveng|4 months ago

and pip-compile before that.

Agreed that uv is way, way faster than any of that, and really just a joy to use in its simplicity.

ShakataGaNai|4 months ago

I have to agree that there were a lot of good options, but uv's speed is what sets it apart.

Also the ability to have a single script that declares its deps in a TOML header, super easily.

Also, also: the ability to run a random Python tool in effectively seconds with no faffing about.

ForHackernews|4 months ago

To be fair, Poetry has done everything uv does for about a decade: lock files, integrated venv management, etc. uv is much faster, which is great.

silverwind|4 months ago

Yep, coming from poetry, uv is a pure speed increase with the same feature set.

odyssey7|4 months ago

Python might have been better at this but the community was struggling with the 2 vs 3 rift for years. Maybe new tooling will change it, but my personal opinion is that python does not scale very well beyond a homework assignment. That is its sweet spot: student-sized projects.

morshu9001|4 months ago

Imo the community should've rejected Python 3 and said, find a way to improve things without breaking everyone. JS managed to do it.

zelphirkalt|4 months ago

Tooling like npm, cargo, and others existed well before uv came up. I have used poetry years ago, and have had reproducible virtual environments for a long time. It's not like uv, at least in that regard, adds much. The biggest benefit I see so far, and that is also why I use it over poetry, is that it is fast. But the benefit of that is small, since usually one does not change the dependencies of a project that often, and when one does, one can also wait a few seconds longer.

RatchetWerks|4 months ago

I’ve been saying this for years! JS gets a lot of hate for dependency hell.

Why?

It’s almost too easy to add a dependency compared to writing your own functions.

Now compare that to adding a dependency to a C++ project.

gigatexal|4 months ago

The thing is, I never had issues with virtual environments -- uv just allows me to easily determine which version of Python a venv uses.

j2kun|4 months ago

you mean you can't just do `venv/bin/python --version`?

mbac32768|4 months ago

> that when Python devs finally get a taste of npm/cargo/bundler in their ecosystem, they freaking love it. Because yes, npm has its issues but lock files and consistent installs are amazing

I think it's more like Rust devs using Python and thinking what the fuck why isn't this more like rustup+cargo?

brightball|4 months ago

I tried Python for the first time after I’d been coding with multiple other languages for about 15 years.

The environment and dependency experience created so much friction compared to everything else. It changed my perspective on Docker for local dev.

Glad to hear it seems to finally be fixed.

doright|4 months ago

Why did it take this long? Why did so many prior solutions ultimately fall flat after years and years of attempts? Was Python package/environment management such a hard problem that only VC money could have fixed it?

morshu9001|4 months ago

It's not fixed quite yet because the default recommended way is still pip. And that's the same reason past attempts didn't work.

stavros|4 months ago

It didn't, though? Poetry was largely fine, it's just that uv is so much faster. I don't think uv is that much different from Poetry in the day-to-day dependency management, I'm sure there are some slight differences, but Poetry also brought all the modern stuff we expected out of a package manager.

Spivak|4 months ago

But you are just using virtualenv with pip. It doesn't change any of the moving pieces except that uv is virtualenv aware and will set up / use them transparently.

You've been able to have the exact same setup forever with pyenv and pyenv-virtualenv except with these nothing ever has to be prefixed. Look, uv is amazing and I would recommend it over everything else but Python devs have had this flow forever.

dragonwriter|4 months ago

> But you are just using virtualenv with pip.

No, you aren't.

> It doesn't change any of the moving pieces

It literally does. Though it maintains a mostly-parallel low-level interface, the implementation is replaced with an improved one (in speed, in dependency solving, and in other areas). You are using virtual environments (but not venv/virtualenv) and the same sources that pip uses (but not pip).

> You've been able to have the exact same setup forever with pyenv and pyenv-virtualenv except with these nothing ever has to be prefixed.

Yes, you can do a subset of what uv does with those without prefixes, and if you add pipx and hatch (though with hatch you’ll be prefixing for much the same reason as in uv) you’ll get closer to uv’s functionality.

> Look, uv is amazing and I would recommend it over everything else but Python devs have had this flow forever.

If you ignore the parts of the flow built around modern Python packaging standards like pyproject.toml, sure, pieces of the flow have been around and supported by the right constellation of other standard and nonstandard tools for a while.

tiltowait|4 months ago

I don't know, Poetry's existed for years, and people still use requirements.txt. Uv is great but isn't exactly unique in Python-land.

wraptile|4 months ago

Yeah I use poetry, uv and requirements.txt - all great tools for their respective niches.

temporallobe|4 months ago

Yep, working with bundler and npm for a decade plus has made me appreciate these tools more than you can know. I had just recently moved to Python for a project and was delighted to learn that Python had something similar, and indeed uv is more than just a package manager like bundler. It’s like bundler + rbenv/rvm.

And inspired by uv, we now have rv for RoR!

nateglims|4 months ago

Personally I never thought it was fine, but the solutions were all bad in some way that made direct venv and requirements files preferable. Poetry started to break this but I had issues with it. uv is the first one that actually feels good.

j45|4 months ago

I feel a little like this too.

My default answer to using Python in more ways than I did was no, because the tooling wasn't there for others to handle it, no matter how easy it was for me.

I feel uv will help python go even more mainstream.

tyingq|4 months ago

> but lock files and consistent installs are amazing

Yes, though poetry has lock files, and it didn't create the same positive feelings uv does :)

zamalek|4 months ago

I would dread cloning a Python project more than a C++ one, and that was the sole reason I made a real effort to avoid the language entirely.

zellyn|4 months ago

What weird shadow-universe do you inhabit where you found Python developers telling you the tooling was just fine? I thought everyone has agreed packaging was a trash fire since the turn of the century.

morshu9001|4 months ago

Hackernews and also the official Python maintainers

pydry|4 months ago

>finally get a taste of npm

good god no thank you.

>cargo

more like it.

internetter|4 months ago

cargo is better than npm, yes, but npm is better than pip (in my experience)

insane_dreamer|4 months ago

other than being much slower than uv, conda has worked great for years

I do prefer uv but it's not like sane python env management hasn't existed

ThinkBeat|4 months ago

There are severe problems with npm as well. It is not a model I hope gets replicated.

NaomiLehman|4 months ago

conda was great to me

bastawhiz|4 months ago

conda ruined my shell and never successfully worked for me. I guess YMMV

insane_dreamer|4 months ago

same here; I now prefer uv but conda served us very well, and allowed us to maintain stable reproducible environments; being able to have multiple environments for a given project is also sometimes handy vs a single pyproject.toml

WesolyKubeczek|4 months ago

I somehow had quite enough problems going from bundler 1.13 to 1.16 to 2.x some years ago. I’m glad we have killed that codebase with fire.

kevin_thibedeau|4 months ago

> you should just use virtualenv with pip

This is the most insulting take in the ongoing ruination of Python. You used to be able to avoid virtualenvs and have install scripts and dependencies directly runnable from any shell. Now you get endlessly chastised for trying to use Python as a general-purpose utility. Debian was a bastion of sanity with the split between dist-packages and site-packages, but that's ruined now too.

ElectricalUnion|4 months ago

Unless every Python dependency you ever used was available in your distro (and at that point you're no longer using pip, you're using dpkg...), this never worked well. What solves this well is PEP 723 and the tooling around it.

With PEP 723 and comfortable tooling (like uv), you now get scripts that are actually directly runnable, not "fake directly runnable, oops, forgot to apt-get install something" sorta-runnable, and they keep working reliably even when the stuff around them is updated.
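(A PEP 723 inline-metadata script looks roughly like this. The `rich` dependency is just an example, and the import fallback is only there so the sketch also runs outside uv:)

```python
# /// script
# requires-python = ">=3.9"
# dependencies = [
#     "rich",  # example dependency; uv installs it into an isolated env
# ]
# ///

# Run as `uv run hello.py` (file name is arbitrary): uv reads the comment
# header above, creates a throwaway virtualenv, installs the listed
# dependencies, and then executes the script.
greeting = "hello from a self-contained script"
try:
    from rich import print  # provided by uv per the metadata block
except ImportError:
    pass  # fall back to the built-in print when run without uv

print(greeting)
```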

zahlman|4 months ago

> You used to be able to avoid virtualenvs and install scripts and dependencies directly runnable from any shell.

This wasn't really the case; in principle anything you installed in the system Python environment, even "at user level", had the potential to pollute that environment and thus interfere with system tools written in Python. And if you did install it at system level, that became files within the environment your system package manager is managing, that it doesn't know how to deal with, because they didn't come from a system package.

But it's worse now because of how many system tools are written in Python — i.e., a mark of Python's success.

Notably, these tools commonly include the system package manager itself. Since you mentioned Debian (actually this is Mint, but ya know):

  $ file `which apt`
  /usr/local/bin/apt: Python script, ASCII text executable
> Now you get endlessly chastised for trying to use Python as a general purpose utility.

No, you don't. Nothing prevents you from running scripts with the system Python that make use of system-provided libraries (including ones that you install later with the system package manager).

If you need something that isn't packaged by your distro, then of course you shouldn't expect your distro to be able to help with it, and of course you should expect to use an environment isolated from the distro's environment. In Python, virtual environments are the method of isolation. All reasonable tooling uses them, including uv.

> Debian was a bastion of sanity with the split between dist_packages and site_packages but that's ruined now too.

It's not "ruined". If you choose to install the system package for pip and to use it with --break-system-packages, the consequences are on you, but you get the legacy behaviour back. And the system packages still put files separately in dist-packages. It's just that... doing this doesn't actually solve all the problems, fundamentally because of how the Python import system works.

whywhywhywhy|4 months ago

> Python as a general purpose utility

This ideology is what caused all the problems to begin with: base Python is built as if it's the only thing in the entire operating system's environment, while its entire packaging system is built in a way that makes that impossible without manually juggling package conflicts/incompatibilities.

whalesalad|4 months ago

It's because so many essential system tools now rely on Python, and if you install arbitrary code outside of a venv it can clobber the global namespace and break the core OS's guarantees.

I do agree it is annoying, and what they need to do is just provide an automatic "userspace" virtualenv for anything a user installs themselves... but that is a pandora's box tbh. (Do you do it per user? How does the user become aware of this?)

1718627440|4 months ago

This is very true! I was highly surprised when I installed Python from source and found out that this entire problem has been fixed for decades. You can have different Python versions in the same prefix just fine; you just pick a default one to install with `make install` and install all the others with `make altinstall`.
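(A sketch of that from-source flow; the version number is just an example:)

```shell
# Build CPython from source without clobbering the system default:
./configure --prefix=/usr/local
make
sudo make altinstall   # installs only versioned names, e.g. python3.12
# `make install` would additionally create the unversioned python3 links,
# so reserve it for the one version you want as the default.
```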