(no title)
TheFlyingFish | 3 months ago
I don't think languages should try to include _everything_ in their stdlib, and indeed trying to do so tends to result in a lot of legacy cruft clogging up the stdlib. But I think there's a sweet spot between having a _very narrow_ stdlib, where you end up depending on 160 different 3rd-party packages just to make an HTTP request, and having a stdlib with 10 different ways of doing everything because it took a bunch of tries to get it right (cf. PHP and hacks like `mysql_real_escape_string`).
Maybe Python also has a historical advantage here. Since the Internet was still pretty nascent when Python got its start, pulling third-party code off the net wasn't yet the default solution whenever you hit a well-known problem (I imagine, at least; I was barely alive at that point). So Python could afford to wait and see what would actually make good additions to the stdlib before implementing them.
Compare to Rust, which _immediately_ had to run gauntlets like "what to do about async", with thousands of people clamoring for a solution _right now_ because they wanted to do async Rust. I can definitely sympathize with Rust's leadership wanting to do the absolute minimum required for async support while they waited for the paradigm to stabilize. And even so, they still get a lot of flak for the design being rushed, e.g. with `Pin`.
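To make that "absolute minimum" concrete: std ships the `Future` trait, `Pin`, and the `Waker` machinery, but no executor. Here's a rough sketch, using only the standard library, of the kind of toy `block_on` you otherwise get from a third-party runtime (the busy-polling loop is purely illustrative; real executors park the thread and wake it properly):

```rust
use std::future::Future;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};
use std::thread;

// A waker that does nothing: fine for this busy-polling toy executor,
// since we never park the thread waiting to be woken.
struct NoopWaker;

impl Wake for NoopWaker {
    fn wake(self: Arc<Self>) {}
}

// Poll a future to completion on the current thread.
fn block_on<F: Future>(future: F) -> F::Output {
    // Box::pin gives us a pinned future we can poll repeatedly.
    let mut future = Box::pin(future);
    let waker = Waker::from(Arc::new(NoopWaker));
    let mut cx = Context::from_waker(&waker);
    loop {
        match future.as_mut().poll(&mut cx) {
            Poll::Ready(value) => return value,
            // A real executor would park here until woken; we just spin politely.
            Poll::Pending => thread::yield_now(),
        }
    }
}

async fn hello() -> &'static str {
    "hello from a future, no external runtime involved"
}

fn main() {
    println!("{}", block_on(hello()));
}
```

Everything beyond this (timers, async I/O, task spawning) is where the third-party runtimes come in.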
So it's obviously a difficult balance to strike, and maybe the solution isn't as simple as "do more in the stdlib". But I'd be curious to see it tried, at least.
Chris_Newton | 3 months ago
Personally, I'd like to see a tiered ecosystem. At the top, you have the true standard library for the language. This has very strong stability guarantees. Its purpose is twofold: to provide universal implementations of essentials and to define standard/baseline interfaces for common needs like abstract data types, relational databases, networking and filesystems to encourage compatibility and portability.
Next, you have a tier of recognised but not yet fully standardised libraries. These might be contributed by third parties, but they have requirements for identifying maintainers, appropriate licensing and mandatory peer review of all contributions. They have a clear versioning policy and can make breaking changes in new major releases, but they also provide some stability guarantees along the lines of semver, and older releases normally remain available indefinitely. The purpose of this tier is to provide a wider range of functionality and/or alternative implementations, but in a relatively stable way, implementing standard interfaces where applicable to improve portability.
Finally, you have the free-for-all, anyone-can-contribute tier. This should still have a sane security model where people can’t upload malware scripts that run automatically just because someone installed a package. However, it comes with few guarantees about stability or compatibility, except that releases of published packages remain available indefinitely unless there’s a very good reason to pull them (in which case you obviously wouldn’t want to use them anyway). A package you like might be written by a single contributor who no longer maintains it, but if someone does write something useful that simply doesn’t need any further maintenance once it’s finished and does its job, there is still a place to share it.
gr4vityWall | 3 months ago
Debian also has something 'in the middle' with additional repositories that aren't part of the main distribution and/or contain proprietary software.
auxiliarymoose | 3 months ago
That Rust does not have standard implementations of commonly used features (such as an async runtime) is problematic for supply-chain security, since everyone ends up pulling in dozens (or hundreds) of fragmented 3rd-party packages instead of working with a bulletproof standard library.
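As a rough illustration (a hypothetical minimal project, assuming tokio as the runtime, which is the common choice): even a trivial async program starts from a third-party dependency rather than the standard library, and that dependency brings its own transitive tree with it.

```rust
// Cargo.toml (hypothetical minimal project):
//
// [dependencies]
// tokio = { version = "1", features = ["full"] }
//
// std provides async/await syntax and the Future trait, but the executor,
// the timer and any async I/O below all come from the third-party crate.

#[tokio::main]
async fn main() {
    // There is no std counterpart for an async sleep; it needs a runtime's timer.
    tokio::time::sleep(std::time::Duration::from_millis(100)).await;
    println!("done");
}
```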
ghurtado | 3 months ago
PHP is a fantastic resource to learn how to do proper backward compatibility and package management. By doing the exact opposite of whatever PHP does, mostly.