bouncing|1 year ago
All three of those have declined. It's less readable than it used to be, it's definitely more complicated (not just complex, complicated), and the standard library is declining rapidly in relevance as it ages.
And it wasn't just Guido. Tim was a big advocate for all three of those super powers when he was more influential. They banned Tim and they censored Guido, so go figure.
zahlman|1 year ago
I find it much more readable, and more importantly more expressive. Certain new features are missteps IMO, but I just don't use them. But more importantly, the language has been moving away from cryptic %-encodings and other C idioms.
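Concretely, the move away from C-style %-formatting is a good example (a minimal sketch; the names are made up):

```python
name, score = "Ada", 97.5

# Old C-style %-formatting: printf-style specifiers you have to decode
old = "%s scored %05.1f%%" % (name, score)

# Modern f-string: the expression reads inline, format spec and all
new = f"{name} scored {score:05.1f}%"

print(old)  # Ada scored 097.5%
print(new)  # Ada scored 097.5%
```

Same output, but one of them you can read without a trip to the printf man page.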
As for the standard library, that was already happening for a long time, and is inevitable. The world has fundamentally changed. In Python's heyday it was much harder to download and install and use a third-party library, so a rich standard library was an asset. Now it's full of specialized code that handles obscure and increasingly irrelevant data formats; multiple overlapping hacks for binary data; terrible and confusing date support; awkward interfaces that haven't stood the test of time (particularly all the networking stuff; Requests is one of the most downloaded PyPI packages, along with its dependencies which are probably almost never downloaded for any other reason); etc.
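The "overlapping hacks for binary data" point is easy to demonstrate; the stdlib ships at least three ways to hex-encode the same bytes (a quick sketch):

```python
import base64
import binascii
import struct

data = b"\xde\xad\xbe\xef"

# Three overlapping stdlib spellings of "hex-encode these bytes":
a = data.hex()                               # bytes method, since 3.5
b = binascii.hexlify(data).decode()          # the old binascii way
c = base64.b16encode(data).decode().lower()  # yes, base64 does hex too

assert a == b == c == "deadbeef"

# struct is yet another corner of the binary-data toolbox
(value,) = struct.unpack(">I", data)
print(hex(value))  # 0xdeadbeef
```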
Lots of people still seem to think that the 2->3 migration was a mistake. They couldn't be more wrong. The old way of handling "strings" was abysmal, and spit in the face of the Zen. Error messages were confusing and implicit conversions abounded.
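The difference is easy to show (a minimal sketch of the Python 3 behaviour):

```python
# Python 2 silently coerced between byte strings and text, which
# papered over encoding bugs. Python 3 makes the boundary explicit:
text = "café"
raw = text.encode("utf-8")  # explicit conversion to bytes

try:
    text + raw              # mixing str and bytes is now a TypeError
except TypeError as e:
    print("refused:", e)

# Round-tripping has to go through an explicit codec
assert raw.decode("utf-8") == text
```

No implicit ASCII decode sneaking in, no UnicodeDecodeError three modules away from the actual bug.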
Also, just for the record: Guido van Rossum was in favour of the walrus operator. In fact, he co-authored the PEP (https://peps.python.org/pep-0572/), along with Tim Peters.
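The PEP's motivating pattern is binding and testing in one expression (sketch; the strings are made up):

```python
import re

# Without the walrus you either call re.match twice or split this
# across an extra assignment line.
line = "error: disk full"
if (m := re.match(r"error: (.+)", line)):
    print(m.group(1))  # disk full

# Also handy in comprehensions, to avoid recomputing a value
words = ["hi", "hello", "hey"]
long_ones = [w for w in words if (n := len(w)) > 2 and n < 6]
print(long_ones)  # ['hello', 'hey']
```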
bouncing|1 year ago
Type hints are a sore spot for me. They're good enough when you just don't remember whether an argument is an object or a string, for example, but once you start type hinting deep into data structures, your hints become a mangled soup of nonsense for basically no real benefit. Typing errors are rare (maybe once a year in most projects), yet we clutter our codebases with verbosity to satisfy type checkers instead of prioritizing clarity for developers.
There's a lot that's just straight up redundant. Dicts are ordered now, but is OrderedDict deprecated? No, because it's just slightly different in weird and mostly unimportant ways. `frozenset` is a builtin, for all 3 programmers worldwide who use it. Python resisted match/case syntax for decades, but when it finally arrived, it did so in a way that's anything but standard. Good luck figuring it out without consulting the documentation.
Obviously some improvements are real. Every new version of Python brings valuable enhancements. But the old pitch for Python -- pseudocode that runs -- just isn't true anymore. The simplicity has slipped away and will never come back.
And the standard library? A very real problem, right now, in computer security is the software supply chain. Remember polyfill from like, yesterday? This is the era when we should double down on having a million dependencies from all over GitHub, from unknown developers with no commitment, because ... npm's hellscape is a model to follow?
I would argue the contrary. There's dependency hell, of course, but there's also dependency risk. If you were evaluating a product right now, and you saw its lockfile depended only on a specific version of the Python Standard Library, that gives you exactly 1 product to evaluate, exactly 1 team of developers to depend on. pip is great and all, but dependency resolvers have quietly let in a hundred trojan horses and a thousand unmaintained dependencies into tons of projects, and no one noticed it was even happening.
Python in 2005, when everyone depended on the standard library, was a safer place than npm is today.