I wish people could be honest and say they don't care for some language or framework or OS for personal or aesthetic reasons, rather than having to round it up to being objectively bad, but then I suppose nobody probably would click on "I don't like Python and have got some nits to pick".
Oh, and at the end he just says the quiet part out loud:
>And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living. I know it sounds harsh and I apologize for that, but writing software is a profession and hence you should have the knowledge, skill, and experience to use professional tools.
Hear that all data scientists, flask devs, systems engineers, and ML folks? Python is bad so you should quit. ;)
I see this sort of person as an extremely rigid, unbending, dime-a-dozen type, even if they’re very intelligent. I’m in this business to succeed and build and create things; there is very much an “energetic” aspect, and his energy is dead as fuck. It’s a very simple fact that some see it and some don’t. He’s one who doesn’t, and all those who see it can see it so clearly. He’s most definitely not someone I would want on my team.
The amount of idiotic … implications of his statement is so excruciating it’s physically painful to me. But I encounter this unadaptive unawareness all the time when working with programmers from other teams, etc., so I’m used to it.
> I wish people could be honest and say they don’t care for some language or framework or OS for personal or aesthetic reasons, rather than having to round it up to being objectively bad, but then I suppose nobody probably would click on “I don’t like Python and have got some nits to pick”.
Yeah, this kind of hyperbolic headline article repeating mostly dead-horse arguments is just low-effort click farming, not a serious contribution to the discussion of anything. All it adds to what is a well-trodden debate is…factual errors, like the claim that Python prominently features “lazy evaluation” (while, as in any language, you can arrange laziness in Python, it is very much eager normally.)
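To make the eager-by-default point concrete, here is a minimal sketch (the effect/ignore names are purely illustrative):

```python
calls = []

def effect():
    # records that it ran, so we can observe when evaluation happens
    calls.append(1)
    return 1

def ignore(x):
    return 0

# Eager by default: the argument is evaluated before the call,
# even though ignore() never uses it.
ignore(effect())
print(len(calls))  # → 1

# Laziness is opt-in, e.g. via a generator expression:
gen = (effect() for _ in range(3))
print(len(calls))  # → 1 (nothing has run yet)

next(gen)
print(len(calls))  # → 2 (evaluated on demand)
```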
I really think that Python is not a good language for ML, it just got "there" first.
The ecosystem is the real plus, of course. But the language is a headache for this. I agree with the "false economy" angle. I would happily trade the "agility" of dynamic "gluing" for some kind of real type safety, human-readable error messages and performance[0].
[0] - hiding C in Python's clothes doesn't count :)
> And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living. I know it sounds harsh and I apologize for that, but writing software is a profession and hence you should have the knowledge, skill, and experience to use professional tools.
Part of the profession of software engineering is maintaining software that's already written. Should the people who maintain Python code not be paid for their work?
Another part is choosing the right tool for the job. Python has its flaws, but it is better than Go in some ways. For example, it has a richer ecosystem of libraries.
Why hasn’t the Go community of professional software engineers built an even richer ecosystem of libraries?
Is it ennui, incompetence, or attitude?
As Go came from Google, is it that the attitude was, “I am a professional, I’ll just write my own code to solve X”, rather than considering building a library that others can use?
Are libraries harder to build in Go? Is adoption of libraries by the Go community different?
Is it a mindset that libraries are uninteresting? Or is it something else entirely?
>Besides, linters and type hinting have come a long way.
You acknowledge that the kinds of static analysis that are feasible in Python are valuable, but it's "terrible" to want the kinds of static analysis that are infeasible. How interesting that the two boundaries line up exactly.
Still incredibly horrible in many cases. And very slow compared to things that have been around for decades. Do some OCaml and you'll see how incredibly bad everything 'linted' is. You can compile millions of lines in seconds in OCaml, or Delphi on a '90s computer, or Jai (I don't find Blow sympathetic, but he does point out how lame everything is, and that's good). But when my 50k-line Python or TypeScript project starts linting during a yarn build, I have time for a good workout.
OCaml is an interesting example imho, as the type inference engine is so good you hardly have to specify types. When you read it and squint, it looks dynamic. It's not.
There's an evaporative cooling effect. Ten years ago it was obvious that Python was going to have a very hard time in the multicore world. People who needed that performance and knew they needed that performance left somewhere in the intervening years. It has been obvious for a while that Python was not going to be capable of a general solution to that problem no matter what it did. Now those people are no longer in the Python community.
What's left are people who don't need that performance, which is sometimes me and is when I still am happy to use Python for something, and people who do need that performance, but don't know it. Those are the ones who get into trouble.
I do wish that the Python developer community would be more open about the fact that Python performance is really quite bad in many cases and that while there are some tools to peck around the edges, it is still fundamentally a low performance language, things like NumPy notwithstanding (it is, ultimately, still a "peck around the edges" tool, even if the particular edge it pecks it does extremely well, but that only makes the performance delta worse when you fall out of its accelerated code path). I feel like maybe back in the 200Xs the community as a whole was more open to that. Now "Python is slow" is perceived by the community as an attack. Maybe because the ones who did understand what that issue was are mostly gone.
But in the 2020s, yes, Python ought to be straight-up eliminated from many tasks people want to do today on straight-up performance grounds. Overcoming a ~40x disadvantage in the single core era was sometimes tough, but often doable. But nowadays basically that gets multiplied by each core on the system, and overcoming a multi-hundred-factor disadvantage is just not worth your time IF that is a level of performance you need. Python is nice, but not nice enough to pay for multiple-hundred-factor performance penalties. Static languages have come a long way since 1995.
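For what it's worth, the standard workaround for CPU-bound work in Python is process-based parallelism, which sidesteps the GIL at the cost of extra memory and pickling overhead; a minimal sketch (the function and numbers are illustrative):

```python
from multiprocessing import Pool

def cpu_bound(n: int) -> int:
    # Pure-Python arithmetic: threads can't parallelize this under the GIL,
    # but separate processes can, at the cost of serialization overhead and
    # one interpreter per worker.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(cpu_bound, [100_000] * 4)
    print(all(r == cpu_bound(100_000) for r in results))  # → True
```

This recovers core count for embarrassingly parallel work, but it doesn't help with shared mutable state, which is the "general solution" the comment says never arrived.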
Python is one of my least favorite languages and I avoid it wherever I can. I agree with several of the criticisms here, but I disagree with this part:
> The problem with Python is of course that it is an interpreted language with lazy evaluation
That isn't "the problem" with Python. There's nothing wrong with these sorts of languages. However, it does limit the sorts of problems the language is suited for, and I do see places where Python was used where another language would have produced far better results.
Perhaps using Python inappropriately leads to some thinking that the fault is with the language?
I know it has functions that are lazy, but it's not lazy in the sense that Haskell is, right? I never use it as I find it a ghastly horror show (my taste; other people like it, and that's fine), but I had to use it a few times and found that (as the article also notes) some parts are lazy, but not Python as a language. Is that not correct?
> it does limit the sorts of problems the language is suited for,
Interpreted (...) is a property of the implementation; there is no reason why it has to be interpreted.
Yes, pretty much agree with this word for word. It is very, very difficult to refactor a Python application in any sort of reliable way. The standard way of error handling in Python appears to be to present the user with a stack trace. Very user friendly (not!). Now people will say that, for instance, mypy can help with this. That is true, but since projects can be started without type checking, chances are that your project was started without it, that introducing mypy is somewhere on the backlog, and that when it comes off the backlog it will be enabled only partially, because otherwise there would be too many errors. And so on. It is such a garbage programming environment.
I couldn’t (and still can’t) believe that non-exceptional situations are treated as exceptional. For example, there is no way to parse a string into a number and get back a value indicating whether it succeeded. With Python, everything is an exception. Messy and inelegant. Of course this mess is called “pythonic”, so everything is fine…
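A typical workaround is to wrap the exception once in a small helper; try_parse_int below is a hypothetical utility, not part of the standard library:

```python
from typing import Optional

def try_parse_int(s: str) -> Optional[int]:
    # int() raises ValueError on bad input; translate that into a None result
    try:
        return int(s)
    except ValueError:
        return None

print(try_parse_int("42"))    # → 42
print(try_parse_int("42.5"))  # → None
print(try_parse_int("abc"))   # → None
```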
A few weeks ago I was working on a small wildfire smoke and fire perimeter API, and I hit a few annoying snags due to tooling issues. I needed to process a lot of different formats of layered geographic datasets, and converting one thing to another, processing the data into various buckets, cleaning, aggregating, etc. all wound up being extremely cumbersome and verbose.
I write a lot of Go and I’m used to that. But when I hit snags in the less familiar territory of geographic data processing, it was a slog. Terrible documentation for the libraries was a major barrier, and otherwise it seemed as though essential features simply didn’t exist and I’d have to invent them.
I got the idea to explore Python for the project because people use it for data processing. I’ve used it in the past, though never for this. Whatever, I thought, at the very least I can validate that Go is a suitable tool.
Within a day I had rebuilt everything with Python. I built a flask app around the forecasting and fire perimeter tools, and had it deployed the same evening. It was mind blowing.
As an ecosystem I was absolutely blown away by Python. Do I like the language? Not really. I encountered so many cases where something could be so much faster and more efficient with Go. Deployment would be easier. I’d get more API from the same resources. Scaling would be ten times easier. Static typing tools kept blowing up my IDE because this library doesn’t support this, or the type tool is wrong about that. It was very janky at times.
Yet Python got it done. It’s live right now, and development is steady. Go was not steady and I didn’t see any solution to that in sight without reinventing countless wheels.
By the way, another nice thing about python is that you can force your program to drop into a debugger with an interactive shell at runtime- and inspect/print objects. In today's complex world of generic types, it can often be hard to see exactly which method is handling your data.
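Since Python 3.7 this is the built-in breakpoint() function; a rough sketch, with a hypothetical handler and an environment-variable gate so normal runs aren't interrupted:

```python
import os

def handle(data):
    # When PYDEBUG is set, drop into pdb right here; in the (pdb) shell you
    # can print(data), inspect type(data), and step into whichever method
    # actually ends up handling it.
    if os.environ.get("PYDEBUG"):
        breakpoint()  # honors PYTHONBREAKPOINT; defaults to pdb.set_trace()
    return type(data).__name__

print(handle([1, 2, 3]))  # → list
```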
My favourite things about Python are the huge community and the rapid iteration. Don't get me wrong, I like Go, too. It was the primary language I worked in at my previous job. But sometimes you just need the big community and the huge pool of documentation, or you want something to work now instead of fiddling with it, and Python is great for that feedback loop.
My experience with Python can be summed up as: it's tempting to start something with it, since it has such low (initial) friction and hey, "this is just a small throwaway project anyway".
Months or years later, it's a beast, hard to understand or refactor, full of holes and pitfalls, and Python's terrible tooling doesn't help either.
And I never learn the lesson!
The author of the post seems like an evangelist for the Go programming language.
> And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living.
What's funny is that (in a different Python rant from another author), it was pointed out that Google was the heaviest pusher of Python in the early 2000s. It probably would have been Java (from Android and elsewhere) had it not been for all the legal stuff with Oracle brewing. So, Python here, Python there, Python everywhere... then Google invented Go and Dart and other shiny new toys and began pushing them everywhere.
> And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living.
Come on, man. There's being opinionated, and then there's this.
I’m not a fan of Python, but this article is insufferable in tone. If Python doesn’t meet your needs, don’t use it. There’s plenty of oddities to gripe about in Python, but this article doesn’t attempt to learn something or make a larger point beyond “my favored approach is the only valid approach”. Sorry, but that’s not an opinion worth considering.
As much as I personally dislike Python, I completely agree with this, but want to flip it. If Python meets your needs, absolutely use it. Don't complain about how Visual BASIC doesn't work for you on your way to downloading whatever version of Python is current this month.
No enforced static typing and no proper debugger make Python painful for large code bases. It’s good for scripts, prototyping, and gluing libraries together to make utilities, but if something expands beyond a single file I stop wanting to use Python. Convincing my employer of this is another matter and why I would rather avoid it completely.
And now, of course, it's deprecated because you wrote it in a version of Python that's no longer in support or used a library that hasn't been (or maybe won't be) ported to 3.11.
> I once worked with a service in Python that forked worker processes to handle requests, ensuring that all cores could be used.
> > Unfortunately these workers ran out of memory quickly so we decided to have workers terminate themselves after a configurable number of requests had been handled so that Linux would do our memory management for us.
I once worked with a service in Python that was essentially an image server for a special biomedical image format. The decoder was a proprietary C library written by a vendor and, unsurprisingly, had tons of memory leak issues. This exact feature of Gunicorn [0] saved us months of going back and forth with the vendor to get the library to stop killing our servers.
Python has its flaws, but so does anything touched by humans.
[0] https://docs.gunicorn.org/en/stable/settings.html#max-reques...
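For context, the Gunicorn feature in question is worker recycling via max_requests; a minimal illustrative config (the specific numbers are arbitrary, though the setting names are real Gunicorn knobs):

```python
# gunicorn.conf.py — worker recycling as a backstop against leaks
workers = 4
max_requests = 1000        # restart a worker after this many requests
max_requests_jitter = 100  # stagger restarts so workers don't all die at once
```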
Really for all the complaints about JS/NPM/Electron it looks absolutely genius next to Python tooling and PyInstall.
It's extremely frustrating that you're forced into using it to access technology that doesn't even really use Python; it's just the glue sticking the native C++ or GPGPU code blobs together.
> Using Python for a large application is like building a nuclear reactor using Lego bricks
I like this comparison. Anyway, it's interesting to note that it took the author "many years of experience running large applications written in Python" to come to his conclusions. The advantages of static typing and the disadvantages of dynamic or duck typing have been well known for decades. The problem is less Python as a language than the fly-by-night decisions to just use it for anything. To stick with the example: what prevents people from using "Lego bricks" (or a high-temperature-proof version thereof) to build a reactor? Sound engineering decisions and, most importantly, safety regulations.
File this one under "things that UNIX systems programmers think will work in principle but end up being massive black-holes of attempting to quiesce any non-trivial application in a way that results in a sensible post-fork state".
Python is a horrible language, but not for the reasons the author gives. Just because range() returns a lazy sequence doesn't mean the whole language is lazy. Several Lisps allow something like duck typing and they're not horrible. It is possible to reason about program behaviour in dynamic languages, but JavaScript certainly makes it hard.
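A nit on the nit: range() actually returns a lazy sequence object, not a generator, which is easy to check:

```python
r = range(5)

# Lazy: no list of five ints is materialized. But unlike a generator,
# it's a proper sequence with len(), indexing, and repeated iteration:
print(len(r))              # → 5
print(r[3])                # → 3
print(list(r) == list(r))  # → True

g = (x for x in range(5))
list(g)                    # a generator is one-shot...
print(list(g))             # → [] (...already exhausted)
```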
Python is a horrible language because it is not a language. It is a family of languages that changes every October. Sure, 3.x doesn't introduce as many backwards-incompatible changes per version as 2.x did, but that's like saying World War I wasn't horrible because towards the end fewer people seemed to be dying every week.
I've been programming in Python for most of the past 10 years, and I've never experienced a regression in backwards-compatibility between minor releases. What problems have you had?
Until it introduced the haphazard type system. Now I need to import types in every file, use IF to guard it in CI in every file, and use a powerful IDE to be able to use the benefits of typing.
You don't need to do anything; you can ignore all type hints.
> use IF to guard it in CI in every file
Are you talking about "if TYPE_CHECKING:"?
Your other option is to put "from __future__ import annotations" at the top of the file, or wait for Python 3.13 when PEP 649 lands and type annotations become lazily evaluated.
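Roughly, the two patterns look like this (Decimal stands in for whatever expensive or circular import you're guarding):

```python
from __future__ import annotations   # annotations stored as strings, not evaluated
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # seen by mypy/pyright only; never imported at runtime
    from decimal import Decimal

def total(prices: list[Decimal]) -> Decimal:
    # the annotation above is never evaluated at runtime, so it's fine
    # that Decimal isn't actually bound here
    return sum(prices)
```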
Typing brings me no happiness either, because it's a lot of work without being complete anyway. Annotations that are not checked against actual runtime behaviour can always, by the laws of programming, be subtly incorrect.
I still use python. The recently introduced match statement is a great addition, IMO.
Can someone explain this part to me, please?
I don't follow what's going on.
> Python's use of reference counting defeated copy-on-write because even memory blocks holding variables that were read-only were actually written to in order to manipulate the reference counts, thereby blowing up the combined physical memory footprint of the workers. We solved this by smurfing the interpreter to use a magic reference count number for all variables that were created by the master process and inherited by the workers, and then not touching reference counts that had the magic value.
You have a program that for whatever reason (the Python runtime in this case) only works single-threaded, although its workload could be easily parallelized (say, it’s a web server where requests are processed independently). An old established way to accomplish this is to start a “master” process which forks N “worker” processes, each of which can happily run single-threaded.
This would be a nonstarter if it required N+1 times the memory of the single process, so the OS uses an optimization called copy-on-write. When a process forks, all its physical memory is shared by the new process so it takes almost no new memory to start. If the new process writes to a memory page, that physical page is copied so it has its own version. (Thus “copy on write”.)
For most programs this works fine, but if you have a runtime that does garbage collection using a technique that requires writing to an object even if the code doesn’t change any of its values, trouble ensues. With reference counting, you have to write a new reference count for an object anytime a pointer to the object is assigned. If you store the reference count in the object, that means its physical page has to be copied. So now the CoW optimization totally doesn’t work, because just referencing an object causes it to take up additional new memory.
Ruby used to have this same problem, and after Ruby webservers became popular (hello Rails) they eventually incorporated a patch to move the GC information somewhere outside the actual object heap. Other systems like the JVM use similar techniques to store the bookkeeping bits somewhere other than the object field bits.
So what the OP did is patch the runtime so the objects created in the master process (pre-forking) have special reference counts that are never altered. This mostly works, because the master process generally does a bunch of setup so its objects were mostly not going to be garbage anyway.
I don't understand the "smurfing" solution he references, but CPython's runtime uses reference counts in each referenced value to detect garbage (when a value can be freed), which means even read-only values can be modified in memory by the runtime as object references come and go.
Those modifications force pages which were created on forking a child process as copy-on-write (meaning they share the same physical page until the page is modified by the child) to be copied and thus blow out any memory savings that would normally happen with copy-on-write.
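Assuming CPython, you can watch the refcount write happen with sys.getrefcount; note that CPython later added gc.freeze() (3.7) and immortal objects (PEP 683, 3.12) partly to mitigate exactly this fork/copy-on-write problem:

```python
import sys

x = ["some", "shared", "data"]

before = sys.getrefcount(x)
y = x                  # plain assignment: no element is copied...
after = sys.getrefcount(x)

# ...but the refcount stored inside the object's header was written,
# which is what dirties the page and defeats copy-on-write after fork.
print(after - before)  # → 1
```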
> because the value of a good programming language is that it will not allow you to write programs that are structurally deficient.
Ummm... okay.
I'm not going to cheerlead for Python here (in fact I do not like it at all and also avoid it whenever possible) but many of this author's points seem to boil down to "screwdrivers are bad, here's why you should always use hammers instead".
Python is a gravity well attracting inexperienced programmers. And very often (in my experience), it shows.
Lack of static typing is nothing in comparison with lack of common sense and unwillingness to learn ("why force oneself? The job market swallows everything anyway").
Well said!
Another way of putting it: Python isn't a production-ready language, due to the way people are using it.
Whenever some project I find doesn't just work, or works and then a few weeks later stops working, it always seems to be Python. People cannot semver properly (even Python itself? Or when some programs work on 3.9 and not on 3.10, is that the programmers' fault again?) and also cannot lock their dependency versions properly. It's the same problem that can happen to Node.js code, and yet I rarely see such basic failures there.
I also just don't understand why anyone would even want to use python, anyway.
I've tried to debug Python code, and write new Python code, but I could never get into it, nor was it easy to read code others had written. Whitespace sensitivity? Special __ files for package purposes? No JIT (no idea why PyPy is a separate thing rather than rolled in)? I just don't see the advantage over JS, which seems to do everything better. It even has the benefit of being syntactically intuitive for users of other languages, and TypeScript really has saved JS from chaos. It's fine to be different from C syntactically, but I don't see the benefit of Python to justify that syntax taking up another spot in my brain I could use for learning Rust or something.
> Lack of static typing is nothing in comparison with lack of common sense and unwillingness to learn
I have never found any language community to lack that, and if anything it's more common where people have lots of experience exclusively in one language than with inexperienced programmers picking up their first (who tend to, by nature, have a willingness to learn, even if they lack the common base of experience that gets labelled “common sense”).
Everything is terrible if you use it long enough. Some things are more terrible than others for certain use cases - a thoughtful developer understands the weak points of the tooling, and selects the proper tool for the job at hand.
The problem with Python is that it introduced a whole bunch of people into the programming world who have minimal idea about programming or computers (mostly data scientists), just like how computer science introduced a whole bunch of people into the “engineering” world even though they know little about computers at a low level, let alone at the OS level. That's why several SaaS products like Heroku tried to close that gap, because those “engineers” can't handle an OS!
It's the same old, tired rant about how Python's terrible.
The post is basically a surgeon complaining that a chef's knife is terrible for surgery; or how the F-16 can't loiter in the battlefield and deliver strafing fire; or how carbon fibre tubes don't handle compressive strength under water; or using C to do string manipulation and complex regex.
This {{insert_programming_tool_here}} has not worked well for me and the projects I’ve worked on, so nobody should use it. And if you do use it, you are not a real programmer and you should be ashamed. Because only someone who uses {{insert_programming_tools_i_like}} can call themselves a real programmer. That's the recipe for these articles.
Python was already the most popular language for carbon-based intelligence, but now it's also becoming the one and only language for silicon-based intelligences.
The future is artificial intelligence programming Python and human programmers writing blog posts about how terrible Python is.
My problem with Python is that it's branded as cross-platform when, in the end, you are required to learn Docker and run in a Linux environment to really stop suffering.
jrm4|2 years ago
This is the biggest problem in software and it's kind of intractable.
The ideal world has tools that empower everyone to do what they need to do, which to some extent must include an activity like programming.
But, and this may be unconscious, "people who program for a living" have a strong incentive to gatekeep.
the_af|2 years ago
Linting has saved my bacon more than once, granted.
jrm4|2 years ago
"This wrench is really bad for hammering nails!"
the_af|2 years ago
It's just that this article isn't very good at it.
lloydatkinson|2 years ago
https://blog.codinghorror.com/exception-driven-development/a...
https://stackoverflow.com/questions/2184935/performance-cost...
PurpleRamen|2 years ago
What do you expect it to do? Silently fail and move on?
the_af|2 years ago
Months or years later, it's a beast, hard to understand or refactor, full of holes and pitfalls, and Python's terrible tooling doesn't help either.
And I never learn the lesson!
PurpleRamen|2 years ago
guessbest|2 years ago
> And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living.
gjsman-1000|2 years ago
lvncelot|2 years ago
Come on man. There's being opionated, and then there's this.
skywhopper|2 years ago
retrocryptid|2 years ago
variadix|2 years ago
vore|2 years ago
pkoird|2 years ago
mark_l_watson|2 years ago
I agree with the author that there are better languages for large applications.
objektif|2 years ago
If you expect to find Java or C in Python you are looking at the wrong place.
retrocryptid|2 years ago
andrew_eu|2 years ago
> > Unfortunately these workers ran out of memory quickly so we decided to have workers terminate themselves after a configurable number of requests had been handled so that Linux would do our memory management for us.
I once worked with a service in Python that was essentially an image server for a special biomedical image format. The decoder was a proprietary C library written by a vendor and, unsurprisingly, had tons of memory leak issues. This exact feature of Gunicorn [0] saved us months of going back and forth with the vendor to get the library to stop killing our servers.
Python has its flaws, but so does anything touched by humans.
[0] https://docs.gunicorn.org/en/stable/settings.html#max-reques...
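For reference, the worker-recycling behavior described above is just a couple of settings in a Gunicorn config file (Gunicorn configs are plain Python; the numbers here are illustrative, not recommendations):

```python
# gunicorn.conf.py — sketch of worker recycling as a leak backstop
workers = 4

# Restart each worker after it has handled this many requests,
# letting the OS reclaim whatever the leaky C library lost.
max_requests = 500

# Add random jitter so the workers don't all restart at the same moment.
max_requests_jitter = 50
</imports>
```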
whywhywhywhy|2 years ago
It's extremely frustrating that you're forced into using it to access technology that doesn't even really use Python; it's just the glue composing the native C++ or GPGPU code blobs together.
Rochus|2 years ago
I like this comparison. Anyway, it's interesting to note that it took the author "many years of experience running large applications written in Python" to come to his conclusions. The advantages of static typing and the disadvantages of dynamic or duck typing have been well known for decades. The problem is less Python as a language than the fly-by-night decisions to just use it for anything. To stick with the example: what prevents people from using "Lego bricks" (or a high-temperature-proof version thereof) to build a reactor? Sound engineering decisions and, most importantly, safety regulations.
NovemberWhiskey|2 years ago
File this one under "things that UNIX systems programmers think will work in principle but end up being massive black-holes of attempting to quiesce any non-trivial application in a way that results in a sensible post-fork state".
retrocryptid|2 years ago
Python is a horrible language because it is not a language. It is a family of languages that changes every October. Sure, 3.x doesn't introduce as many backwards-incompatible changes per version as 2.x did, but that's like saying World War I wasn't horrible because towards the end fewer people seemed to be dying every week.
neoncontrails|2 years ago
nine_zeros|2 years ago
Until it introduced the haphazard type system. Now I need to import types in every file, use IF to guard it in CI in every file, and use a powerful IDE to be able to use the benefits of typing.
notatallshaw|2 years ago
You don't need to do anything; you can ignore all type hints.
> use IF to guard it in CI in every file
Are you talking about "if TYPE_CHECKING:"?
Your other option is to put "from __future__ import annotations" at the top of the file, or wait for Python 3.13 when PEP 649 lands and type annotations become lazily evaluated.
kzrdude|2 years ago
I still use python. The recently introduced match statement is a great addition, IMO.
Goofy_Coyote|2 years ago
> Python's use of reference counting defeated copy-on-write because even memory blocks holding variables that were read-only were actually written to in order to manipulate the reference counts, thereby blowing up the combined physical memory footprint of the workers. We solved this by smurfing the interpreter to use a magic reference count number for all variables that were created by the master process and inherited by the workers, and then not touching reference counts that had the magic value.
Thanks
wrs|2 years ago
This would be a nonstarter if it required N+1 times the memory of the single process, so the OS uses an optimization called copy-on-write. When a process forks, all its physical memory is shared by the new process so it takes almost no new memory to start. If the new process writes to a memory page, that physical page is copied so it has its own version. (Thus “copy on write”.)
For most programs this works fine, but if you have a runtime that does garbage collection using a technique that requires writing to an object even if the code doesn’t change any of its values, trouble ensues. With reference counting, you have to write a new reference count for an object anytime a pointer to the object is assigned. If you store the reference count in the object, that means its physical page has to be copied. So now the CoW optimization totally doesn’t work, because just referencing an object causes it to take up additional new memory.
Ruby used to have this same problem, and after Ruby webservers became popular (hello Rails) they eventually incorporated a patch to move the GC information somewhere outside the actual object heap. Other systems like the JVM use similar techniques to store the bookkeeping bits somewhere other than the object field bits.
So what the OP did is patch the runtime so the objects created in the master process (pre-forking) have special reference counts that are never altered. This mostly works, because the master process generally does a bunch of setup so its objects were mostly not going to be garbage anyway.
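CPython later grew an official knob for part of this problem: `gc.freeze()` (3.7+) parks every currently tracked object in a "permanent generation" that the cyclic collector never scans again, so the collector stops dirtying pre-fork pages. A sketch of the pre-fork pattern (the `shared` data is illustrative; note this does not stop reference-count writes themselves — those were only addressed later by immortal objects, PEP 683, in 3.12):

```python
import gc

# Master process builds its shared, read-mostly data before forking.
shared = {"config": list(range(1000))}

gc.disable()   # avoid a collection sneaking in between setup and fork
gc.freeze()    # exempt everything created so far from future collections

frozen = gc.get_freeze_count()  # how many objects landed in the permanent generation

# pid = os.fork()  # children would now inherit the frozen heap
```

This is essentially a supported version of what the OP hand-patched: the Instagram engineers who contributed `gc.freeze()` hit the same CoW blowup in their pre-forking workers.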
cpr|2 years ago
Those modifications force pages which were created on forking a child process as copy-on-write (meaning they share the same physical page until the page is modified by the child) to be copied and thus blow out any memory savings that would normally happen with copy-on-write.
timw4mail|2 years ago
My problem with python is its package system, and the mess around the fact it was designed to be global. (I have a similar gripe with Ruby).
Turing_Machine|2 years ago
Ummm... okay.
I'm not going to cheerlead for Python here (in fact I do not like it at all and also avoid it whenever possible) but many of this author's points seem to boil down down to "screwdrivers are bad, here's why you should always use hammers instead".
Different tools exist for different purposes.
dgan|2 years ago
Lack of static typing is nothing in comparison with a lack of common sense and unwillingness to learn ("why force oneself? The job market swallows everything anyway").
bitsandboots|2 years ago
Whenever some project I find doesn't just work, or works and then a few weeks later stops working, it always seems to be Python. People cannot semver properly (even Python itself? or is it the programmers' fault again that some programs work on 3.9 and not 3.10?) and also cannot lock their dependency versions properly. The same problem can happen to nodejs code, and yet I rarely see such basic failures there.
I also just don't understand why anyone would even want to use Python, anyway. I've tried to debug Python code, or write new Python code, but I could never get into it, nor was it easy to read code others had written. Whitespace sensitivity? Special __ files for package purposes? No JIT (no idea why pypy is a separate thing rather than rolled in)? I just don't see the advantage over JS, which seems to do everything better. It even has the benefit of being syntactically intuitive for users of other languages, and TypeScript really has saved JS from chaos. It's fine to be different from C syntactically, but I don't see the benefit of Python that would justify that syntax taking up another spot in my brain I could use for learning Rust or something.
dragonwriter|2 years ago
I have never found any language community to lack that, and if anything it's more common where people have lots of experience exclusively in one language than with inexperienced programmers picking up their first (who tend, by nature, to have a willingness to learn, even if they lack the common base of experience that gets labelled "common sense").
beaviskhan|2 years ago
_ea1k|2 years ago
Pytorch is a great library too. It is hard to imagine Python decreasing in usage any time soon.
dekhn|2 years ago
I'm glad I never reported to him while at Google.
tamimio|2 years ago
unknown|2 years ago
[deleted]
calyth2018|2 years ago
The post is basically a surgeon complaining that a chef's knife is terrible for surgery; or how the F-16 can't loiter over the battlefield and deliver strafing fire; or how carbon fibre tubes don't handle compressive loads under water; or using C to do string manipulation and complex regex.
You're using the wrong tool for the job.
acheron|2 years ago
delbronski|2 years ago
PurpleRamen|2 years ago
This is the only relevant statement.
intellectronica|2 years ago
The future is artificial intelligence programming Python and human programmers writing blog posts about how terrible Python is.
somecommit|2 years ago
retrocryptid|2 years ago
kelsolaar|2 years ago
quantumstar4k|2 years ago
[deleted]