
GeneralMayhem | 7 months ago

Yeah, I know that's how it works under the hood - and why you have things like all integers with values in [-5, 256] being assigned to the pre-allocated objects - but I don't think it's a particularly useful model for actually programming. "Pass-by-reference with copy-on-write" is semantically indistinguishable from "pass-by-value".

zahlman | 7 months ago

There is no copy on write and no pass by reference.

Python is "pass by value", according to the original, pedantic sense of the term, but the values themselves have reference semantics (something that was apparently not contemplated by the people coming up with such terminology — even though Lisp also works that way). Every kind of object in Python is passed the same way. But a better term is "pass by assignment": passing an argument to a parameter works the same way as assigning a value to a variable. And the semantic distinctions you describe as nonexistent are in fact easy to demonstrate.
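A minimal sketch of the distinction (function names are mine, chosen for illustration): rebinding a parameter name inside a function never affects the caller, but mutating the object that name refers to does.

```python
def rebind(x):
    # Assigns a new object to the local name x; the caller's name
    # still refers to the original object.
    x = [1, 2, 3]

def mutate(x):
    # Mutates the object itself; the caller sees this, because both
    # names refer to the same object.
    x.append(4)

a = [0]
rebind(a)
assert a == [0]      # rebinding inside the function changed nothing here
mutate(a)
assert a == [0, 4]   # mutation is visible through the shared reference
```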

The model is easy to explain, and common in modern programming languages. It is the same as non-primitive types in Java (Java arrays also have these reference semantics, even for primitive element types, but they also have other oddities that arguably put them in a third category), or class instances (as opposed to struct instances, which have value semantics) in C# (although C# also allows both of these things to be passed by reference).

The pre-allocated integer objects are a performance optimization, nothing to do with semantics.

The model is useful for programming, because it's correct. We know that Python does not pass by reference because you cannot affect a caller's local variable, and thus cannot write a "swap" function. We know that Python copies the references around, rather than cloning objects, because you still can modify the object named by a caller's local variable. We know that no copy-on-write occurs because we can trivially set up examples that share objects (including common gotchas like https://stackoverflow.com/questions/240178).
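Each of those claims is a one-liner to check. The `swap` attempt below only rebinds local names, and the mutable-default gotcha from the linked question shows objects being shared rather than copied on write:

```python
def swap(a, b):
    # Only rebinds the local names a and b; the caller's variables
    # are untouched, so this cannot work in Python.
    a, b = b, a

x, y = 1, 2
swap(x, y)
assert (x, y) == (1, 2)   # not pass-by-reference

# The gotcha from the Stack Overflow link: a mutable default argument
# is created once, at function definition time, and shared across calls.
def append_to(item, bucket=[]):
    bucket.append(item)
    return bucket

assert append_to(1) == [1]
assert append_to(2) == [1, 2]   # same list object: no copy-on-write
```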

pansa2 | 7 months ago

> I don't think it's a particularly useful model for actually programming

I think “everything is by reference” is a better model for programming than “you need to learn which objects are by reference and which are by value”. As you say, the latter is the case in Go, and it’s one of the few ways the language is more complex than Python.

You could argue that in Python you still have to learn which objects are mutable and which are immutable - but if it weren’t for bad design like `+=` that wouldn’t be necessary. An object would be mutable if-and-only-if it supported mutating methods.
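To illustrate the point: lists expose mutating methods and tuples don't, which is all you'd need to know - except that `+=` behaves differently for the two, mutating one and silently rebinding the other.

```python
# Lists have mutating methods; tuples do not.
nums = [1, 2]
nums.append(3)            # in-place mutation
assert nums == [1, 2, 3]

# `+=` mutates a list in place, so a second name sees the change...
a = b = [1]
a += [2]
assert b == [1, 2]        # same object, mutated

# ...but for a tuple it rebinds the name to a new object.
c = d = (1,)
c += (2,)
assert c == (1, 2)
assert d == (1,)          # d still refers to the original tuple
```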

zahlman | 7 months ago

> I think “everything is by reference” is a better model for programming than “you need to learn which objects are by reference and which are by value”.

The model is: everything is a reference (more accurately, has reference semantics); and is passed by assignment ("value", in older, cruder terms).

> but if it weren’t for bad design like `+=` that wouldn’t be necessary. A object would be mutable if-and-only-if it supported mutating methods.

`+=` is implemented by `__iadd__` where available (which is expected to be mutating) and by `__add__` as a fallback (https://stackoverflow.com/questions/2347265). The re-assignment of the result is necessary to make the fallback work. Reference for your "major wtf" is https://stackoverflow.com/questions/9172263.
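Both halves of that protocol, plus the "wtf" from the second link, can be demonstrated directly (the two classes here are mine, written to isolate each path):

```python
class WithIadd:
    """Defines __iadd__: `+=` mutates in place and returns self."""
    def __init__(self):
        self.items = []
    def __iadd__(self, other):
        self.items.append(other)
        return self            # this return value is re-assigned to the name

class AddOnly:
    """No __iadd__: `+=` falls back to __add__, building a new object."""
    def __init__(self, n):
        self.n = n
    def __add__(self, other):
        return AddOnly(self.n + other)

w = w2 = WithIadd()
w += 1
assert w is w2                 # __iadd__ mutated and returned the same object

a = a2 = AddOnly(1)
a += 1
assert a is not a2             # fallback rebound the name to a new object
assert a.n == 2 and a2.n == 1

# The "wtf": with a list inside a tuple, list.__iadd__ mutates the list
# first, then the re-assignment into the tuple raises TypeError.
t = ([],)
try:
    t[0] += [1]
except TypeError:
    pass
assert t[0] == [1]             # the mutation already happened
```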