I've been programming JS for a very long time and have learned to just stop trying to do a generic deep copy. Since JS is a dynamically typed language, it will always lead to issues down the road. Instead I write domain specific merge methods for whatever objects I'm merging.
function mergeOptions(...options) {
  let result = {};
  for (const opt of options) {
    result = {
      ...result,
      ...opt,
      arrayValue: [
        ...(result.arrayValue || []),
        ...(opt.arrayValue || [])
      ],
      deepObject: {
        ...result.deepObject,
        ...opt.deepObject
      }
    };
  }
  return result;
}
Know the shape of your objects and merging deeply becomes painless and won't have edge-cases.
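To make the idea concrete, here is a self-contained sketch of the pattern with a hypothetical option shape (an `arrayValue` array and a one-level `deepObject`, neither from any real API): the known array is concatenated, the known nested object is merged one level deep, and every other key is simply overwritten by later options.

```javascript
// Domain-specific merge: we know exactly which keys need deep handling.
function mergeOptions(...options) {
  let result = {};
  for (const opt of options) {
    result = {
      ...result,
      ...opt,
      // Concatenate instead of overwrite, because we know this key is an array.
      arrayValue: [...(result.arrayValue || []), ...(opt.arrayValue || [])],
      // Merge one level deep, because we know this key is a flat object.
      deepObject: { ...result.deepObject, ...opt.deepObject }
    };
  }
  return result;
}

const merged = mergeOptions(
  { arrayValue: [1], deepObject: { a: 1 }, mode: "fast" },
  { arrayValue: [2], deepObject: { b: 2 }, mode: "safe" }
);
console.log(merged);
// arrayValue: [1, 2], deepObject: { a: 1, b: 2 }, mode: "safe"
```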
Using structured cloning for deep copies is clever, but may or may not give you the behavior you want for SharedArrayBuffers. The copied value would be a new SAB with the same underlying data buffer, so that changes to one value will be visible to the other. That's good for most uses of structured cloning, but it's not what I would expect from a deep copy.
The point of structured clone was originally for passing data to web workers. Since the point of shared array buffers is to share data with workers, it makes sense that the structured clone algorithm keeps the SAB identity.
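That identity-keeping behavior can be sketched with the global structuredClone() available in modern browsers and Node 17+ (it applies the same structured clone algorithm): the clone is a distinct SharedArrayBuffer object, but it is backed by the same memory.

```javascript
// structuredClone gives a *new* SAB wrapper over the *same* data block.
const sab = new SharedArrayBuffer(4);
const copy = structuredClone(sab);

console.log(copy === sab);            // false: distinct wrapper object
new Uint8Array(sab)[0] = 42;          // write through the original...
console.log(new Uint8Array(copy)[0]); // 42: ...is visible through the clone
```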
Fundamentally, deep-copy in Javascript is a typed operation and no "generic" deep-copy algorithm is possible--you have to know what meaning the value is supposed to have to copy it.
There's nothing inherently wrong with structured clone, but it's only for JSON-able objects with extensions for circular references and some built in value-like classes. It's also special-cased for safe transmission between Javascript domains so it has a bunch of undesirable behavior for local copies. (no dom, no functions, no symbols, no properties...)
Even primitive types can't be safely copied unless you know what they're going to be used for later: a Node.js file descriptor is just an integer, but it's also a reference to an OS resource that can't be duplicated without a syscall.
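Those restrictions are easy to probe with the global structuredClone() (Node 17+ / modern browsers); this sketch checks two of them, with a made-up Point class as a stand-in for any user-defined type:

```javascript
// Functions are not structured-cloneable: the call throws a DataCloneError.
try {
  structuredClone({ f: () => {} });
} catch (e) {
  console.log(e.name); // DataCloneError
}

// Prototypes are dropped: the clone keeps own data but loses the class.
class Point { constructor(x) { this.x = x; } }
const clone = structuredClone(new Point(1));
console.log(clone.x);                // 1: own enumerable data survives
console.log(clone instanceof Point); // false: it's a plain object now
```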
> Fundamentally, deep-copy in Javascript is a typed operation and no "generic" deep-copy algorithm is possible--you have to know what meaning the value is supposed to have to copy it.
Why? I see no problem if the deep-copy behaves exactly the same as the original, from the perspective of any operation in the Javascript API (except for the === operator).
It is fundamentally very hard (impossible?) to deep copy everything in JavaScript. References cannot be escaped because they may be hidden in non-introspectable (crucially, non-cloneable) places. Viz:
function f() { var x = 0; return function() { return x++; }; }
var x = { foo: f() };
console.log(x.foo());
var y = { foo: f() };
console.log(y.foo());
var z = someDeepCopy(y);
console.log(z.foo());
console.log(x.foo());
console.log(y.foo());
console.log(z.foo());
If a copy were sufficiently deep then one could expect:
0
0
1
1
1
2
However if it were not deep one would get:
0
0
1
1
2
3
Even if one allows deep copying of closures, this still might not work: an object that contains two (potentially different) functions closing over the same binding (i.e. a particular instance of a particular variable) may be copied into two functions that each close over their own separate binding.
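A minimal sketch of that shared-binding situation (makeCounter is a hypothetical example): inc and get close over the same variable n, so a deep copy that cloned each closure independently would give the copy an inc and a get that no longer see the same counter.

```javascript
// Two closures sharing one binding of n.
function makeCounter() {
  let n = 0;
  return {
    inc: function () { n++; },
    get: function () { return n; }
  };
}

const c = makeCounter();
c.inc();
console.log(c.get()); // 1: inc and get see the same n
```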
I think the only good solution to this is to either give up trying to do deep copies or give up immutability and stop caring about deep copies.
The language really should have a true immutable type (without freezing, etc.) and deep copy method built in, with as many caveats and parameters as needed. Coroutines would be awesome as well. (Yes, I'm thinking, "How could JavaScript be more like Go or Erlang?")
And then it needs to stop adding new features for at least a couple years so the world can catch up.
This is presented in the linked document with the following context-specific gotchas:
> Unfortunately, this method only works when the source object contains serializable value types and does not have any circular references. An example of a non-serializable value type is the Date object - it is printed in a non ISO-standard format and cannot be parsed back to its original value :(.
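The quoted caveat is easy to reproduce with the JSON round-trip approach: the Date comes back as a plain string, not a Date object.

```javascript
// A "deep copy" via JSON round trip silently changes the type of a Date.
const original = { when: new Date(0) };
const copy = JSON.parse(JSON.stringify(original));

console.log(original.when instanceof Date); // true
console.log(typeof copy.when);              // "string", e.g. "1970-01-01T00:00:00.000Z"
```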
Why? Why would people want to clone objects? When I have encountered this in the past it is from people who are new to the language.
My advice to any person who really believes they need a clone of an object: do some self-reflection on why you think you need a cloned object. Any other approach is more efficient and simpler in the code.
It is an extra step, but when using Redux or something like that you have to serialize stuff anyway to store it, and it's especially useful for mobile React Native apps in keeping state when the phone restarts or the connection fails.
I would recommend immer (https://github.com/mweststrate/immer) instead of ImmutableJS. You can work with regular JS objects, plus it plays much nicer with TypeScript.
I suppose immutable.js supports partial reuse of objects, that is, if you only change the value of one attribute, the changed copy has this attribute set differently, but the rest is shallow-copied?
If so, indeed immutable objects would not run into the problem of copying, as long as you can afford them to be immutable. (That is, you're not working with any APIs that assume and use mutability.)
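That partial reuse (structural sharing) can be sketched in plain JS with spreads; the state shape here is made up for illustration. Only the path you change gets new objects; untouched subtrees are reused by reference.

```javascript
// Update one path immutably: copy along the changed path, share the rest.
const state = { user: { name: "ada" }, items: [1, 2, 3] };
const next = { ...state, user: { ...state.user, name: "grace" } };

console.log(next.items === state.items); // true: untouched subtree is shared
console.log(next.user === state.user);   // false: only the changed path is new
console.log(state.user.name);            // "ada": original is unchanged
```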
ImmutableJS is a pretty huge library. I'd conjecture most web apps don't have a legitimate need for something so comprehensive and would be better off with a simpler solution.
If you know the shapes of your objects ahead of time you can create one-off functions, which will probably be faster and require far less code.
I cannot recommend ramda enough. It provides the immutability and flexibility of ImmutableJS, but because it's built in a functional paradigm, the logic is fully composable, so complex, deeply nested changes are very simple and readable.
Basically, it's an AST translator that lets you use imperative syntax against immutable types. That is, `x.a = b` becomes the clunky `x = x.set('a', b)`, and it really gets convenient when you have complex structures.
Would it be worth it to look into a babel plugin for Javascript and ImmutableJS?
I like the shallow copy, and have never needed a deep copy. I have been using JS for a few years, mostly React.
To me it's exactly what native languages do with pointers. In some languages (like Delphi) it's implicit (like JS) and in some (like C) it's explicit in the syntax.
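A small sketch of that reference semantics (the objects here are hypothetical): assignment copies the reference, so both names alias the same object, and a shallow copy breaks only the top-level link.

```javascript
const a = { nested: { n: 1 } };
const b = a;        // same object: assignment copies the reference
const c = { ...a }; // new top-level object, but nested is still shared

b.nested.n = 2;
console.log(a.nested.n); // 2: b is an alias of a
console.log(c.nested.n); // 2: the shallow copy shares nested
console.log(c === a);    // false: the top level was copied
```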
Every single time I see a similar article on Javascript, I feel so lucky for being able to use Clojurescript instead.
Seriously - Javascript platform is great. The language itself? Not so nice. Clojurescript makes so many things simply better.
This will lead to bugs: as a human you'll inadvertently miss a merge or a clone and retain references that you don't want.
The inability of a language or runtime to correctly and quickly clone a structure is an upsetting fact of JavaScript.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
Anything beyond this and you are begging for trouble, because there are always context-specific gotchas.
Objects are hash maps that store data.
someObj.toJSON = function() { return { foo: this.foo, bar: this.bar } }
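How that toJSON hook plays out: JSON.stringify calls it, so you control exactly what gets serialized, and therefore what a JSON.parse(JSON.stringify(...)) "clone" will contain. (The secret field below is made up to show what gets dropped.)

```javascript
const someObj = { foo: 1, bar: 2, secret: 3 };
someObj.toJSON = function () { return { foo: this.foo, bar: this.bar }; };

console.log(JSON.stringify(someObj)); // {"foo":1,"bar":2}  (secret is gone)
```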
Or make your functions return new objects.
No, variables are simply references to objects. Objects aren't references; they're referents.
var a = new Date();
cloneDeep({ a }).a === { a }.a;
This returns false. Use JSON.stringify if you care about the content. cloneDeep might be useful if you don't care about data integrity.
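The same identity-vs-content distinction can be shown without lodash, using a hand-rolled Date copy as a stand-in for cloneDeep: any correct deep copy of a Date is a different object, so === is false even though the instant is identical.

```javascript
const a = new Date(0);
const b = new Date(a.getTime()); // a hand-rolled "deep copy" of a Date

console.log(a === b);                     // false: different objects
console.log(a.getTime() === b.getTime()); // true: same content
```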
So one can’t infer JSON-clonability from TypeScript/JavaScript types. Learned this the hard way.
> x = 4
4
> y = new (x.constructor)(x)
[Number: 4]
> x.constructor
[Function: Number]
> y.constructor
[Function: Number]
> typeof x
'number'
> typeof y
'object'
> x
4
> y
[Number: 4]
Is y a clone of x?
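What the session shows is primitive boxing: new Number(4) is an object wrapper, not a primitive. A minimal sketch of how the boxed value behaves:

```javascript
const x = 4;
const y = new (x.constructor)(x); // a Number object, not the primitive 4

console.log(typeof y);          // "object"
console.log(y == x);            // true:  loose equality coerces via valueOf()
console.log(y === x);           // false: object vs primitive
console.log(y.valueOf() === x); // true:  the wrapped content is the same
```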