top | item 18442339

Copying objects in JavaScript

124 points | wheresvic1 | 7 years ago | smalldata.tech

92 comments

[+] mrgalaxy|7 years ago|reply
I've been programming JS for a very long time and have learned to just stop trying to do a generic deep copy. Since JS is a dynamically typed language, it will always lead to issues down the road. Instead I write domain specific merge methods for whatever objects I'm merging.

    function mergeOptions(...options) {
      let result = {};

      for (const opt of options) {
        result = {
          ...result,
          ...opt,
          arrayValue: [
            ...(result.arrayValue || []),
            ...(opt.arrayValue || [])
          ],
          deepObject: {
            ...result.deepObject,
            ...opt.deepObject
          }
        };
      }

      return result;
    }
Know the shape of your objects and deep merging becomes painless and free of edge cases.
[+] ben509|7 years ago|reply
Another way of looking at it: if you are frequently doing complex copies, you probably want immutable types.
[+] cageface|7 years ago|reply
Immerjs is a very handy library for doing this kind of thing. It is a natural fit for react but can be used for any kind of copying like this.
[+] lxe|7 years ago|reply
I think it's not the best idea to keep the shapes of objects in your head and manually clone/merge them.

This will lead to bugs: as a human you'll inadvertently miss a merge or a clone and retain references that you don't want.

The inability of the language or runtime to correctly and quickly clone a structure is an upsetting fact of JavaScript.

[+] Null-Set|7 years ago|reply
As of version 8.0.0 node has exposed a serialization api which is compatible with structured clone. https://nodejs.org/api/v8.html#v8_serialization_api

    const v8 = require('v8');
    const buf = v8.serialize({a: 'foo', b: new Date()});
    const cloned = v8.deserialize(buf);
    cloned.b.getMonth(); // b is still a real Date after the round trip
[+] devoply|7 years ago|reply
Have we learned nothing from Java's serialization fiasco?
[+] wheresvic1|7 years ago|reply
That's awesome, I'll update the article to reflect this!
[+] malcolmwhite|7 years ago|reply
Using structured cloning for deep copies is clever, but may or may not give you the behavior you want for SharedArrayBuffers. The copied value would be a new SAB with the same underlying data buffer, so that changes to one value will be visible to the other. That's good for most uses of structured cloning, but it's not what I would expect from a deep copy.

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

[+] Null-Set|7 years ago|reply
The point of structured clone was originally for passing data to web workers. Since the point of shared array buffers is to share data with workers, it makes sense that the structured clone algorithm keeps the SAB identity.
[+] bcoates|7 years ago|reply
Programs that use SharedArrayBuffers are already defective, so it's no big deal.
[+] bcoates|7 years ago|reply
Fundamentally, deep-copy in JavaScript is a typed operation and no "generic" deep-copy algorithm is possible--you have to know what meaning the value is supposed to have to copy it.

There's nothing inherently wrong with structured clone, but it's only for JSON-able objects with extensions for circular references and some built in value-like classes. It's also special-cased for safe transmission between Javascript domains so it has a bunch of undesirable behavior for local copies. (no dom, no functions, no symbols, no properties...)

Even primitive types can't be safely copied unless you know what they're going to be used for later, a nodejs file descriptor is just an integer but it's also a reference to an OS resource that can't be duplicated without a syscall.
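A quick sketch of the structured-clone trade-offs described above (this assumes Node 17+ or a modern browser, where `structuredClone` is a global):

```javascript
// Dates and circular references survive a structured clone...
const original = { when: new Date(0), self: null };
original.self = original; // circular reference — allowed by structured clone

const copy = structuredClone(original);
console.log(copy.when instanceof Date); // true — built-in value classes survive
console.log(copy.self === copy);        // true — the cycle is preserved in the copy

// ...but functions are not cloneable and throw a DataCloneError:
let threw = false;
try {
  structuredClone({ fn: () => {} });
} catch (e) {
  threw = true;
}
console.log(threw); // true
```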

[+] amelius|7 years ago|reply
> Fundamentally, deep-copy in Javascript is a typed operation and no "generic" deep-copy algorithm is possible--you have to know what meaning the value is supposed to have to copy it.

Why? I see no problem if the deep-copy behaves exactly the same as the original, from the perspective of any operation in the Javascript API (except for the === operator).

[+] dan-robertson|7 years ago|reply
It is fundamentally very hard (impossible?) to deep copy everything in JavaScript. References cannot be escaped because they may be hidden in non-introspectable (crucially, non-cloneable) places. Viz:

  function f(){var x = 0; return function(){return x++;};}
  var x = {foo:f()};
  console.log(x.foo());
  var y = {foo:f()};
  console.log(y.foo());
  var z = someDeepCopy(y);
  console.log(z.foo());
  console.log(x.foo());
  console.log(y.foo());
  console.log(z.foo());
If a copy were sufficiently deep then one could expect:

  0
  0
  1
  1
  1
  2
However if it were not deep one would get:

  0
  0
  1
  1
  2
  3
Even if one allows deep copying of closures, this still might not work: an object containing two (potentially different) functions closing over the same binding (i.e. a particular instance of a particular variable) may be copied into two functions each closing over their own separate binding.

I think the only good solution to this is to either give up trying to do deep copies or give up immutability and stop caring about deep copies.

[+] russellbeattie|7 years ago|reply
The language really should have a true immutable type (without freezing, etc.) and deep copy method built in, with as many caveats and parameters as needed. Coroutines would be awesome as well. (Yes, I'm thinking, "How could JavaScript be more like Go or Erlang?")

And then it needs to stop adding new features for at least a couple years so the world can catch up.

[+] yuchi|7 years ago|reply
I infer you don't mean to have both immutable data structures and deep copy as features to use together, since immutables don't need to be cloned.
[+] TheAceOfHearts|7 years ago|reply
You can implement coroutines using generators. Is there any feature you'd miss from other implementations if you used generators in that way?
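One way to do coroutines with generators, sketched as a tiny runner (the `run` helper and its generator argument are illustrative, not from any particular library): the generator yields promises, and the runner resumes it with each resolved value.

```javascript
// Minimal coroutine runner: drives a generator that yields promises.
function run(genFn) {
  const gen = genFn();
  return new Promise((resolve) => {
    function step(input) {
      // Resume the generator with the last resolved value.
      const { value, done } = gen.next(input);
      if (done) return resolve(value);
      Promise.resolve(value).then(step);
    }
    step();
  });
}

run(function* () {
  const a = yield Promise.resolve(1);
  const b = yield Promise.resolve(2);
  return a + b;
}).then((sum) => console.log(sum)); // 3
```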
[+] SonicSoul|7 years ago|reply
I guess most languages don't add this because of the circular-reference problem?
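Circular references are handleable, though: a recursive clone can track already-copied nodes in a WeakMap. A minimal sketch, covering plain objects and arrays only (not Dates, Maps, class instances, etc.):

```javascript
// Cycle-aware deep clone for plain objects and arrays.
function deepClone(value, seen = new WeakMap()) {
  if (value === null || typeof value !== 'object') return value;
  if (seen.has(value)) return seen.get(value); // reuse the already-cloned node
  const copy = Array.isArray(value) ? [] : {};
  seen.set(value, copy); // register before recursing so cycles resolve
  for (const key of Object.keys(value)) {
    copy[key] = deepClone(value[key], seen);
  }
  return copy;
}

const a = { list: [1, 2] };
a.self = a; // circular reference
const b = deepClone(a);
console.log(b.self === b);      // true — the cycle is preserved
console.log(b.list !== a.list); // true — the nested array was copied
```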
[+] nobody271|7 years ago|reply
var copy = JSON.parse(JSON.stringify(myObj));

Anything beyond this and you are begging for trouble because there's always context-specific gotchas.

[+] pnevares|7 years ago|reply
This is presented in the linked document with the following context-specific gotchas:

> Unfortunately, this method only works when the source object contains serializable value types and does not have any circular references. An example of a non-serializable value type is the Date object - it is printed in a non ISO-standard format and cannot be parsed back to its original value :(.
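Those gotchas are easy to demonstrate with a quick round trip (the `roundTrip` helper is just for illustration):

```javascript
// What a JSON round trip actually does to non-serializable values.
const roundTrip = (value) => JSON.parse(JSON.stringify(value));

const result = roundTrip({ n: NaN, i: Infinity, d: new Date(0), f: () => {} });
console.log(result.n);        // null — NaN is not representable in JSON
console.log(result.i);        // null — neither is Infinity
console.log(typeof result.d); // 'string' — the Date came back as a plain string, not a Date
console.log('f' in result);   // false — functions are silently dropped
```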

[+] austincheney|7 years ago|reply
Why? Why would people want to clone objects? When I have encountered this in the past it is from people who are new to the language.

My advice to anyone who really believes they need a clone of an object: reflect on why you think you need one. Almost any other approach is more efficient and simpler in code.

Objects are hash maps that store data.

[+] sebringj|7 years ago|reply
Yah that's why I use the serializable override:

    someObj.toJSON = function () {
      return { foo: this.foo, bar: this.bar };
    };

It is an extra step, but when using redux or something like that you have to serialize stuff anyway to store it. It's especially useful for mobile react-native stuff, to keep state when the phone restarts or the connection fails.

[+] Scarbutt|7 years ago|reply
Another option if you can afford it is to just use immutablejs.

Or make your functions return new objects.

[+] nine_k|7 years ago|reply
I suppose immutable.js supports partial reuse of objects, that is, if you only change the value of one attribute, the changed copy has this attribute set differently, but the rest is shallow-copied?

If so, indeed immutable objects would not run into the problem of copying, as long as you can afford them to be immutable. (That is, you're not working with any APIs that assume and use mutability.)
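That partial reuse ("structural sharing") can also be done by hand with spreads, for a sense of what such libraries automate (names here are illustrative):

```javascript
// Only the changed path gets new objects; untouched branches are reused.
const state = { user: { name: 'Ada' }, items: [1, 2, 3] };
const next = { ...state, user: { ...state.user, name: 'Grace' } };

console.log(next.items === state.items); // true — untouched branch is shared
console.log(next.user !== state.user);   // true — changed branch is a new object
console.log(state.user.name);            // 'Ada' — the original is untouched
```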

[+] TheAceOfHearts|7 years ago|reply
ImmutableJS is a pretty huge library. I'd conjecture most web apps don't have a legitimate need for something so comprehensive and would be better off with a simpler solution.

If you know the shapes of your objects ahead of time you can create one-off functions, which will probably be faster and require far less code.

[+] beaconstudios|7 years ago|reply
I cannot recommend ramda enough. It provides the immutability and flexibility of immutablejs, but because it's built in a functional paradigm, the logic is fully composable, so complex, deeply nested changes are very simple and readable.
[+] ben509|7 years ago|reply
Granted, this is Python, but I wrote this a while back: https://github.com/scooby/pyrsistent-mutable

Basically, it's an AST translator that lets you use imperative syntax against immutable types. That is, `x.a = b` becomes the clunky `x = x.set('a', b)`, and it really gets convenient when you have complex structures.

Would it be worth it to look into a babel plugin for Javascript and ImmutableJS?

[+] chrisseaton|7 years ago|reply
> objects in Javascript are simply references to a location in memory

No, variables are simply references to objects. Objects aren't references - they're referents.

[+] barrystaes|7 years ago|reply
I like the shallow copy, and have never needed a deep copy. I have been using JS for a few years tops, mostly React.

To me it's exactly what native languages do with pointers. In some languages (like Delphi) it's implicit (like JS) and in some (like C) it's explicit in the syntax.

[+] KaoruAoiShiho|7 years ago|reply
Been using this for a while: https://stackoverflow.com/a/44612374/663447 Best clone imo (for the correct usecases).
[+] simlevesque|7 years ago|reply
Don't use it with dates, it breaks them. You'll lose the data.

    var a = new Date();
    cloneDeep({ a }).a === { a }.a;

This returns false. Use JSON.stringify if you care about the content. cloneDeep might be useful if you don't care about data integrity.

[+] freeopinion|7 years ago|reply
I followed your link, which has links to two perf sites, which show that Object.assign is nearly 20x more performant than your preferred solution.
[+] kalmi10|7 years ago|reply
Fun fact: Not even the whole of number type is safe to clone with the JSON method, because Infinity or NaN turn into null.

So one can’t infer JSON-clonability from TypeScript/JavaScript types. Learned this the hard way.

[+] z3t4|7 years ago|reply
Premature optimization is the root of all evil. That said, creating new objects does slow your code down, so only do it when you have a good reason to.
[+] iLemming|7 years ago|reply
Every single time I see a similar article on Javascript, I feel so lucky for being able to use Clojurescript instead. Seriously - Javascript platform is great. The language itself? Not so nice. Clojurescript makes so many things simply better.
[+] stevebmark|7 years ago|reply
Never do any of this. It's not 1990 anymore, we don't copy code from random blog posts to solve problems.
[+] freeopinion|7 years ago|reply
  > x = 4
  4
  > y = new (x.constructor)(x)
  [Number: 4]
  > x.constructor
  [Function: Number]
  > y.constructor
  [Function: Number]
  > typeof x
  'number'
  > typeof y
  'object'
  > x
  4
  > y
  [Number: 4]

Is y a clone of x?

[+] snek|7 years ago|reply
No, it's boxed. Take `y.valueOf()` and you've got a successful clone.
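The boxing and unboxing above, spelled out (variable names are illustrative):

```javascript
const x = 4;
const boxed = new (x.constructor)(x); // a Number *object*, not a primitive
console.log(typeof boxed);            // 'object'

const unboxed = boxed.valueOf();      // back to a primitive
console.log(typeof unboxed);          // 'number'
console.log(unboxed === x);           // true — now it compares equal again
```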
[+] jackconnor|7 years ago|reply
One of the weird, interesting parts of JS. Great article.