Not having a VDOM is only a pro if your library maintains feature parity with libraries that have a VDOM. Otherwise, you just lack whatever it is that VDOMs enable. This library seems to forget why view libraries use a VDOM in the first place: without one, you have to write imperative update logic, which the VDOM specifically lets you avoid. The VDOM is a slight performance hit most people accept in order to write declarative code.
In this case, I find it a bit misleading to pass off not having a VDOM as a pro. It's a bit like building a social media site with no users and claiming everyone should join because it doesn't have spam.
In my opinion the benefits of declarative GUI are vastly overstated for the average web app. Given that your application is appropriately componentized, it rarely makes things much clearer.
We're actually using this. RE:DOM is a very thin layer on top of the DOM; basically all it does is hide some of the DOM's weirdness.
The best way to use it is simply to emulate what you would do in React (without JSX) in terms of design patterns: component classes with state and a render method. Half of the success is just using good patterns like that. All RE:DOM does is let you create trees of elements and manage them.
It has a few simple primitives for that. The main thing is an el method that takes at most three parameters. The first is the element name, classes, and id as a string: ".foo.bar" means a div with classes foo and bar, "span.foo" means a span with class foo, and so on. The second (optional) one is an object with attributes; so if you have an a tag, you might want to pass in an object with an href, title, target, etc. The last parameter is either a string for a text node, another element, or a list of elements. There are setAttr, setChildren, etc. methods you can call on elements. There are mount/unmount methods and there is some syntactic sugar for managing lists. That's about it.
You can shoot yourself in the foot (just like with other frameworks), but otherwise this works as advertised. There's not a lot of expensive magic happening under the hood.
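The selector shorthand described above is easy to picture in plain JavaScript. Here is a toy parser for that string format (my own sketch, not RE:DOM's actual implementation):

```javascript
// Toy parser for the el() selector shorthand described above:
// ".foo.bar" -> a div with classes foo and bar, "span.foo" -> a span
// with class foo. This is my own sketch, not RE:DOM's real code.
function parseQuery(query) {
  const parts = query.split(/(?=[.#])/); // split before each "." or "#"
  let tag = 'div';
  const classes = [];
  let id = null;
  for (const part of parts) {
    if (part.startsWith('.')) classes.push(part.slice(1));
    else if (part.startsWith('#')) id = part.slice(1);
    else if (part) tag = part;
  }
  return { tag, classes, id };
}

// parseQuery('span.foo') -> { tag: 'span', classes: ['foo'], id: null }
```

The real library would then feed the result into document.createElement and assign the classes and id; this sketch only covers the string-parsing half.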
I've been using plain JavaScript more and more lately. Luckily Element.prototype has this nice property where attempting to read some of its accessor properties will throw, making it cake to implement a chainable createElement.
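A chainable wrapper along those lines might look like this. The chain function and its method names are my own invention for illustration, not any library's API; it works with anything that implements setAttribute/appendChild, such as a real DOM element:

```javascript
// Hypothetical sketch of a chainable element builder; chain() and its
// method names are made up for illustration. Each method mutates the
// wrapped node and returns the wrapper so calls can be chained.
function chain(node) {
  const api = {
    attr(name, value) { node.setAttribute(name, value); return api; },
    text(value) { node.textContent = value; return api; },
    append(child) { node.appendChild(child); return api; },
    get node() { return node; }   // escape hatch back to the raw node
  };
  return api;
}

// In a browser this would be used like:
// const link = chain(document.createElement('a'))
//   .attr('href', '/docs')
//   .text('Docs')
//   .node;
```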
Huh, one day I forked a part of RE:DOM called nodom [1] to use it in my library for rendering D3 charts in a background worker. I kept extending nodom and finally got D3 working in a worker [2], pardon the pun. As I recall, nodom was coded pretty straightforwardly, without magic, and was easy to extend; a nice library if you need to play with a lightweight vdom.
This idea of creating DOM with some API is not new, and I've never really understood it. Just look at the login class: it's way more complex than plain HTML, hard to read, and messes around with this.
If you want DOM in your code, JSX is the way to go.
With a more complex project there are benefits to having purely updatable components, support for just native JS without quirks, and knowing exactly what's happening and when, without black magic.
But of course it always depends on the project which way is better.
Two days ago I started an HTML application. Yesterday I realized I'd want to add localization to it. Today I realized I may be better off having the whole document created dynamically, since then I can easily implement localization. Note, this is an "app", not a "doc".
If manipulating the DOM directly is so much faster than using a virtual DOM, then why do libraries like React or Ember use a virtual DOM in the first place? I honestly thought it was for speed.
Seems similar to Backbone.js which was popular a looong time ago. Well, in fact, it's probably 4 to 6 years but it's a really long time for the front-end world. :)
For those in the know on the internals of the topic library: is this like hyperHTML or lit-html in the way that it remembers the "holes" in the html so every update is just the update of live DOM at the right place?
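For intuition, the "holes" idea can be illustrated in a greatly simplified form (this is my own toy sketch, not lit-html's or hyperHTML's actual internals). A tagged template separates the static strings from the dynamic values once, so an update only has to revisit the value slots:

```javascript
// Greatly simplified illustration of the "holes" idea, not lit-html's or
// hyperHTML's real internals. The tagged template captures the static
// strings once; only the interpolated values change between renders.
function html(strings, ...values) {
  return { strings, values };
}

// Re-render by interleaving statics and current values; in a real library
// each hole would map to a live DOM node or attribute and be patched in
// place instead of producing a string.
function render(tpl) {
  return tpl.strings.reduce(
    (out, s, i) => out + s + (i < tpl.values.length ? tpl.values[i] : ''),
    ''
  );
}

// render(html`Hello, ${'world'}!`) -> "Hello, world!"
```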
It’s a cool approach, and this one looks more fleshed out. But I’d still think in 2018 it’s almost always a premature/inappropriate optimization to forgo a virtual DOM, unless you’re doing something pretty far out there.
It looks like the Element.update function is more or less the equivalent of setState in React. So the UI does seem to be declarative (you may want to emphasize this, at first it looked like a jQuery-like library). How does the declarative aspect of this work without a virtual dom?
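A rough sketch of the pattern, with a hypothetical component of my own (not the library's code): state comes in through update, and the component imperatively patches only what changed, so there is nothing to diff.

```javascript
// Hypothetical RE:DOM-style component sketch; the names are mine. update()
// receives new state and imperatively patches only the parts of the DOM
// that changed, rather than re-rendering and diffing a virtual tree.
class Counter {
  constructor(doc = document) {
    this.el = doc.createElement('button');
    this.count = null;
  }
  update(count) {
    if (count === this.count) return; // nothing changed, skip the DOM write
    this.count = count;
    this.el.textContent = `Clicked ${count} times`;
  }
}
```

Callers still write declarative-looking code (just call update with the new state); the imperative bookkeeping lives inside the component.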
Can anyone explain why you’d use this over just writing the HTML? This reminds me of the days when there were similar libraries in PHP and I could never find a use for them. It was easier and faster to just write the HTML.
Loving it, built something similar a long time ago.
Can you use it the other way around tho? Like, take an HTML string and return its RE:DOM representation.
> Because RE:DOM is so close to the metal and doesn't use virtual dom, it's actually faster and uses less memory than almost all virtual dom based libraries, including React (benchmark).
Very well said. The author of "one very popular library" is plainly lying when claiming that the "virtual DOM" was somehow faster than the real DOM.
That was never the case, even back in the IE6 era, aside from a very few well-known edge cases.
The switch to the virtual DOM led to one of the biggest slowdowns in website performance.
IIRC the performance gain comes when doing things like updating the state of a deep component tree, where React and others diff the vdom to see which elements need to be re-rendered and then only do actual DOM manipulation on those, rather than clearing and re-rendering the entire tree.
I’ll be honest, though, it feels to me like a solution to a hard problem you inflict on yourself by first committing to have the giant component tree rather than questioning if that was a great idea.
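The diffing idea described above can be shown with a toy example over plain objects (my own sketch, nothing like React's real reconciler; insertions and removals are omitted for brevity):

```javascript
// Toy virtual-DOM diff: compare two trees of plain { text, children }
// objects and collect the paths whose text changed, so only those
// positions would need a real DOM write. My own sketch, not React's
// reconciler; node insertion/removal is omitted for brevity.
function diff(oldNode, newNode, path = [], patches = []) {
  if (oldNode.text !== newNode.text) {
    patches.push({ path, text: newNode.text });
  }
  const oldKids = oldNode.children || [];
  const newKids = newNode.children || [];
  for (let i = 0; i < Math.min(oldKids.length, newKids.length); i++) {
    diff(oldKids[i], newKids[i], path.concat(i), patches);
  }
  return patches;
}
```

Applying the resulting patch list is the only step that touches the real DOM, which is the whole point of the approach.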
There was no notion of a virtual dom in the ie6 era that I can remember. Unless Prototype (the precursor to jquery) used one and I just wasn’t aware of it. I believe the libraries of that time either directly extended the dom or wrapped individual elements, but did not maintain anything resembling virtual doms as we know them now.
Personally I was quite surprised by the good performance of React 16 + Preact in the benchmarks. It's not much slower and does all the "update" logic for you.
Keep in mind, too, React handles MANY edge cases for scenarios like keeping scroll position on update, or animations.
I have only one upvote to give you, but thank you for your comment. I am not very familiar with the JS world; it's riddled with a vocal minority of not-so-knowledgeable people, and I had no clue that the vdom fad wasn't actually reasonable.
I can't comment on the quality of the library, but I did get a quick chuckle from the term "close to the metal" being used to describe a Javascript browser library.
I don't want to convert HTML into any other format... it's like CoffeeScript. Somebody is gonna pick up your code and see all these el('h1','OMFG HN SUX'); calls instead of normal, sane HTML.
This..._isn't_ converting HTML into any other format. It's providing an ergonomic way to generate elements in javascript. The alternative is directly using `myEl = document.createElement(...)`, `myEl.appendChild(...)`, and `myEl.src = 'whatever'`. You're comparing apples and dumptrucks here.
anonytrary | 7 years ago
giornogiovanna | 7 years ago
jillesvangurp | 7 years ago
jtms | 7 years ago
mr_toad | 7 years ago
And that’s how new Javascript frameworks are born.
minieggs | 7 years ago
Fun stuff, https://github.com/mini-eggs/ogle-tr-122b
kuroguro | 7 years ago
oh_sigh | 7 years ago
Eli_P | 7 years ago
[1] https://github.com/ptytb/nodom [2] https://github.com/ptytb/d3-worker
kowdermeister | 7 years ago
pkstn | 7 years ago
zmix | 7 years ago
Too bad I am forced to use ES5 (IE9).
unknown | 7 years ago
[deleted]
xtagon | 7 years ago
buremba | 7 years ago
pkstn | 7 years ago
kreetx | 7 years ago
pkstn | 7 years ago
guscost | 7 years ago
JustSomeNobody | 7 years ago
lopatin | 7 years ago
lopatin | 7 years ago
Zelphyr | 7 years ago
cfv | 7 years ago
baybal2 | 7 years ago
burlesona | 7 years ago
jtms | 7 years ago
jchook | 7 years ago
jackblack8989 | 7 years ago
royjacobs | 7 years ago
codeflo | 7 years ago
fold_left | 7 years ago
This and "blazing fast" make me cringe every time.
pkstn | 7 years ago
As well as "turboboosted". Don't take them too seriously :D
entelechy | 7 years ago
There is wasmjit [1], a kernel module that enables execution of WebAssembly via the Linux kernel.
Furthermore, there is AssemblyScript [2], which allows you to compile typed JavaScript to WebAssembly.
[1] https://github.com/rianhunter/wasmjit
[2] https://github.com/AssemblyScript/assemblyscript
megaremote | 7 years ago
bigiain | 7 years ago
https://esp32.com/viewtopic.php?t=497
aaaaaaaaaab | 7 years ago
Lol. Sure! Just like how I get closer to the Earth’s core if I’m sitting on the floor instead of my chair.
pkstn | 7 years ago
ziont | 7 years ago
Don't reinvent the wheel, man.
cpfohl | 7 years ago