
Show HN: I made React with a faster Virtual DOM

source link: https://news.ycombinator.com/item?id=31576634

92 points by aidenyb 14 hours ago | 67 comments
Hi! I made a React compatibility library for a Virtual DOM library (https://github.com/aidenybai/million).

The idea is to have much faster rendering (a compiler optimizes virtual DOM beforehand) while ensuring the same developer experience React provides.

This is very, VERY early stage, so be prepared for weird bugs / plugin incompatibility / etc. If you have any suggestions, I'd be more than happy if you replied in a comment with it!

You can spin up the demo here >> https://stackblitz.com/github/aidenybai/million-react-compat

The fastest virtual DOM is no virtual DOM at all.

Rather than creating and diffing a fine-grained tree of elements every render, it's very easy to use the syntactic structure of a template to see exactly what parts can and cannot change. To get stable and minimal DOM updates you just compare the template identity to the previously rendered template - if they match you update the dynamic expressions, if they don't you clear and render the new template.
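The template-identity comparison described above can be sketched in a few lines (this is not lit-html's actual code; all names here are invented). It works because, per the JS spec, a tagged template literal passes the same cached `strings` array on every evaluation of the same call site, so the identity check is a cheap `===`:

```javascript
// Minimal sketch of template-identity rendering, in the spirit of the
// approach described above (invented names, not lit's implementation).
function html(strings, ...values) {
  return { strings, values };
}

function makeRenderer() {
  let prev = null;
  return function render(tpl) {
    if (prev && prev.strings === tpl.strings) {
      prev.values = tpl.values; // same template: update dynamic parts only
      return 'updated';
    }
    prev = tpl; // different template: clear and render from scratch
    return 'rendered';
  };
}

const render = makeRenderer();
const view = (name) => html`<p>Hello, ${name}!</p>`;

render(view('Ada'));   // 'rendered' - first render of this template
render(view('Grace')); // 'updated'  - identity matched, fast path
```

A real renderer would of course write the static parts into the DOM once and bind the dynamic expressions to specific positions, but the stable-identity check is the key trick.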

This is what we did with lit-html and it's quite a bit faster (and smaller) than React and doesn't require a compiler because it uses standard JS tagged template literals. https://lit.dev/docs/libraries/standalone-templates/

It's a very simple approach and very, very hard to beat in the fast/small/simple/buildless tradeoff space. I hope one day that the DOM can include standard updatable DOM with a technique like this on top of the template instantiation proposal from Apple. It's such a common need that it should be built in.

As a representative (?) of a popular library, it's not a great look to come in and put down someone's Show HN project. Especially when you're also dismissing the entire category of libraries as inferior, while the reality is that both approaches have different costs and benefits.
Is he putting it down?

He's arguing about speed and saying why the approach of lit-html will always be faster than virtual dom.

I think it's fair, since OP is trying to build a faster vdom, to also expect discussions about different approaches, and I'm glad the previous user gave their two cents.

While I agree in general (there are better/faster approaches than virtual dom), it’s not very relevant to a Show HN about React-compatible performance improvements. Lit is not compatible with React, as far as I know. If there’s a claim that Lit’s approach can be compatible with React’s API, that would be a different matter.

To me this makes the comment come off as hijacking the thread to promote their own library, which really is quite rude. Not claiming this was spankalee’s intent, but you see this quite often on HN, people who keep mentioning their own product or library in barely tangentially related discussions.

Relevant post: https://status451.com/2016/01/06/splain-it-to-me/

There are two common social perspectives: information sharing vs emotional harmony. From one perspective, the other seems rude or insane. If your message is corrected or added to, this is a chance to be less wrong. But it is also a chance to be embarrassed and seen as less knowledgeable than you seemed.

Status seekers tend to assume the latter perspective, and therefore label such comments as rude, dismissive, conflict-seeking, etc. even if the OP had no such intention.

This is also 95% of what "help i'm being harassed online" comes down to.

Oh this explains so much. I was raised with the view that one should always share information in order to make the world better; I didn't even think there were other reasons to, like, talk in public.
Since you're not aware, a very common reason to talk in public is to gain something from one's talking. The situation at hand looks a lot like that - in particular, gaining something at the expense of someone else. If every Show HN ended up directing everyone to an alternative project, no one would bother making Show HNs. Hence it's questionable whether the top comment makes the world a better place, etc. That's also why someone not apparently related to the library at hand could make the same comment without it being negatively received: they don't appear to have anything to gain from it. I would instead encourage the poster to make their own Show HN the next day rather than hijacking an existing one.
I don't know, it seems pretty common for people to be straight up on HN. I didn't get a sense of superiority or insult from the comment - just statements that the author believes to be true.
> seems pretty common for people to be straight up on HN.

Being straight about it, I would describe most of these people as "nitpicking curmudgeons", and it's something that frustrates me hugely about HN. Whenever anything new is shown here - no matter how careful the poster is to make clear that they already know it has shortcomings and that it's just a prototype or whatever - you can be virtually guaranteed the top comments will be putting it down and nitpicking issues with it, rather than discussing any possible benefits or improvements, or, heaven forfend, offering constructive criticism to the creator.

I agree, it is a negative space overall, and the most negative comments get upvoted
I know that HN believes it’s immune to this, but being right isn’t an excuse to forgo social expectations
Different social environments have different social expectations. HN has a social expectation that focusing heavily on technical considerations of how something can be done better will lead to pleasant shop-talking like OP's response here: https://news.ycombinator.com/item?id=31576634#31578949
> I know that HN believes it’s immune to this, but being right isn’t an excuse to forgo social expectations

Could you please offer some insight on why trying to pass off technically wrong claims in a technical forum should be immune to any informative comment clarifying or clearing up misconceptions?

> why trying to pass off technically wrong claims in a technical forum should be immune to any informative comment clarifying or clearing up misconceptions

Not really, because I didn't make that claim, so you'll have to ask someone who makes that claim.

> Not really, because I didn't make that claim, so you'll have to ask someone who makes that claim.

Well, you actually did. You claimed, and I quote, "being right isn’t an excuse to forgo social expectations"

I'm now asking you to explain the role your "social expectations" have on "being right", specifically in the case where someone in a technical forum makes technically wrong claims.

Are you able to shed some light onto this sort of belief?

> Well, you actually did. You claimed, and I quote, "being right isn’t an excuse to forgo social expectations"

That is a very different claim than “trying to pass off technically wrong claims in a technical forum should be immune to any informative comment clarifying or clearing up misconceptions”.

It is possible to be right without posting an "informative comment clarifying or clearing up" a "technically wrong claim", and it is possible to post an "informative comment clarifying or clearing up" a "technically wrong claim" in a manner which does not disregard social expectations.

Being right neither requires nor excuses being a jerk.

The irony of being weirdly combative about the anodyne observation that "being right isn’t an excuse to forgo social expectations".
You can be right without the other person being wrong if you aren't addressing something they said.
> (...) to come in and put-down someone's Show HN project.

There was no put-down. There was a very informative and insightful post explaining that a) unlike the original claim, the project does not use a virtual DOM, b) the technique used is indeed very performant and hard to beat, c) other projects also use it.

You need to go way out of your way to pretend to feel any sort of outrage over this.

The put-down to me was where it said: "it's very easy to...", but failed to explain it beyond a few words. I'm relatively clever, but don't know exactly what they are talking about, and the lack of explanation with a statement of ease gives an implication that we should just know their solution.

That changes the message from an informative comment about alternative approaches into something that could be read as a dismissive rejection.

Now, a link to an article explaining the alternative approach... or even just one or two more explanatory sentences... would not come off that way. And maybe it is easy, but it would be better to just put up the facts, not judgments. Post an explanation and let the reader decide whether or not they think it is easy.

> The put-down to me was where it said: "it's very easy to...", but failed to explain it beyond a few words.

What do you mean by "failed to explain it beyond a few words"?

OP stated in no uncertain terms that this approach was followed in lit-html, provided a link to a page from lit's site where this approach is thoroughly explained, and if you really want to look closely at real-world implementations you already have the link to lit-html.

How much more do you want to demand from someone in order to point out in a web forum that someone made a mistake?

Also, what stops anyone from posting a question asking for any clarification?

Or are we supposed to jump right onto the "I'm being persecuted" mode?

I find it comical that you make this comment when in your previous 3 comments you've done exactly what you're accusing OP of.
> It's a very simple approach and very, very hard to beat in the fast/small/simple/buildless tradeoff space.

Author of the ivi library here. I completely agree with the idea that such an approach could lead to better performance, but there is a huge difference between an idea and an actual implementation. Also, I just don't get why a lot of developers who work in this problem space still think "virtual DOM" APIs and tagged template APIs are mutually exclusive; I actually have an experimental implementation that supports both APIs, and it is not so easy to beat an efficient full-diff vdom algo. Tagged template APIs are useful when we are working with mostly static HTML chunks, but when it comes to building a set of reusable components (not expensive web components), pretty much everything inside these components becomes dynamic and we are back to diffing everything.

Just to add on here, the Preact author made a generalized utility for tagged template => vnode conversions: https://github.com/developit/htm
I love the work done by the Lit team (I assume you're a contributor?). It's really fantastically designed, as you mentioned with bundle size/rendering speed/etc. I'm sure that Lit's implementation is very efficient and ranks high in benchmarks.

This isn't to say virtual DOM isn't fast. Experimental libraries that use virtual DOMs, like blockdom and ivi (see https://krausest.github.io/js-framework-benchmark/2022/table...), are very, very fast.

At the end of the day, the way libraries render UI is a set of tradeoffs. No one method is objectively better. While lit works great for a lot of web developers, so do virtual DOM based libraries.

Totally agree on native DOM diffing, I'll check out Apple's proposal :)

Virtual DOM is an unnecessary overhead, is what the parent is saying.

There is probably a good reason popular frameworks insist on using it though.

I would guess:

- historically, manipulating the DOM directly was slow (in WebKit?) so working on a virtual one made sense

- The idea of writing a compiler like Svelte's, which does the heavy lifting at compile time, was not there, or was dismissed for some reason (the React developers might have decided that a reactive model like Svelte's - with the need to tweak JS's semantics a bit, so that assigning to variables triggers updates - was not great, or they didn't want this JS/HTML separation)

And then you are stuck with your model for compatibility reasons. React cannot get rid of its virtual DOM without breaking everyone.

The VDOM library will still need to manipulate the real DOM sooner or later, so the proposed performance boost of this abstraction is not relevant for browsers.

Since the VDOM runs fast in Node, you can execute your tests quickly in Jest. But since no real browsers run on Node, the value of this is perhaps questionable.

The browser can get a performance boost via serverside rendering and again it comes in handy that the VDOM runs fast in Node. But perhaps this solves a problem that the VDOM has caused, because React loads slowly and renders the slowest [1].

You can run the VDOM in Android and on iOS via React Native and this is all well, but the VDOM is holding the web back because we have come to expect all this from technologies such as Web Components that might load fast and render fast by virtue of not relying on it.

The virtual DOM is modelling the DOM, but the DOM comes with a model out of the box, it's called the Document Object Model and it is always in sync with the view without any constant performance tweaks. Asynchronous rendering can fix it until batched rendering solves it for good. But these are opposite rendering strategies! We are simply going in circles now.

A lot of the myths around the virtual DOM can be explained by unwillingness to learn the native API and we are mostly dealing with the fallout now. It's the same with CSS. Sorry for the rant and the abuse of your comment.

[1] https://twitter.com/championswimmer/status/14865018568345845...

> The VDOM library will still need to manipulate the real DOM sooner or later, so the proposed performance boost of this abstraction is not relevant for browsers

VDOM is to DOM as Emacs buffer is to terminal display.

Updating the terminal display was historically slow, so there's an algorithm inside Emacs to take the buffer state, diff it against the previous state, and compile a list of terminal commands (escape sequences) that represent a minimal transition between the two states. Famously, it was marked with an ASCII art skull and crossbones in the comments of the source code.

The VDOM is there for the same reason: provide a fast way to minimize the cost of slow DOM transitions.
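The same minimal-transition idea can be sketched as a tree diff (all shapes and op names here are invented for illustration): walk two plain-object trees and emit only the patch commands where they disagree, analogous to the escape sequences above.

```javascript
// Hedged sketch: diff two "virtual" trees (plain objects) and emit a
// minimal list of patch commands. Not any real library's algorithm.
function diff(oldNode, newNode, path = [], patches = []) {
  if (oldNode === undefined) {
    patches.push({ op: 'create', path, node: newNode });
  } else if (newNode === undefined) {
    patches.push({ op: 'remove', path });
  } else if (oldNode.tag !== newNode.tag) {
    patches.push({ op: 'replace', path, node: newNode });
  } else {
    if (oldNode.text !== newNode.text) {
      patches.push({ op: 'setText', path, text: newNode.text });
    }
    const len = Math.max((oldNode.children || []).length,
                         (newNode.children || []).length);
    for (let i = 0; i < len; i++) {
      diff((oldNode.children || [])[i], (newNode.children || [])[i],
           path.concat(i), patches);
    }
  }
  return patches;
}

const before = { tag: 'ul', children: [{ tag: 'li', text: 'one' }] };
const after  = { tag: 'ul', children: [{ tag: 'li', text: 'one' },
                                       { tag: 'li', text: 'two' }] };
diff(before, after); // a single 'create' patch for the new <li>
```

A production diff would also need keyed list reconciliation and attribute/event handling, but the shape of the output - a minimal command list applied to the slow target - is the same.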

It sounds like they would otherwise repaint the entire terminal on every single change, but that is not what the alternative to VDOM is. They could have skipped straight to the list of terminal commands, without the overhead of diffing, for an even faster result. That is of course not as easy as it sounds, but it is more or less what lit-html attempts to do with the strategy outlined on https://dev.to/thisdotmedia/lit-html-rendering-implementatio.... This of course comes with a different overhead, so changing a `style` property directly is still an option that beats all these strategies if you are doing it sixty frames per second. The problem with VDOM is not with the technology, but with the marketing that has made us believe that such an approach is problematic and slow.
The real DOM is always manipulated. Given a state change the entire virtual DOM is generated, then diffed with the real DOM and then changed parts are put into the real DOM (= real DOM is manipulated). What I wonder is whether the reason for virtual DOM is really just historic, is there anything else that has caused its persistence other than inertia?
> then diffed with the real DOM

Diffing with the real DOM is slow; the majority of vdom libraries aren't diffing with the real DOM. As the author of a "vdom" library, I don't like to think about the "reconciler" as a diffing algorithm, because that is a useless constraint. I like to think about it as some kind of VM that uses different heuristics to map state to different operations represented as a tree data structure.

> What I wonder is whether the reason for virtual DOM is really just historic, is there anything else that has caused its persistence other than inertia?

As a thought experiment, try to imagine how you would implement features such as:

- Declarative and simple API

- Stateful components with basic lifecycle like `onDispose()`

- Context API

- Components that can render multiple root DOM nodes or DOMless components

- Inside out rendering or at least inside out DOM mounting

- Conditional rendering/dynamic lists/fragments without marker DOM nodes

Here are just some basics that you will need to consider when building a full-featured and performant web UI library. I think that you are gonna be surprised by how many libraries that make a lot of claims about their performance or that "vdom is a pure overhead" are actually really bad when it comes to dealing with complex use cases.

I am not saying that the "vdom" approach is the only efficient way to solve all these problems, or that every "vdom" library is performant (the majority of vdom libraries are also really bad with complex use cases), but it is not as simple as it looks :)

> The real DOM is always manipulated. Given a state change the entire virtual DOM is generated, then diffed with the real DOM and then changed parts are put into the real DOM (= real DOM is manipulated)

That's true, but you need to read a lot from the (V)DOM when diffing, which was said to be slow with the real DOM. I don't know to what extent, and I've read it's not true anymore.

I don't think the diff'ing is done with the real DOM, but between two VDOMs. No?

Anyway, I personally find this approach heavy and like more how Svelte patches the DOM instead of computing a diff.

We rolled an in-house/intranet framework and went a slightly different path. Our framework uses pre-built components that are served to the client in terms of js commands over websocket (similar to blazor server-side mode of operation).

Every component in the DOM has an ID and a hash. Client-side events that mutate state automatically modify this hash as well. The server keeps a dictionary of active components per client. At view-state sync time, components are first created and removed on an identity basis. Then, all existing components have their hashes compared for equality. Depending on the type, various patch commands will be submitted to the client to realign the element to the expected state.

In most cases, each component involves multiple DOM elements. By scanning for the component root elements via attributes, we can avoid having to walk through the entire literal DOM each time. This may have profound consequences for table views and other enumerables.

I have zero clue if this is the fastest/best, but it's simple as hell and starts to look like butter as you polish each standardized component.

Only real downside is the latency constraint, but this is something we can pretty easily overcome for our users with some well-placed frontend VMs. Definitely wouldn't do something like this for Netflix scale, unless someone told me the economics of 1 websocket per client works now... On average, how many DAUs per VM does Netflix run these days?

[Haskell fan boy here]

I feel like there is at least one Haskell library out there that automates this "no virtual DOM" problem. In Haskell it is not uncommon to create data structures on the fly, only to deconstruct them immediately again. If done correctly, the compiler can remove the intermediate structure completely, leaving a (recursive) algorithm.

An example of this would be sorting via binary trees. If done correctly, the intermediate tree is never (completely) present in RAM.

Check Surplus. This is exactly how it's designed, and as such it typically tops out the performance charts right next to vanilla JS.
I love seeing a bunch of people I know in a random HN post.

:wave:

I think this is what solid.js is doing. Except for the template literals - they still use JSX.
The jsx is there primarily for better typescript support and react-like api.

Internally it compiles down to template literals, so the same advantages apply.

The typescript support for lit is lagging to say the least. It is surprising that there is still no good official support for type-checking of templates. There are a few community projects though but they are not as reliable and the DOM/Web Component API doesn't make it easy to make fully type-safe APIs.

Solid, being able to lean on the JSX support in the TS compiler and not tying itself to the custom elements API, is able to offer a much better DX here.

No, solid.js builds a reactive graph at runtime and in theory should also be able to detect static inputs at runtime (not sure how much effort he put into reactive graph optimization techniques). Personally, nowadays I prefer the S.js/solid.js approach, but it has different tradeoffs - for example, it is essential to understand the difference between solid.js and React/Svelte/lit/etc :)
Your link shows an empty page with some "e is undefined" error in the console (Firefox).

Is React rendering performance really a pain point worth solving? I've never thought to myself "man I wish React would render faster"... It doesn't seem like this is whatsoever a bottleneck for our application (~1k unique components).

I feel like the main pain point with React is that routing, bundling, SSR, state management, etc. have to be painfully stapled together, and this is what frameworks like Next.js solve for.

Don't think it's a pain point, but better performance without feeling a difference (i.e. not using bubble sort for sorting but quicksort) is just... always nice?

Congrats!

Considering the pretty docs, I'm guessing you're trying to get people to use this. It would be useful to have the value proposition front-and-center (e.g. in your README).

If someone is in the React mainstream, they use React. If they like the devex of React but want something simpler/more streamlined, they use Preact. I'd appreciate a "This is why you might choose this library instead of Preact. This is why this is a new library instead of a patch to Preact. This is an honest assessment of when React/Preact is a better choice than Million." section.

There are often tradeoffs in library development. Being honest about the ones you chose helps other developers trust you.

Biggest thing for me is whether this library will get abandoned like the ones that the intro section for this library refers to. Unless the value prop is really, really good, I would stick to the bigger options for that reason alone.
Even better, for something similar to React, but much simpler and more streamlined, I turned to Svelte. As a backend dev, it was the first frontend framework that I actually stuck with.

In retrospect, I think what really confused me was that React components return HTML, which is a weird semantic I couldn't intuit even after a few attempts. Svelte lets you declare everything separately, while being more succinct and readable (IMO at least).

Probably not too relevant/helpful anymore, but the problem might be that React components don't return HTML: they're functions that return a collection of JS objects that describe what the DOM ("HTML") should look like, and React compares that to what the DOM actually looks like and applies the relevant changes. These functions get called every time something changes. Having that mental model is vital for React "clicking".
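A toy illustration of that mental model (the `h` helper below is a hypothetical stand-in for `React.createElement`; this is not React's real internals):

```javascript
// JSX compiles to function calls that return plain objects describing
// the desired DOM; nothing here touches HTML or the real DOM.
function h(type, props, ...children) {
  return { type, props: props || {}, children };
}

// JSX like <Greeting name="Ada"/> roughly becomes h(Greeting, { name: 'Ada' }).
function Greeting(props) {
  // Returns a description object, not HTML: the library later diffs it
  // against the previous description and patches the real DOM.
  return h('p', { className: 'greeting' }, `Hello, ${props.name}!`);
}

const vnode = Greeting({ name: 'Ada' });
// vnode is { type: 'p', props: { className: 'greeting' },
//            children: ['Hello, Ada!'] }
```

The key point for the mental model: these functions get re-run on every state change, and it is the returned descriptions - not HTML strings - that get compared.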
I appreciate the feedback. Will be adding to the readme!

OP, are you aware that the React team are testing their own compiler for React? Not to say your project is obsolete, just want to make you aware. The team talked about it at React Conf 2021 (see YouTube videos); it is currently named React Forget.
Seems like https://millionjs.org/benchmarks is hanging (tested on Firefox on a 4 GB RAM machine), and the page used 4.8 GB of memory.
On that note, the main page https://millionjs.org/ is weirdly janky/slow when scrolling. Doesn't make sense to me since there's barely much on it.
there's a little error in the demo, line 6:
  const [value, setValue] = useState(0);
should be:
  const [value, setValue] = useState(init);
or the parameter init could be removed from the component.
Hope you're looking at tools like RiotJS and SvelteJS as well. Riot specifically has a compile step (but you can run un-compiled while building)
Definitely going to check Riot out. I tried it around its initial release; seems really cool that they're trending toward compiled!
Riot has its quirks, but it should really get more love than it does. I found it to be a breeze to work with, and they were doing things correctly even back when the pre-rewrite versions of Angular and React were still getting it very wrong.
I've been doing a small project in SolidJS recently and really enjoying it. It is a lot easier to reason about than React. My only complaint is that the router is kind of alpha and a moving target, so the examples are now out of date with the latest version. Also, like most open source projects, the documentation is lacking in a lot of ways. That said, if you know React and you're not doing anything too complicated, I highly recommend it.
Agree wrt. Solid being easier to reason about.

Is the router you are using solid-app-router [1]? I have been working with it for the last few months and it has been generally stable (my use cases are not particularly complex though).

The docs for the solidjs core has also massively improved recently.

[1] https://github.com/solidjs/solid-app-router

How does it differ from Preact?
I hope developers with decent skills move on from the Hook-style API (I've seen at least 3 better-than-React libs still copy it). I'm 100% sure it's not going to last long; someone will come up with a better API very soon.
The only major downside with hooks is that they are sort of non-intuitive imo. Writing UIs with hooks once you grok it properly is really nice and terse. What's your beef with them?
> I hope developers with decent skills move on on Hook style api (...)

What's your opinion on Hook-style APIs?

