item 25930874

philplckthun | 5 years ago

Not to toot our own horn, but while the article mentions GraphQL with Relay / Apollo as fetching clients, urql and its normalised cache, Graphcache, have started to address several of these problems as well.

Solving the problems the article mentions around fragment best practices is on our near-term roadmap, but there are other points here that we've already worked on, which Apollo in particular has not (yet).

In my humble opinion, request batching isn't really needed with GraphQL, especially with HTTP/2 and edge caching via persisted queries; however, we do have stronger guarantees around the commutative application of responses from the server.
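
To make the edge-caching point concrete, here is a minimal sketch of the persisted-queries idea, loosely following the request shape popularised by Apollo's "automatic persisted queries" convention (the function name and endpoint are my own illustrative assumptions, not urql's API): the client sends a stable hash of the query document instead of the full text, so the request fits in a GET URL that a CDN can cache.

```typescript
import { createHash } from "crypto";

// Build a cacheable GET URL carrying a sha256 hash of the query instead of
// its full text. Identical queries always hash to the same URL, which is
// what makes edge caching work.
function persistedQueryUrl(
  endpoint: string,
  query: string,
  variables: Record<string, unknown>
): string {
  const sha256Hash = createHash("sha256").update(query).digest("hex");
  // Extension shape as used by the automatic-persisted-queries convention.
  const extensions = { persistedQuery: { version: 1, sha256Hash } };
  const params = new URLSearchParams({
    variables: JSON.stringify(variables),
    extensions: JSON.stringify(extensions),
  });
  return `${endpoint}?${params}`;
}
```

Because the URL is fully determined by the query hash and variables, two clients issuing the same query hit the same cache key at the edge, with no batching needed.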

We also have optimistic updates and a number of intuitive safeguards around how these and other updates are applied to the normalised data. Updates are applied in a pre-determined order, and optimistic updates are applied in such a way that the optimistic/temporary data can never be mixed with "permanent" data in the cache. This also prevents races: queries that would otherwise accidentally overwrite optimistic data are queued and deferred until all optimistic updates have completed, at which point they all settle in a single batch rather than one by one.
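
The layering and deferral described above can be sketched roughly like this (a toy model, not Graphcache's actual implementation; all names are illustrative): optimistic results live in separate layers stacked on top of the permanent cache, reads check the layers first, and queries are held back while any optimistic layer exists.

```typescript
type Entity = Record<string, unknown>;

class LayeredCache {
  private permanent = new Map<string, Entity>();
  // One optimistic layer per in-flight mutation, keyed by mutation id.
  private optimistic = new Map<number, Map<string, Entity>>();
  // Queries deferred while optimistic layers exist.
  private deferred: Array<() => void> = [];

  // Reads check optimistic layers first (newest wins), then permanent data,
  // so temporary data shadows but never replaces confirmed data.
  read(key: string): Entity | undefined {
    for (const layer of [...this.optimistic.values()].reverse()) {
      const hit = layer.get(key);
      if (hit !== undefined) return hit;
    }
    return this.permanent.get(key);
  }

  writeOptimistic(mutationId: number, key: string, data: Entity): void {
    if (!this.optimistic.has(mutationId)) {
      this.optimistic.set(mutationId, new Map());
    }
    this.optimistic.get(mutationId)!.set(key, data);
  }

  // When the server responds, the whole optimistic layer is discarded and
  // the real result is written to permanent data. Once no layers remain,
  // all deferred queries settle in a single batch.
  commit(mutationId: number, key: string, data: Entity): void {
    this.optimistic.delete(mutationId);
    this.permanent.set(key, data);
    if (this.optimistic.size === 0) {
      const queue = this.deferred;
      this.deferred = [];
      queue.forEach((run) => run());
    }
  }

  // Queries that could overwrite optimistic data are queued, not run.
  runQuery(run: () => void): void {
    if (this.optimistic.size > 0) this.deferred.push(run);
    else run();
  }
}
```

The key property is that `commit` drops the temporary layer wholesale instead of merging it, so optimistic data can never leak into the permanent cache, and deferred queries only ever observe settled state.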

I find this article really interesting since it seems to summarise a lot of the efforts that we've also identified as "weaknesses" in normalised caching and GraphQL data fetching, and common problems that come up during development with data fetching clients that aren't aware of these issues.

Together with React and its (still experimental / upcoming) Suspense API, it's actually rather easy to build consistent loading experiences as well. The same goes for Vue 3's Suspense boundaries.

Edit: All this being said, in most cases Relay actually does a great job on most of the criticism that the author lays out here. So if the only complaint a reader picks up on is the DX around fragments and nothing else applies, that once again shows how solid Relay can be as well.
