top | item 31747368

lo5 | 3 years ago

This has nothing to do with the GoG.

This applies to any charting library that forces you to provide both the spec and unaggregated data to memory/CPU-constrained clients (e.g. JavaScript in the browser). This is done for implementation simplicity (Vega, for example), but obviously doesn't scale to larger datasets.

I've implemented a system where the data part of the spec is munged in-database, and aggregated data is provided to the browser, along with hints for axes, scales, legends, etc. It requires a part of the GoG interpreter to be resident on the server-side.
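To make the idea concrete, here's a minimal sketch of that server-side step. The spec shape, the function name, and the field names are all hypothetical; SQLite stands in for the real database. The point is only that the aggregation named in the spec is pushed down to SQL, and the browser receives aggregated rows plus axis/scale hints instead of raw data:

```python
import sqlite3

def aggregate_for_chart(conn, spec):
    """Translate the 'data' part of a chart spec into a GROUP BY query,
    so only aggregated rows (plus axis hints) ever reach the browser."""
    x, y, agg = spec["x"], spec["y"], spec["aggregate"]
    sql = (f"SELECT {x}, {agg}({y}) FROM {spec['table']} "
           f"GROUP BY {x} ORDER BY {x}")
    rows = conn.execute(sql).fetchall()
    ys = [r[1] for r in rows]
    return {
        "rows": rows,  # one row per group, not per raw record
        "hints": {
            "x_domain": [r[0] for r in rows],   # categorical scale
            "y_domain": [min(ys), max(ys)],     # continuous scale extent
        },
    }

# demo with an in-memory table standing in for the warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 10), ("east", 30), ("west", 5)])
spec = {"table": "sales", "x": "region", "y": "amount", "aggregate": "SUM"}
result = aggregate_for_chart(conn, spec)
```

In a real system you'd build the SQL with proper identifier quoting rather than string formatting, but the shape of the payload (aggregated rows plus scale hints) is the part that matters.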

infinite8s | 3 years ago

That sounds very similar to VizQL (the visualization/data processing language underlying Tableau). That has been my big complaint about most visualization libraries: there is no sharing of the underlying data set across multiple projections of the same large data set. Grid/table libraries have the same issue.

lo5 | 3 years ago

Yes. Tableau would have to separate rendering from data select/filter/aggregation, especially because integrating live with customer databases is a key use case. Hence the built-in buffet of connectors/drivers.

It looks like with later versions they switched to kind of a hybrid approach (part-remote, part-local) with Hyper to reduce latency for interactivity.

> there is no sharing of the underlying data set for multiple projections across the same large data set

But that would require some kind of open standard for portability, no?