blacha | 3 years ago

I don't think GeoJSON is a great format for anything with more than a few MB of data.

I wanted to see exactly how bad it is with a largeish dataset, so I exported the New Zealand address dataset[1] (~2.5M points) as a GeoPackage (750MB). QGIS loads this fine: it's a little slow when viewing the entire country, but when zoomed in to city level it is almost instant to pan around.

Using ogr2ogr I converted it to newline-delimited GeoJSON (2.5GB), which crashed QGIS while loading. Using shuf I created a random sample of 100,000 points as GeoJSON (~110MB); it was unbearably slow in QGIS, taking 5+ seconds per pan.

I currently use and recommend FlatGeobuf[2] for most of my working datasets, as it is super quick and doesn't need SQLite to read (e.g. in a browser).

It is also super easy to convert to/from with ogr2ogr:

ogr2ogr -f FlatGeobuf output.fgb input.geojson

[1] https://data.linz.govt.nz/layer/105689-nz-addresses/data/

[2] https://github.com/flatgeobuf/flatgeobuf

moomoo11 | 3 years ago

Can’t you use clustering techniques?