wtatum | 3 years ago
I also see that load-db.js loads the known search-index values into localStorage ... do you have some tooling for creating this file from a known SQLite base file, or was it just hand-rolled from a localStorage instance already holding a valid DB?
EDIT: Looking at the code, I found this:

```javascript
const db = new sqlite3.oo1.DB({
  filename: 'local',
  // The 't' flag enables tracing.
  flags: 'r',
  vfs: 'kvvfs'
});
```
Googling led me to the "official" SQLite WASM pages, which otherwise don't appear very prominently in search results for whatever reason. This page (https://sqlite.org/wasm/doc/trunk/about.md) seems like a good starting point, and it notes that both sql.js and absurd-sql were in fact inspirations.
zainab-ali | 3 years ago
The process for building the database is a bit complex. I want to support all browsers, so unfortunately I need to use local storage to back it. Firefox has a while to go before it supports the Origin Private File System, but once it does, the build will be a lot smoother.
I build the index as part of the site's CI (using Nix) by running SQLite WASM in Deno to pre-load local storage. I then extract the keys from local storage and populate them at site load, using the hand-rolled load-db.js file.
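The restore step described above can be sketched roughly as below. This is a hypothetical illustration, not the actual load-db.js: the key names (`kvvfs-local-sz` and friends) and the `DB_KEYS` map are assumptions standing in for whatever entries SQLite's kvvfs VFS actually writes, which would be dumped at build time.

```javascript
// Hypothetical sketch of a load-db.js-style restore step.
// DB_KEYS stands in for a build-time dump of every localStorage entry
// written by SQLite's kvvfs VFS; the key names here are illustrative only.
const DB_KEYS = {
  'kvvfs-local-sz': '4096',
  'kvvfs-local-0': '...serialized page data...',
};

// Copy the pre-built database entries into a Storage-like object.
// Returns true if the copy happened, false if a database was already present.
function loadDb(storage, keys) {
  if (storage.getItem('kvvfs-local-sz') !== null) return false;
  for (const [key, value] of Object.entries(keys)) {
    storage.setItem(key, value);
  }
  return true;
}
```

In a browser this would be called as `loadDb(window.localStorage, DB_KEYS)` before the SQLite WASM module opens the database with `vfs: 'kvvfs'`.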
SQLite WASM has better support for importing and exporting databases on OPFS, so this process should become simpler as soon as I can move to it.
I’ll write a follow-up post at some point on the implementation details.
wtatum | 3 years ago
I have a project that is starting out by building a client-side SQLite DB (currently in Tauri, though WASM SQLite would be an option as well) and that would benefit from an option to offload building really large DBs to a shared server process. Knowing that the same, or very similar, code could run in Deno opens up some really interesting possibilities.