rabisg's comments
rabisg | 8 days ago | on: We rewrote our Rust WASM parser in TypeScript and it got faster
The primary motivation was speed and schema cohesion. We were running a JSON-based format, Thesys C1, in production for a year before we realized we couldn't add features fast enough because we were fighting the LLMs at multiple levels. It's probably too much to write in a comment, but we'd like to write about the motivation and all the things we tried in a separate blog post soon.
rabisg | 8 days ago | on: We rewrote our Rust WASM parser in TypeScript and it got faster
This is an alternative to json-render by Vercel or A2UI by Google, which I'm guessing the Flutter implementation is based on.
rabisg | 8 days ago | on: We rewrote our Rust WASM parser in TypeScript and it got faster
I understand your frustration with AI writing, though. We are a small team, and given our roadmap it was either use LLMs to help collate all the internal benchmark result files into a blog post or never write it, so we chose the former. This was a genuinely surprising and counterintuitive result for us, which is why we wanted to share it. Happy to clarify any of the numbers if helpful.
rabisg | 8 days ago | on: We rewrote our Rust WASM parser in TypeScript and it got faster
The most obvious approach would be to let LLMs generate code and render it, but that introduces problems like safety, UI consistency, and speed. OpenUI solves those problems and provides a safe, consistent, and token-optimized runtime for LLMs to render live UI.
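To make the contrast concrete, here's a minimal sketch of the general idea: rather than executing LLM-generated code, the model emits a small JSON UI spec that gets validated against a component allowlist before rendering. All names here are illustrative assumptions, not the actual OpenUI API.

```typescript
// Hypothetical sketch of a constrained UI runtime. The LLM emits a
// compact JSON spec instead of arbitrary code; the runtime only ever
// instantiates components from a known allowlist.

type UINode = { type: string; props?: Record<string, unknown>; children?: UINode[] };

// Safety + consistency come from the allowlist: unknown node types
// are rejected instead of executed.
const ALLOWED = new Set(["card", "text", "button", "list"]);

function validate(node: UINode): boolean {
  if (!ALLOWED.has(node.type)) return false;
  return (node.children ?? []).every(validate);
}

// Render to a plain string for illustration; a real runtime would map
// each node to a framework component instead.
function render(node: UINode): string {
  const kids = (node.children ?? []).map(render).join("");
  return `<${node.type}>${kids}</${node.type}>`;
}

const spec: UINode = {
  type: "card",
  children: [{ type: "text" }, { type: "button" }],
};

if (validate(spec)) {
  console.log(render(spec)); // <card><text></text><button></button></card>
}
```

The spec is also far cheaper in tokens than full component code, which is where the speed argument comes in.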
rabisg | 8 days ago | on: We rewrote our Rust WASM parser in TypeScript and it got faster
But yes, I see what you mean. I think people are trying to solve it with skills and harnesses at the application layer, but it's not there yet.