killcoder's comments
killcoder | 1 month ago | on: The missed opportunity of constrained decoding
I had a bit too much fun with the tokenisation diagrams / animations. The raw text is provided to an Astro component, which tokenises it and forms the individual DOM elements for the tokens. I find it really hard to read 'tokenised' text, so I figured some consistent colouring would help. The 'Probabilities' component is a trivial grid, but all the other components support 'word wrap'.
I ended up writing a 'responsive design aware graph colouring solver'.
Multiple screen widths ('desktop' and 'mobile') are simulated, forming an adjacency graph of tokens that touch. Colours are greedily allocated, then optimised per page over a few hundred iterations, swapping allocations to enforce a minimum hue distance between touching tokens at those common screen sizes. The optimising value function prioritises an even distribution of colours, because that looks nicer than maximal hue difference.
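The greedy allocation step can be sketched roughly like this (hypothetical names; the real solver also simulates word wrap at the different widths and runs the swap-based optimisation pass afterwards):

```typescript
type TokenId = number;

// adjacency[i] lists the tokens that touch token i at ANY simulated screen width.
function greedyColour(
  adjacency: Map<TokenId, TokenId[]>,
  paletteSize: number
): Map<TokenId, number> {
  const colour = new Map<TokenId, number>();
  for (const [token, neighbours] of adjacency) {
    const used = new Set(
      neighbours
        .map((n) => colour.get(n))
        .filter((c): c is number => c !== undefined)
    );
    // Pick the first palette slot not already taken by a touching token;
    // the later optimisation pass is what evens out the distribution.
    let chosen = 0;
    for (let c = 0; c < paletteSize; c++) {
      if (!used.has(c)) { chosen = c; break; }
    }
    colour.set(token, chosen);
  }
  return colour;
}

// With evenly spaced hues, a colour-index gap translates directly to a hue gap.
const hueFor = (c: number, paletteSize: number) => (360 / paletteSize) * c;
```

With a palette larger than the maximum token degree, the greedy pass alone already guarantees touching tokens never share a colour; the iterations only improve how the palette is spread.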
Originally I naively output the palette styles per component, but found the CSS post-processing optimisers didn't handle that as well as I'd have thought. So I wrote a little 'CSS compiler' that takes the high-level palette and timing concepts of the animations and optimally merges rule declarations.
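As a toy illustration of the merging idea (not the actual compiler, which also understands the palette and timing concepts), selectors whose declaration blocks are identical can be collapsed into one grouped rule:

```typescript
type Declarations = Record<string, string>;

// Group selectors by a canonicalised declaration body, then emit one rule per body.
function mergeRules(rules: Map<string, Declarations>): string {
  const byBody = new Map<string, string[]>();
  for (const [selector, decls] of rules) {
    // Sort properties so identical bodies serialise identically.
    const body = Object.keys(decls)
      .sort()
      .map((prop) => `${prop}:${decls[prop]}`)
      .join(";");
    const group = byBody.get(body) ?? [];
    group.push(selector);
    byBody.set(body, group);
  }
  return [...byBody.entries()]
    .map(([body, selectors]) => `${selectors.join(",")}{${body}}`)
    .join("\n");
}
```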
The start of the post really relies on the animation occurring while fully in view, so I set up some IntersectionObservers that do the 'please scroll' text.
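A sketch of how such a hint might be wired up (the class names are hypothetical). The visibility decision is kept as a pure function, with the browser-only observer wiring guarded:

```typescript
const FULLY_IN_VIEW = 0.95; // treat ~95% visible as "fully in view"

// Pure decision: show the hint whenever the animation isn't fully on screen.
function shouldShowScrollHint(intersectionRatio: number): boolean {
  return intersectionRatio < FULLY_IN_VIEW;
}

// Browser-only wiring; skipped entirely outside a DOM environment.
if (typeof IntersectionObserver !== "undefined" && typeof document !== "undefined") {
  const hint = document.querySelector(".scroll-hint");       // hypothetical element
  const target = document.querySelector(".token-animation"); // hypothetical element
  if (hint && target) {
    const observer = new IntersectionObserver(
      (entries) => {
        for (const entry of entries) {
          hint.classList.toggle("visible", shouldShowScrollHint(entry.intersectionRatio));
        }
      },
      { threshold: [0, FULLY_IN_VIEW, 1] }
    );
    observer.observe(target);
  }
}
```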
I did my best to have it all work when JS is disabled on the client. I tried to make the 'hovering' CSS-only, but found the JS solution much more performant.
The DAG diagrams are built with the neat Needleman-Wunsch algorithm from bioinformatics. The Astro component accepts several 'examples', then aligns common subsequences, producing the CSS grid and the 'basic SVG' on the server. The responsive nature meant I had to move the final 'allow' generation to the client.
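For reference, a minimal Needleman-Wunsch global alignment over token arrays, with gaps emitted as null so aligned examples can share one column layout (a sketch of the idea, not the component's actual code):

```typescript
// Classic dynamic-programming alignment: fill a score matrix, then trace back.
function align(
  a: string[],
  b: string[]
): [Array<string | null>, Array<string | null>] {
  const GAP = -1, MATCH = 1, MISMATCH = -1;
  const score: number[][] = Array.from({ length: a.length + 1 }, () =>
    new Array<number>(b.length + 1).fill(0)
  );
  for (let i = 1; i <= a.length; i++) score[i][0] = i * GAP;
  for (let j = 1; j <= b.length; j++) score[0][j] = j * GAP;
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      score[i][j] = Math.max(
        score[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? MATCH : MISMATCH),
        score[i - 1][j] + GAP,
        score[i][j - 1] + GAP
      );
    }
  }
  // Traceback from the bottom-right corner, emitting null for gaps.
  const outA: Array<string | null> = [];
  const outB: Array<string | null> = [];
  let i = a.length, j = b.length;
  while (i > 0 || j > 0) {
    if (
      i > 0 && j > 0 &&
      score[i][j] === score[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? MATCH : MISMATCH)
    ) {
      outA.unshift(a[--i]); outB.unshift(b[--j]);
    } else if (i > 0 && score[i][j] === score[i - 1][j] + GAP) {
      outA.unshift(a[--i]); outB.unshift(null);
    } else {
      outA.unshift(null); outB.unshift(b[--j]);
    }
  }
  return [outA, outB];
}
```

Both output arrays always have the same length, which is exactly what a shared CSS grid needs.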
Some browsers seem to throttle the token animations sometimes but I haven't figured out what causes that. This is my first time leaning hard on CSS variables.
killcoder | 3 months ago | on: $96M AUD revamp of Bom website bombs out on launch
The joint state and federal government relief and cleanup package is worth AUD $102.5 million.
I hope the public receives that comparison at every opportunity.
The old website was frankly excellent, the only problem was it didn't have HTTPS support. I would have happily upgraded that part of the system for the cost of a cup of coffee if I'd had an opportunity to submit for the tender!
The new website is significantly more difficult to navigate (for me, a seasoned tech user). The primary thing Dads everywhere use it for (the weather radar) now requires scrolling to the _bottom_ of the page, and zooming in from the 'map of Australia' to the region you live in. It used to be like, a click to go from home page -> state weather radar with all the info you needed.
https://www.abc.net.au/news/2025-11-23/bureau-of-meteorology...
If you want to read our local news about it.
> [BOM] said the cost breakdown included $4.1 million for the redesign, $79.8 million for the website build, and the site's launch and security testing cost $12.6 million.
Absolutely stupid; even individually, those numbers are outrageous. They say it's part of some 'larger upgrade package', prompted by a cyber attack in 2015.
https://www.abc.net.au/news/2015-12-02/china-blamed-for-cybe...
But politicians over here love to blame cyber attacks when technical blunders happen. We had a census a couple of years ago and the website fell over due to 'unprecedented load', or maybe it was a 'DDoS attack'? The news at the time couldn't decide who to blame!
Welp, I hope this gets as much world-wide attention as possible so they can be embarrassed and do better.
killcoder | 3 months ago | on: Building more with GPT-5.1-Codex-Max
But they're claiming it's more token efficient, so me switching my usage to the new model should _free up_ capacity.
killcoder | 4 months ago | on: GPT-OSS-Safeguard
killcoder | 1 year ago | on: Show HN: Electrico – Electron Without Node and Chrome
The main process really shouldn’t be used for anything except setup. Since it controls GPU paints, amongst other things, blocking on it will cause visible stuttering and a bad user experience.
https://www.electronjs.org/blog/electron-internals-node-inte...
killcoder | 1 year ago | on: Show HN: Electrico – Electron Without Node and Chrome
https://www.electronjs.org/docs/latest/api/structures/web-pr...
Separately, the IPC lets you do zero-copy transfers in some circumstances via Transferable objects such as ArrayBuffers. Structured cloning is efficient but not zero-copy, and JSON serialisation shouldn’t be used (since structured cloning is readily available).
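The distinction can be demonstrated with Node's MessageChannel, which accepts a transfer list in the same spirit as Electron's ipcRenderer.postMessage: transferring detaches the ArrayBuffer from the sender (zero copy), while cloning leaves the sender's copy intact.

```typescript
import { MessageChannel } from "node:worker_threads";

const { port1, port2 } = new MessageChannel();

// Structured clone: the receiver gets a copy, the sender keeps its buffer.
const cloned = new ArrayBuffer(8);
port1.postMessage(cloned);

// Transfer: zero copy, but the buffer is detached from the sender.
const transferred = new ArrayBuffer(8);
port1.postMessage(transferred, [transferred]);

// The cloned buffer is untouched (8 bytes); the transferred one is now empty (0 bytes).
console.log(cloned.byteLength, transferred.byteLength);

port1.close();
port2.close();
```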
killcoder | 1 year ago | on: Show HN: Electrico – Electron Without Node and Chrome
Source: https://www.electronjs.org/docs/latest/api/structures/web-pr...
killcoder | 1 year ago | on: Show HN: Electrico – Electron Without Node and Chrome
The other main difference is Electron bundles a known set of APIs, given the known Chromium version. There’s huge variance in supported features across the embedded web views.
killcoder | 1 year ago | on: Hi-Tech Bifocals Improved My Eyesight but Made Me Look Like a Dork
killcoder | 1 year ago | on: Serious Sam handled massive amounts of enemies on 56k modem connections
killcoder | 1 year ago | on: DuckDB 1.0.0
killcoder | 1 year ago | on: Sony overturns Helldivers 2 PSN requirement following backlash
killcoder | 2 years ago | on: Benchmarking latency across common wireless links for microcontrollers
We’ll probably do a series of power consumption / range tests later on, let me know if there are any setups in particular that you’d be interested in seeing test cases for.
Raw data, firmware and post processing scripts are here on GitHub:
https://github.com/Scottapotamas/embedded-wireless-latency-e...
killcoder | 2 years ago | on: I built my own 16-Bit CPU in Excel [video]
I think there has been evolution in the underlying data computation side of things, but there are still unsolved questions about the 'visibility' of graphical node-based approaches. A node-based editor is easy to write with, but hard to read.
killcoder | 2 years ago | on: Twoslash: Markup for generating rich type information in documentation
https://electricui.com/docs/components/LineChart
https://electricui.com/docs/operators/aggregations
Our product, Electric UI, is a series of tools for building user interfaces for hardware devices on desktop. It has a DataFlow streaming computation engine for data processing which leans heavily on TypeScript's generics. It's pretty awesome to be able to have examples in our docs that correctly show the types as they flow through the system. I certainly learn tools faster when they have good autocomplete in the IDE. Twoslash helps bring part of that experience earlier in the development process, right to when you're looking at documentation.
Our site is built with GatsbyJS, the docs are a series of MDX files rendered statically, then served via Cloudflare Pages. We use the remark plugins to statically render the syntax highlighting and hover tag information, then some client-side React to display the right tooltips on hover.
We build a Twoslash environment from a tagged commit of our built TypeScript definitions, from the perspective of our default template. The Twoslash snippets as a result have all the required context built in, given they are actual compiled pieces of code. The imports we display in the docs are the actual imports used when compiling from the perspective of a user. It bothers me when docs only give you some snippet of a deeply nested structure, and you don't know where to put it. Even worse when it's untyped JS! Using Twoslash lets us avoid that kind of thing systematically.
The CI system throws errors when our docs snippets "don't compile", which is very helpful in keeping our docs up to date with our product. Nothing worse than incorrect docs!
We use React components extensively, and I'm not really happy with our prop reference tables which use Palantir's Documentalist. Our components are increasingly using complex TypeScript generics to represent behaviour. The benefits in the IDE are huge, but the relatively dumb display of type information in the prop table leaves something to be desired. I'm most likely going to replace the data for those tables with compile-time generated Twoslash queries.
My only complaints have been around absolute speed of compilation, but I haven't dug deep into improving that. I just set up a per snippet caching layer, and once files are cached, individual changes are refreshed quickly. After all, it's invoking the full TypeScript compiler, and that's its biggest feature.
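The shape of such a cache is simple: key on a content hash of the snippet and only invoke the compiler on a miss (a hypothetical sketch, not our actual build code):

```typescript
import { createHash } from "node:crypto";

type CompileFn = (source: string) => string;

// Wrap an expensive compile step with a content-addressed in-memory cache.
function cachedCompiler(compile: CompileFn) {
  const cache = new Map<string, string>();
  let misses = 0;
  return {
    run(source: string): string {
      const key = createHash("sha256").update(source).digest("hex");
      const hit = cache.get(key);
      if (hit !== undefined) return hit;
      misses++;
      const result = compile(source); // the expensive Twoslash/tsc invocation
      cache.set(key, result);
      return result;
    },
    get misses() {
      return misses;
    },
  };
}
```

In a real build the cache would be persisted to disk, so only snippets whose content actually changed get recompiled between runs.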
Overall I've been very happy with Twoslash, and I'm looking forward to refactoring to use this successor and Shikiji (the ESM successor to Shiki); hopefully it improves our performance. The new Twoslash docs look great, a huge improvement on when we started using it.
killcoder | 2 years ago | on: Mir: Strongly typed IR to implement fast and lightweight interpreters and JITs
killcoder | 2 years ago | on: Branchless Lomuto Partitioning
https://github.com/Voultapher/sort-research-rs/blob/main/wri...
Discussion here:
https://news.ycombinator.com/item?id=38528452
This post by orlp (creator of Pattern-defeating Quicksort and Glidesort) was linked in the discussion above, and I found both to be interesting.