biwills's comments

biwills | 9 months ago | on: Figma files for proposed IPO

I disagree; many companies are still great even after going public in the last decade:

Shopify, Cloudflare, Zoom, Spotify, Roblox, and Coinbase are all notable examples.

biwills | 1 year ago | on: Why Every Programming Language Sucks at Error Handling

The biggest problem I see is that, like static/dynamic typing, it's usually a boil-the-ocean problem. Most languages have historically been either statically or dynamically typed. Only recently have TypeScript and Python allowed for migration from dynamic to static typing, introducing millions(?) of developers to static types in the process.

Errors are similar: many languages can throw them anywhere, so it's hard to trust that any given function is "safe" in terms of error handling. That's one of the reasons why `enwrap` returns a generic error alongside any other result: to support incremental adoption.

If you have a chance to check out `enwrap` and have feedback, email me! (link in bio)

biwills | 1 year ago | on: Why Every Programming Language Sucks at Error Handling

The more I write software, the more I think errors should be first-class citizens (camp #2 from the OP's post).

I've been using https://github.com/biw/enwrap (disclaimer: I wrote it) in TypeScript and have found that the overhead it introduces is well worth the safety it adds when handling and returning errors to users.

That said, I see parallels between the typed vs. untyped errors debate and the static vs. dynamic typing debate in programming languages.
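For readers unfamiliar with "errors as first-class citizens": here's a minimal sketch of the idea in TypeScript, using a discriminated-union Result type. The names (`Result`, `parsePort`, `ParseError`) are illustrative, not from any particular library; the point is that the compiler forces callers to acknowledge the failure case before touching the value.

```typescript
// Errors as values: a function's possible failures appear in its return
// type, so the type checker (not runtime surprises) enforces handling.
type Result<T, E> =
  | { kind: "ok"; value: T }
  | { kind: "err"; error: E };

type ParseError = { message: string; input: string };

// Hypothetical example function: validate a TCP port number.
function parsePort(input: string): Result<number, ParseError> {
  const n = Number(input);
  if (!Number.isInteger(n) || n < 1 || n > 65535) {
    return { kind: "err", error: { message: "invalid port", input } };
  }
  return { kind: "ok", value: n };
}

const r = parsePort("8080");
if (r.kind === "ok") {
  // TypeScript narrows r here: r.value is a number, r.error doesn't exist.
  console.log(r.value);
} else {
  // And here the error is available with its full type.
  console.error(r.error.message, r.error.input);
}
```

The parallel to static typing is direct: just as types make a function's inputs and outputs explicit, typed errors make its failure modes explicit in the signature.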

biwills | 1 year ago | on: Text Editing Hates You Too (2019)

I'd say it's less about covering all edge cases and more about showing that text editors are insanely complex: it doesn't take long to find edge cases even in text editors with tens or hundreds of millions of users.

biwills | 2 years ago | on: Faraday.dev – Connect your phone to LLMs running on your desktop

Hey HN,

Excited to share Mobile Tethering, our latest feature release on Faraday.dev. It lets you run local LLMs on your Mac/Windows computer (Linux soon) and seamlessly use them to chat with AI on mobile. Since all the heavy workloads run directly on your computer (instead of on an expensive cloud server), it's 100% free to use, and your chat data is never stored or logged in the cloud.

I'm one of the founders of Faraday.dev, so would love to hear any ideas you have on what we should build next!

__

PS: For those who've never used Faraday – it's a zero-config desktop app for creating AI characters (custom chatbots) powered by locally running LLMs. Faraday can run on CPU with only 8GB of RAM via llama.cpp by @ggerganov, and the app will automatically use your GPU to speed things up. We also have a community-driven Character Hub, text-to-speech, lorebooks, and more.
