Danborg | 2 months ago
If a project is important enough to require C or x86 assembly, where memory management and undefined behavior have real consequences, then it’s important enough to warrant a real developer who understands every line. It shouldn’t be vibe coded at all.
Python’s “adorable concern for human problems” isn’t a bug here, it’s a feature. The garbage collection, the forgiving syntax, the interpreted nature: these create a sandbox where vibe coded solutions can fail safely. A buggy Python script throws an exception. A buggy C program gives you memory corruption or security holes that show up three deployments later.
The question isn’t what language AI should write. It’s which problems we should trust to vibe coding. The answer: problems where Python’s safety net is enough. The moment you need C’s performance or assembly’s precision, you’ve crossed into territory that demands human accountability.
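The "fails safely" contrast above can be sketched in a few lines of Python (a toy illustration, not from the thread): the same out-of-bounds write that might silently corrupt adjacent memory in C raises an immediate, catchable exception here.

```python
# Toy illustration of Python's safety net: writing one past the end of
# a list raises IndexError at the call site, whereas the equivalent C
# (buf[3] = 42 on a 3-element array) is undefined behavior that may
# corrupt memory silently and surface much later.
buf = [0, 0, 0]

try:
    buf[3] = 42  # one past the end
except IndexError as e:
    print(f"caught at the call site: {e}")  # the bug surfaces now, not three deployments later
```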
mentos|2 months ago
If not, then I see the argument for doing everything in Python, with performance coming from optimizing Python -> C.
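One everyday form of that "optimize Python -> C" move (my illustration, not mentos's) is pushing a hot loop into one of CPython's C-implemented built-ins, without leaving Python source code:

```python
import timeit

data = list(range(100_000))

def py_loop():
    total = 0
    for x in data:      # every iteration executes Python bytecode
        total += x
    return total

def c_builtin():
    return sum(data)    # the loop runs inside CPython's C implementation of sum()

assert py_loop() == c_builtin()

# On a typical CPython build the built-in version is several times faster;
# exact timings vary by machine.
print("bytecode loop:", timeit.timeit(py_loop, number=20))
print("C built-in:   ", timeit.timeit(c_builtin, number=20))
```

The same idea scales up to NumPy, Cython, or hand-written C extensions: keep the orchestration in Python, move the inner loops down to C.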
wongarsu|2 months ago
I'm more bullish on the Python -> Rust pipeline. The two languages overlap a lot in philosophy, have great interop, and have similar levels of guard rails (for multithreading, Rust even beats Python on safety). And both seem well suited to being vibe coded.
Thorrez|2 months ago
If you want a language that protects you from the largest number of problems, how about Rust? Vulnerabilities will still be possible, but at least data races won't be.
sfdlkj3jk342a|2 months ago
I disagree. I write a lot of one-off numerical simulations where something quick and dirty is useful, but performance matters and the results can be easily verified without analyzing every line of code. Python would be a terrible choice.
Maxion|2 months ago
A poorly written comment by a human wastes time. A vibe comment by an LLM wastes both time and electricity, a cost that only shows up when global warming reaches 3°C.
The question isn't whether the comment is valuable. It's whether it is ethical to waste people's time with AI slop.
This is ChatGPT's pattern.
chamomeal|2 months ago
Edit: but I empathize with the paranoia of everything being AI slop! I’m constantly scrutinizing stuff and it’s annoying
MattRix|2 months ago
It doesn’t feel like we’re very far from that point.