item 43991360

turbocon | 9 months ago

Wow, what a nightmare of a non-deterministic, bug-introducing library.

Super fun idea though, I love the concept. But I’m getting chills imagining the havoc this could cause.

anilakar | 9 months ago

Didn't someone back in the day write a library that let you import an arbitrary Python function from GitHub by name alone? It was obviously meant as a joke, but with AIcolytes everywhere you can't really tell anymore...
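For flavor, the idea can be sketched with Python's import machinery: a custom meta-path finder that resolves unknown module names against "remote" source code. This is a hypothetical, offline sketch (the sources live in a hard-coded dict rather than being fetched from GitHub, and all names here are made up); it only illustrates why executing code resolved by name alone is so dangerous.

```python
import importlib.abc
import importlib.util
import sys

# Hypothetical stand-in for "fetch this function's source by name".
# Hard-coded so the example runs offline; a real (terrifying) version
# would pull this string from GitHub or an LLM.
FAKE_REMOTE_SOURCES = {
    "magic_fib": (
        "def fib(n):\n"
        "    a, b = 0, 1\n"
        "    for _ in range(n):\n"
        "        a, b = b, a + b\n"
        "    return a\n"
    ),
}

class RemoteCodeFinder(importlib.abc.MetaPathFinder, importlib.abc.Loader):
    def find_spec(self, name, path=None, target=None):
        if name in FAKE_REMOTE_SOURCES:
            return importlib.util.spec_from_loader(name, self)
        return None  # defer everything else to the normal import system

    def create_module(self, spec):
        return None  # use default module creation

    def exec_module(self, module):
        # Executing arbitrary fetched code: this line is the whole problem.
        exec(FAKE_REMOTE_SOURCES[module.__name__], module.__dict__)

sys.meta_path.insert(0, RemoteCodeFinder())

import magic_fib  # resolved by our finder, not from disk
print(magic_fib.fib(10))  # 55
```

Anything that turns a bare name into executed code this way inherits every supply-chain risk at once, which is presumably why the original was a joke.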

atoav | 9 months ago

Why not go further? Just expose a shell to the internet and let them do the coding work for you /s

rollcat | 9 months ago

Flask also started as an April 1st joke, in response to bottle.py but ever so slightly more sane. It gathered so much positive response that mitsuhiko basically had to make it into a real thing, and he later regretted some of the API choices (like global variables proxying per-request objects).
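The "global variable proxying a per-request object" pattern mentioned here can be sketched in a few lines. This is an illustrative toy using `contextvars`, not Flask's actual internals (the names `Proxy`, `request`, and `FakeRequest` are all made up for the sketch):

```python
import contextvars

# Per-context slot holding "the current request".
_current_request = contextvars.ContextVar("current_request")

class Proxy:
    """A module-level object that forwards attribute access to whatever
    per-request object is bound in the current context."""
    def __getattr__(self, name):
        return getattr(_current_request.get(), name)

request = Proxy()  # looks like a global, resolves per context

class FakeRequest:
    def __init__(self, path):
        self.path = path

def handle():
    # Application code reads the "global" as if it were the live request.
    return request.path

_current_request.set(FakeRequest("/hello"))
print(handle())  # /hello
```

The appeal is ergonomic (no threading a request object through every call); the regret is that a global whose value silently depends on ambient context is hard to reason about and to test.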

tilne | 9 months ago

Is there somewhere I can read about those regrets?

userbinator | 9 months ago

It's like automatically copy-pasting code from StackOverflow, taken to the next level.

extraduder_ire | 9 months ago

Are there any stable-output large language models? Like what Stable Diffusion does for image diffusion models.

tibbar | 9 months ago

If you use a deterministic sampling strategy for the next token (e.g., always output the token with the highest probability) then a traditional LLM should be deterministic on the same hardware/software stack.
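The point about sampling strategies can be shown with a toy next-token distribution (plain Python, no model; the logits are made up for illustration). Greedy argmax decoding always picks the same token from the same logits, while temperature-style sampling draws from the softmax distribution and varies run to run:

```python
import math
import random

def softmax(logits):
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Toy "next-token" logits; a real LLM would produce these each step.
logits = [2.0, 1.0, 0.5, -1.0]

def greedy_pick(logits):
    # Deterministic: always the highest-probability token.
    return max(range(len(logits)), key=lambda i: logits[i])

def sample_pick(logits, rng):
    # Stochastic: draws a token index from the softmax distribution.
    return rng.choices(range(len(logits)), weights=softmax(logits))[0]

rng = random.Random()
greedy = [greedy_pick(logits) for _ in range(5)]
sampled = [sample_pick(logits, rng) for _ in range(5)]
print(greedy)   # [0, 0, 0, 0, 0] -- identical every run
print(sampled)  # varies between runs
```

The hardware/software caveat in the comment matters because floating-point non-associativity across different stacks can perturb the logits themselves, so even argmax decoding is only guaranteed deterministic on an identical stack.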

3abiton | 9 months ago

Sounds like a fun way to learn effective debugging.

emporas | 9 months ago

It imports the bugs as well. No human involvement needed. Automagically.

dheera | 9 months ago

I mean, we're at the very early stages of code generation.

As with self-driving cars versus human drivers, there will be a point in the future when LLM-generated code is less buggy than human-generated code.