top | item 42848874

frogsRnice | 1 year ago

As opposed to wondering if the llm is hallucinating?

You have to expend a mental effort to think about your solutions anyway; I guess it’s pick your poison really.

zwnow|1 year ago

That's the issue: people just copy and paste code from LLMs thinking "yeah, looks fine to me." It might be a skill issue, but personally it takes me a while to understand the code it's giving me, and even longer to figure out how to actually implement it with all the edge cases that might happen.

anileated|1 year ago

Before: I'm a lazy developer, so I find the best libraries and abstract the logic to write the least code and do the least maintenance work.

Now: I'm a lazy developer, so I get a glorified autocomplete to write 10x more code than I have the willpower to. Of course, I won't read all of it.

joseda-hg|1 year ago

Is it important if it's occasionally hallucinating?

It's not like you should blindly throw the code in; you should run it and verify it.

The more common the work you're doing, the less likely it is to hallucinate. Plus, you can ask it to stick to whatever arbitrary coding standards you want so it's more readable to you; a rewrite to remove a wrong library takes an extra couple of seconds per method/function.

Also, it's not like Stack Overflow or other non-generated resources don't occasionally "hallucinate"; it's not weird for the second- or third-most-voted answers on SO to be followed by the comment "This doesn't work because XYZ."

skydhash|1 year ago

That's why you take a quick glance at the answer, then read the comments, then do a deeper analysis. It takes something like 10 seconds, as it seems every real answer I find that's good is usually just one or two paragraphs.