top | item 44296870

olmo23 | 8 months ago

Where does the knowledge come from? People can only post to SO if they've read the code or the documentation. I don't see why LLMs couldn't do that.

nobunaga | 8 months ago

ITT: people who think LLMs are AGI and can produce output out of thin air or by doing research. Go speak with someone who is actually an expert in this field about how LLMs work and why the training data is so important. I'm amazed that people in the CS industry talk like they know everything about a tech after merely using it, without ever having written a line of code for an LLM. Our industry is doomed with people like this.

usef- | 8 months ago

This isn't about being AGI or not, and it's not "out of thin air".

Modern implementations of LLMs can "do research" by performing searches (whose results are fed into the context), or in many code editors/plugins, the editor will index the project codebase/docs and feed relevant parts into the context.
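The pattern described here is retrieval into the context window. A minimal sketch, with `web_search` as a hypothetical stand-in for a real search API (a production system would call a search service and then an LLM endpoint):

```python
# Sketch of "search results fed into the context". `web_search` is a
# hypothetical stub; a real implementation would query a search API.

def web_search(query: str) -> list[str]:
    # Stand-in: returns canned snippets instead of live results.
    return [
        "Snippet: release notes for the library in question.",
        "Snippet: relevant section of the project documentation.",
    ]

def build_prompt(question: str) -> str:
    # Retrieved snippets are prepended so the model can answer from
    # them rather than only from whatever was in its training data.
    snippets = web_search(question)
    context = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    return (
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer using the context above."
    )

print(build_prompt("What changed in the latest release?"))
```

Code editors that index a project do the same thing, just with the codebase and docs as the corpus instead of the web.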

My guess is they were either using the LLM from a code editor, or one of the many LLMs that do web searches automatically (i.e. all of the popular ones).

They are already answering non-StackOverflow questions every day.

planb | 8 months ago

I think the time has come to stop meaning just LLMs when we talk about AI. An agent with web access can do much more and hallucinates far less than the bare model. We should see the model as one building block of an AI system.
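The "model as a building block" idea can be sketched as a loop where the model may request a tool before answering. Here `fake_model` and `search` are hypothetical stand-ins for an LLM API and a web search tool:

```python
# Sketch of an agent loop built around a model. `fake_model` and
# `search` are hypothetical stubs, not a real LLM or search API.

def fake_model(messages: list[dict]) -> dict:
    # A real system would call an LLM endpoint here. This stub asks
    # for a search first, then answers once results are in the
    # transcript.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool": "search", "arg": "current facts"}
    return {"answer": "Answered using fresh search results."}

def search(query: str) -> str:
    # Stand-in for a web search tool.
    return f"results for: {query}"

def run_agent(question: str, max_steps: int = 3) -> str:
    messages = [{"role": "user", "content": question}]
    for _ in range(max_steps):
        out = fake_model(messages)
        if "answer" in out:
            return out["answer"]
        # Execute the requested tool and feed the result back into
        # the transcript, then let the model try again.
        messages.append({"role": "tool", "content": search(out["arg"])})
    return "gave up"

print(run_agent("What happened today?"))
```

The loop, not the model, is what grounds answers in fresh data; the model only decides when to call the tool and how to use its output.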

raincole | 8 months ago

> LLM has come up with out of thin air

People don't think that. Especially not the commenter you replied to. You're human-hallucinating.

People think LLMs are trained on raw documents and code beyond StackOverflow, which is very likely true.