impendia|2 months ago
The AI summary at the top was surprisingly good! Of course, the AI isn't doing anything original; instead, it created a summary of whatever written material is already out there. Which is exactly what I wanted.
Arisaka1|2 months ago
This isn't strictly a case against AI, just a case that we have a contradiction on the definition of "well informed". We value over-consumption, to the point where we see learning 3 things in 5 minutes as better than learning 1 thing in 5 minutes, even if that means being fully unable to defend or counterpoint what we just read.
I'm specifically referring to what you said: "the speaker used some obscure technical terminology I didn't know". This is due to a lack of the assumed background knowledge, which makes it hard to verify a summary on your own.
gosub100|2 months ago
So someone who wants a war, or wants Tweedledum to get more votes than Tweedledee, has an incentive to poison the well and disseminate fake content that makes it into the training set. Then there's a whole department of "safety" that has to manually untrain it so it's not politically incorrect, racist, etc. Because the whole thesis is: don't think for yourself, let the AI think for you.
impendia|2 months ago
The same would be true if there were misleading bullshit out there. In this case, though, it's hard to imagine that any nonexpert would bother writing about the topic. ("Universal torsor method", in case you're curious.)
I skimmed the AI summary in ten seconds, gained a rough idea of what the speaker was referring to, and then went back to following the lecture.
auntienomen|2 months ago
Looking this sort of thing up on the fly in lecture is a great use for LLMs. You'll lose track of the lecture if you go off to find the definition in a reference text. And you can check your understanding against the material discussed in the lecture.
lazide|2 months ago
The 3 things in 5 minutes is even worse - it’s like taking Google Maps everywhere without even thinking about how to get from point A to point B - the odds of knowing anything at all from that are near zero.
And since it summarizes the original content, it’s an even bigger issue - we never even have contact with the thing we’re putatively learning from, so it’s even harder to tell bullshit from reality.
It’s like we never even drove the directions Google Maps was giving us.
We’re going to end up with a huge number of extremely disconnected and useless people, who all absolutely insist they know things and can do stuff. :s
sam_goody|2 months ago
I looked up a medical term that is frequently misused (e.g. "retarded"), and asked Gemini to compare it with similar conditions.
Because I have enough of a background in the subject matter, I could tell what it had construed by mixing the many incorrect references in the training data with the far fewer correct ones.
I asked it for sources, and it failed to provide anything useful. But if I am going to look at sources anyway, I would be MUCH better off searching myself and reading only the sources that might actually be useful.
I was sitting with a medical professional at the time (who is not also a programmer), and he completely swallowed what Gemini was feeding him. He commented that he appreciates that these summaries let him know when he is not up to date with the latest advances, and he learned a lot from the response.
As an aside, I am not sure I appreciate that Google's profile would now associate me with that particular condition.
Scary!
rsynnott|2 months ago
It's true. I previously had no idea of the proper number of rocks to eat, but thanks to a notorious summary (https://www.bbc.com/news/articles/cd11gzejgz4o) I have all the rock-eating knowledge I need.
jennyholzer2|2 months ago
If you ask Google about news, world history, pop culture, current events, places of interest, etc., it will lie to you frequently and confidently. In these cases, the "low quality summary" is very often a completely idiotic and inane fabrication.