(no title)
Grothendank | 2 years ago
It works astoundingly well with poorly written technical manuals. Looking at you, CMake reference manual O_O. It also helps translate unix man pages from Neckbeardese into clean and modern speech.
With science papers it's a bit more work. You must copy section by section into GPT4, despite the increased token limit.
But sure. Here's how it can work:
1. Copy relevant sections of the paper
2. Ask questions about the jargon:
"Explain ____ like I'm 5. What is ____ useful for? Why do we even need it?"
"Ah, now I understand _____. But I'm still confused about _____. What do you mean when you say _____?"
"I'm starting to get it. One final question. What does it mean when ______?"
"I am now enlightened. Please lay down a sick beat and perform the Understanding Dance with me. *Dances*"
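The section-by-section loop above can be sketched in a few lines of Python. Note that `ask` is a hypothetical placeholder for whatever chat API you actually use (the thread doesn't name one); here it just echoes the prompt so the sketch stays self-contained.

```python
# Sketch of the section-by-section Q&A workflow described above.
# `ask` is a hypothetical stand-in for a real chat-API call;
# it simply echoes a stub reply for demonstration purposes.

def ask(prompt: str) -> str:
    return f"[model reply to: {prompt[:40]}]"

def make_jargon_prompts(term: str) -> list[str]:
    """Build the escalating question templates from the comment."""
    return [
        f"Explain {term} like I'm 5. What is {term} useful for? "
        f"Why do we even need it?",
        f"Ah, now I understand {term}. But I'm still confused. "
        f"What do you mean when you say {term}?",
        f"I'm starting to get it. One final question: "
        f"what does it mean when {term} comes up here?",
    ]

def study_section(section_text: str, jargon: list[str]) -> list[str]:
    """Paste one section of the paper, then drill into each unfamiliar term."""
    replies = [ask(f"Here is a section of a paper:\n{section_text}")]
    for term in jargon:
        for prompt in make_jargon_prompts(term):
            replies.append(ask(prompt))
    return replies
```

One call per section keeps each exchange under the token limit, which is the constraint the comment is working around.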
This actually works surprisingly well.
mszcz | 2 years ago
What you get is a teacher that never tires, is infinitely patient, has infinite time, doesn't limit questions, doesn't judge you, really listens, and has broad, multidisciplinary knowledge that's correct-ish (for when that's needed). I recently read somewhere that Stanford (?) has almost as many admin workers as it does students. Seems to me this is a really bad time to be that bloated. Makes you wonder what you're really spending your money on, whether it's worth it (yeah, I know, it's not just education you get in return), and whether you can get the same-ish effect for a lot cheaper and on your own timetable.
Not that the models, or the field, are currently in a state that would produce a good teaching experience. I can, however, imagine a not-so-distant future where this would be possible. Recently, on a whim, I asked it to produce an options trading curriculum for me. It did a wonderful job. I wouldn't have trusted it if I didn't already know a little about the subject, but I came away really impressed.