Seriously. Kids are going to cheat. It's already easy enough to throw the test material into an LLM, get a bunch of flash cards on the relevant content, and memorize those. I wish I had AI in college.
From watching kids slightly younger than college age adapt to the current world, I think you should be glad you didn’t have access to LLMs during your learning years.
It’s too easy to slip from the idea that you’re just going to use the LLM to generate study materials, into letting the LLM do this one homework assignment because you’re tired, and then into a routine where ChatGPT is doing everything because you’ve come to rely on it. Then students get slapped in the face with a sudden bad grade, because the exams are in person and they made it all the way to the end of the semester with A-graded homework despite very little understanding of the material.
> It’s too easy to slip from the idea that you’re just going to use the LLM to generate study materials into thinking that you’re just going to let the LLM do this
This is exactly what people who know better are figuring out with vibe coding.
It’s extremely tempting for me to ask Claude to “do this thing that would take me three hours, but you only seconds”.
Many people are coming around to the realization that while that sometimes does work great, most of the time you ARE going to spend those three hours… you’re just going to spend them fixing, debugging, and refactoring instead of writing to begin with.
We are in a new era of “no free lunch”.
I'm in an online degree program in mathematics in my forties and this temptation is very real. The LLMs have memorized every textbook and every exercise so it's easy to have the kinds of conversations that before I could only have with TAs during office hours, and skip the mental struggle.
At least in my most recent class, it's also wrecked the class discussion forums that I previously found very helpful. By the end, half the students were just slop-posting entire conceptual explanations and exercises, complete with terminology, notation, and methods different from those in the class text. So you just skip those and look for the few students you know are actually trying.
The younger generations already struggle with technology because the guts have been hidden away their whole lives. They never had to understand a directory structure or a configuration file just to get a game running.
Having an LLM would turn that up to 11. Wishing you had AI in college is like wishing you had a car to train for a marathon. It’ll help a lot, if you ignore the actual goal of the work.
I don't think it's much different from the fresh grad you interview who was clearly carried by his classmates in all his group projects.
Most of my professors in college gave boring, monotonous lectures from PowerPoint slides. They were simply going through the motions, so I likewise treated the work as a means to an end: a piece of paper to say I did the college thing. Only 3 of the dozens of professors I had didn't fit that mold, and for them I studied hard so as not to make their passion null and void.
A professor's primary job is to instill interest in their students, and AI shouldn't change that. A student without interest or passion, whether self-taught or instilled, will be mediocre at best in whatever profession they pick.
I don’t think that’s true. When I was growing up it was a very shameful thing. If it has become as common as you say, maybe we need harsher and more public censure for cheating incidents.
This is a very concerning statement given the implications of your post.
AI can be a tool for learning or a tool for passing. Only one of those is beneficial to society, and it's not the one that short-sighted students in crunch time will, on average, care about.
In order to be a good little cog in the capitalist machine, all you need is passion and interest in the subject you are pursuing. Classes not relevant to your subject (e.g. liberal arts) are mostly a waste of time for that purpose, and I would gladly have used AI-generated flash cards for them.
Memorize the things they want you to learn and move on. It's not like you're going to recall it later if you have no passion or interest in it. The only things I recall from those classes came from professors who had passion for the subject, which is why I now have a weird interest in 1920s American history.
I also wish I had AI in college. I would have used it to descramble the unintelligible utterances of the calculus lecturers who had minimal or no English language skills.
Those poor calculus lecturers are most likely required to teach in order to earn their PhD. It's unfortunate that many students don't get to learn higher-level math because of it. I was the type of student who did better when the professor was difficult but engaging.
For example, I hated English growing up and then I had a college English course with a professor who was absolutely passionate about it and made it fun. Now, I hate English a little less and could appreciate it more. We need more people like that for other subjects.
For the last two decades, YouTube (or better, MIT's OpenCourseWare) has provided instruction that sets a baseline.
I'm positive that some college lecturers fall below this baseline, but there are plenty of alternatives that a moderately motivated student could use.
Part of the problem is that the typical ~20-year-old student has little idea how to learn something, and little opinion about what their education should produce to guide them.
Using a tool to help you study isn't cheating. Using a tool to take the test for you, without regard to your own skills or knowledge of the subject under test, is.
> It's already easy enough to just throw the test material into the LLM and get a bunch of flash cards on relevant content and memorize that
LLM summarisation is broken, so I wouldn't expect them to get very far with this (see this comment on lobste.rs: https://lobste.rs/c/je7ve5).
Also, memorizing flashcards is, to some extent, actually learning the material. There's a reason Anki is popular with students.
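The reason flashcards work is spaced repetition: the interval before you see a card again grows each time you recall it successfully. As a rough illustration, here is a minimal sketch of the SM-2 algorithm that Anki's scheduler descends from (simplified; real Anki's scheduling and parameters differ):

```python
def sm2_update(quality, repetitions, interval, ease=2.5):
    """One SM-2 review step (simplified).

    quality: 0-5 self-rating of recall; returns (repetitions, interval_days, ease).
    """
    if quality < 3:
        # Failed recall: reset the card and review it again tomorrow.
        return 0, 1, ease
    # Ease factor drifts up on good answers, down on shaky ones, floor of 1.3.
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    if repetitions == 0:
        interval = 1        # first successful review: see it again in 1 day
    elif repetitions == 1:
        interval = 6        # second: in 6 days
    else:
        interval = round(interval * ease)   # after that, intervals grow geometrically
    return repetitions + 1, interval, ease

# A card answered perfectly three times in a row gets pushed further out each time:
reps, days, ease = 0, 0, 2.5
for _ in range(3):
    reps, days, ease = sm2_update(5, reps, days, ease)
# days goes 1 -> 6 -> 17
```

The point for studying is that the geometric growth concentrates review effort on the cards you keep forgetting, which is exactly the part cramming skips.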
Ultimately, however, this comes down to the 20th- and 21st-century problem of "students learning only for the test", whose critical flaws are well known:
https://matheducators.stackexchange.com/a/8203
https://www.youtube.com/watch?v=J6lyURyVz7k
Maybe it's different for higher education, but at least for my more memorization-centric high school courses (religion, science, civics), I find that I get good-enough grades by just feeding ChatGPT the test reviews, having it create Anki flashcards, making a few edits[^1], and then reviewing them for a few weeks before the test on the toilet, on the bus, before bed, etc. If the reviews are inaccurate, somebody should probably let the teacher know. So far it's been enough to bring my grades from the low-to-mid 80s to the high 90s. Spending an extra hour or two to squeeze out another 1 or 2 percentage points just doesn't seem worth it. I don't personally think it's cheating, because IMO how I decide to study for the test is of no concern to the teacher, as long as I'm not getting outside help during the test itself[^2].
A feeling I've been having a lot recently is that I have no idea why I actually want good grades in school. When I was a kid, I was told that life went:
good grades in high school -> good university -> good job -> lots of money -> being able to provide for your family
But now, it sort of feels like everything's been shaken up. Grade inflation means that good grades in high school aren't sufficient to get into university, and then you see statistics like "15% of CS grads can't find jobs", which makes me think: is university really sufficient for a good job? Then, getting requests from randos on the internet to do contract work for their start-up or whatever, at entry-level wages, purely because of my projects, with no formal CS or programming training and a grade 8 education, makes me think a university degree isn't even necessary for a good job. On the other hand, the richest people seem to be the ones who build a big start-up and get acquired, so is a good job even necessary for lots of money?
Sorry, this is rambling, but I should probably get back to work, so I'm not going to edit it.
[^1] Especially this semester, my religion teacher tends to use analogies in class that seem to be new, which messes up ChatGPT.
[^2] I feel less guilty using this method of studying for religion, specifically because in conversations with my religion teachers in the past, they've admitted to using ChatGPT to make and/or grade our tests. I know that HN people say "Oh, well, teachers are forced to use AI" or whatever, but I know that there are other teachers in my school who do not use AI.