Every undergrad program is, as of now, passable with AI models. Some of them not only pass, but pass with flying colors.
The harsh reality is that academia as a whole needs to be revamped. The easy solution would be to revert to paper-only exams and physical attendance - but that would also exclude a ton of students. A huge number of modern students are online students, or in similar programs where you don't need to show up physically. Moreover, I don't think universities / colleges themselves want to revert, as it would mean hiring more people, spending more on buildings, etc.
So you gave the easy solution. What's the hard solution?
Honestly, the pervasiveness of LLMs looks to really erode the critical thinking of entire future generations. Whatever the solution, we need to be taking these existential threats a lot more seriously than how we treated social media (the plague before this current plague).
I hold remote interviews and I can tell when candidates use AI to answer questions, in real time with the camera on. They repeat my technical question, pause for a few seconds, their voice drops to a monotone, and they quickly recite a bulleted list of low-level technical details that sounds like a Wikipedia page. I worry that candidates will learn to act more subtly, maybe configure their LLM to respond with an anecdote around the tech in question, and practice "selling" their vocal communication.
It may get resolved on its own. These days people study to get good grades in order to prove to future gatekeepers (like employers, or higher rungs of academia) that they know the material well. Post-AGI, however, the gatekeepers may not be so interested in humans anymore, and we might not need grades at all. Studying anything could become something done exclusively for one's own interest, and the only point of a grade would be to give oneself a goal to achieve.
Alternatively, if we still want to cling on to this ritual of measuring the performance of students, you could give each and every one of them oral examinations with AI professors.
Don't even need to go that far. Provide a locked-down computer and have students write essays in a dedicated space. I have personally done that and it was a reasonably good experience.
1) Written in person exams that were most of the grade (this includes "blue book" exams where you have to sit in front of the professor and write an essay on whatever topic he writes on the board that morning as well as your typical math/algorithms tests on paper.)
2) Written homework where you have to essentially have a satisfactory discussion on the topic (no word range, you get graded on creative interpretation of the course subject matter.)
Language models could maybe help you with 2, but will actually kill your ability to handle 1 if you're cheating on homework with them. If anything, language models will mean the end of those pointless, make-work, cookie-cutter graded homework assignments that got in the way of actually studying and learning.
You're greatly exaggerating the problem. Literally any requirement can exclude students. Establishment of a trusted proctor network for administering exams is how you solve this problem. If you're an online student, you'll have to show up a few days per year to prove your knowledge in person. I believe this is how many remote study programs already work, because AI is just the latest way to cheat. You could always pay people to do your work for you before AI, and the solution is the same.
Exams were never the pinnacle of what a grad can do. They were an efficient test, under severe time constraints, that correlates well with overall ability in humans.
That AI can pass these tests doesn't mean it is as smart and capable as a grad. I mean, it might be, or if not today then in a few years, but not because it can pass exams, having digested past exams and sample solutions into its bellows.
> The easy solution would be to revert to paper-only exams and physical attendance - but that would also exclude a ton of students.
Which students?
If it's just about travel distance, maybe schools could organize themselves to offer local test centers where students could attend exams under observation. Reusing existing facilities in this way has been common in my country's education system for decades.
A renowned South African university, UNISA, has done remote learning for decades. Students had to mail in their work every month, and they would set up exam centres all over the country where students can take their tests.
This is not an unsolvable problem if handwritten work becomes a requirement.
The cheap version of learning is dead, and AI killed it.
Not that we were learning all that much to begin with. I mean, walk into any sorority and ask to see the test bank. The students and Profs were phoning it in for a while, by and large. Not all of them were though, and good on yah.
But now that the fig leaf is torn away, we're left with the Oxbridge model and not much else - small classes, under 10, likely under 5, with a grad level tutor, social pressure making sure you've done the work. The great thing about this though is that you'll have an AI listening in all the time and helping out, streamlining the busywork and allowing the group to get down to business.
But that version is very expensive. You're looking at ~$50k / student / year [0] for a baseline Oxbridge model from secondary school on up - ~$400k / student from 9th grade to university graduation.
Assume a 6% loan rate for 30 years (a mortgage, essentially), and you've got ~$2,400 in monthly payments for all your working life, ~$29k/year down the drain. How in the hell are you going to manage student loans like that and then try to live a life without a really good job? How the hell is a nation expected to pay that per kid if you make school free for them?
Cheap learning wasn't good, but it sufficed. The new models of education must answer to the fundamental question of education: How much does it cost?
[0] 2 hours 3x a week per class; 4 classes per tutor per week. Assume $100k/tutor and 5 students/tutor. So $5k / student / tutor. 4 classes / student. So $20k / student in just raw tutors. At least double that for overhead if not triple.
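The footnote's arithmetic and the loan figures can be checked with a short sketch. All inputs here are the commenter's own assumptions ($100k/tutor, 5 students and 4 classes per tutor, 2.5x overhead, $400k financed at 6% over 30 years), not real tuition or lending data:

```python
# Back-of-envelope check of the Oxbridge-model cost estimate above.
TUTOR_SALARY = 100_000       # assumed annual cost per tutor
STUDENTS_PER_CLASS = 5
CLASSES_PER_TUTOR = 4        # classes each tutor teaches per week
CLASSES_PER_STUDENT = 4
OVERHEAD_MULTIPLIER = 2.5    # "at least double ... if not triple"

# Salary spread across every student-slot the tutor serves.
cost_per_slot = TUTOR_SALARY / (STUDENTS_PER_CLASS * CLASSES_PER_TUTOR)  # $5,000
raw_tuition = cost_per_slot * CLASSES_PER_STUDENT                        # $20,000
with_overhead = raw_tuition * OVERHEAD_MULTIPLIER                        # ~$50,000/yr

# Financing ~$400k (8 years at ~$50k) like a 30-year mortgage at 6%:
principal, annual_rate, years = 400_000, 0.06, 30
r, n = annual_rate / 12, years * 12
monthly = principal * r / (1 - (1 + r) ** -n)   # standard amortization formula

print(round(with_overhead), round(monthly))     # ≈ 50000, ≈ 2398
```

Note the amortization formula gives roughly $2,400/month, i.e. about $29k/year.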
AI isn't destroying anything. Don't blame the technology for what humans do with it.
AI should allow every student to have personalized instruction and tutoring. It should be a massive win.
If everyone, instead of taking advantage of that, refuses to do any work and decides to lie and pass the AI's output off as their own, that is not something the AI did. The students did that.
> AI should allow every student to have personalized instruction and tutoring.
I admire your optimism.
Funny how everyone has their own dream of the miracles that “AI” should perform. It's just the perfect silver screen for everyone to project their wishes on.
We turned higher ed into a qualification-producing factory subsidized by the government at the expense of kids' financial futures. We overemphasized passing over learning, as the education became about the title, not the knowledge. It's not the students' fault that we created this incentive structure. The students who want to learn can still learn; those who come to higher education with a transactional mindset can now just pay for their degree. The truth is, we are at the point where the logic of the commodification of our higher education system is being taken to its logical conclusion, which is its own undoing.
Let's throw away the potential of society because young adults are lazy and AI must be empowered. Or, we could realize the realities of human behavior and INTELLIGENTLY integrate AI. But nah, fuck society/fuck young adults for having the typical young adult mentality.
Seconding this. I'm sick of seeing posts like this, because correlation =/= causation, as has been proved time and time again. It's just too easy to 'relate' these two things, and it leads to lazy writing that perpetuates this narrative, which has in no way been proven true yet.
IMO the underlying cause has much more to do with a hiring cycle issue: the boom of the low-interest / free money / I-don't-need-to-pay-for-an-office covid years is now leading to the relative hiring "bust" (even though it's not really a bust, unemployment is at 4.2%, certainly nothing out of the ordinary for the US)
Even though I majored in CompSci, I still remember my college essay class and learning about primary and secondary sources and their relative quality, how to craft an argument, and how to articulate your argument to be persuasive. Outside of just writing, those skills have been useful in other scenarios too (like when subconsciously evaluating someone else's argument)
Of course, I still treated it like a lazy college student: I did it in 2.1 or 2.2 line spacing to hit the page requirements, and flipped my thesis because it was easier to research (I started out arguing against the US invading Iraq, but found it way easier to find sources that supported an invasion... well, we all know how reliable those sources were).
The leverage has been flipped. We all had awful college classes teaching next to nothing, and now that you can get good grades without attending, what's left? "We lost critical thinking!" No, we were barely getting that in the first place. Now, classes need to be more valuable.
This is exactly it. Are we surprised that civil engineering students forced to take a humanities class satisfied by psych 101 and having to pay thousands of dollars for the 3 credit hours are cheating on their term paper?
A college friend from the University of Oxford, where students write one or two essays a week, got the top first (best mark) in his history degree. Initially impressed, I one day asked him his exam method - each student must produce 3 essays in 3 hours (or did then) across about 5 or 6 papers. My friend's approach was to thoroughly research 12 essay questions and pre-write 16-page essays for each paper, which he would then learn verbatim and trot out, word for word, whichever best fit each exam question.
This compared to my method of reading widely, learning quotes and ideas and then writing each essay fresh in the exam hall - and I would typically manage about 3-4 pages per essay. (Reader, I did not get a top first).
I relate this anecdote as I don’t really see my friend’s method as being much better than using AI. Although I do acknowledge his 16 page essays must have been reasonably good.
Your friend's approach doesn't sound like cheating; after all, he wrote the original essays.
It's more similar to spending hours preparing small exam cheat sheets, and then realizing that you didn't need them during the exam, as you had learnt the material.
> friend’s method as being much better than using AI
Why not? He wrote all the essays himself, after all, and in a setting that's much more relevant to real life vs. the artificial constraints of a shorter exam. With AI he would've written/learned nothing himself.
What if education became research? If, in the hypothetical future, the AI can answer any question about any book or scientific theory, perhaps the educational system could focus on teaching people how to come up with good ideas to research, and how to do that research effectively? Rather than making the questions about historical information more difficult, or answering them in person or writing them in bluebooks, make the process of learning about how to create new knowledge? Educators would become people who teach you how to learn, how to design questions, and how to research those questions to produce factual answers. We've known lectures have been the worst way to teach for decades. Why maintain that failed system? If the reductionist goal of the college system is a degree that certifies you as an expert in historical knowledge, maybe we can just throw that away since the AI can handle that part now, and instead certify that people know how to ask the right questions of the AI, and how to interpret their answers to create new knowledge for humanity?
Hopefully AI articles and papers can skip things like:
Alex has wavy hair and speaks with the chill, singsong cadence of someone who has spent a lot of time in the Bay Area. He and Eugene scanned the menu, and Alex said that they should get clear broth, rather than spicy, “so we can both lock in our skin care.”
Universities have been criticized for ideological indoctrination. We might be able to quantify this: the increase in use of AI to write essays should result in a weakening of this phenomenon, simply due to lower engagement with the material and reduced critical thinking, as was shown in recent studies [1][2].
AI is going to increase the value of prestige education over middle-of-the-road education.
Middle of the road colleges will not have the resources to ensure that students learn despite AI, whereas the Oxbridges, etc, will retain their tutorial systems and smaller class sizes, where AI is of no use whatsoever.
A comparable phenomenon perhaps exists in the news publishing world. It was envisaged that easy access to information would be the death of pay-to-read news. However, the huge volumes of mediocre and politically driven output that swamped the internet, airwaves, and printing presses instead increased the relative value of thoughtful and well-sourced news and writing, e.g. the FT, the Guardian, the BBC, even the New Yorker...
The question is: can a really good student who knows and understands the topic at hand write a better essay with the help of AI than a student who doesn't know the material and is just relying on AI?
I can easily tell code written by a novice programmer naively 'vibe coding' an app from code written by an experienced developer using AI to help him. Can a history professor tell the difference between a purely AI essay from one written by someone who knows what they're talking about, and is assisted by AI to make the essay better?
> I can easily tell code written by a novice programmer naively 'vibe coding' an app from code written by an experienced developer using AI to help him. Can a history professor tell the difference between a purely AI essay from one written by someone who knows what they're talking about?
Yes. That you consider this a question worth asking is a sign of your contempt for the craft of writing an essay. If an AI is that bad at mimicking expertise in your field, why shouldn't it be that bad at mimicking expertise in others' fields?
What can help you actually write? [spoiler: not LLMs]
The thing is that most college-educated adults don't really write to one another. Nor do they really read. We're now at the point of expecting college-level texts or Tweets or coffee orders. I know myself. I've seen the beast. No fat double latte please, with which I'll go about my day and forget this digression into my feelings (separate box from children or coffee).
An obvious solution is to require the use of Google docs and include the history as part of the assignment. If there is no sign of sentence restructuring then fail the assignment.
This is the equivalent of asking students to show their work when they do math problems and that is how we thwarted those evil calculators.
This is likely easily gamed by asking the LLM to provide a number of intermediate versions of the output. You still have to do some yak shaving in google docs, but nothing too hard.
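The "look for sentence restructuring in the history" idea can be sketched crudely. This assumes you can export the revision history as a list of plain-text snapshots; `restructuring_score` is an illustrative heuristic, not a real Google Docs feature, and as noted above it is defeated by an LLM that fabricates intermediate drafts:

```python
import difflib

def restructuring_score(revisions: list[str]) -> float:
    """Fraction of adjacent revision pairs that contain in-place edits
    (word replacements or deletions) rather than pure append-only growth.
    Human drafting tends to revise earlier text; paste-in-chunks does not."""
    if len(revisions) < 2:
        return 0.0
    edited_pairs = 0
    for before, after in zip(revisions, revisions[1:]):
        ops = difflib.SequenceMatcher(None, before.split(), after.split()).get_opcodes()
        if any(tag in ("replace", "delete") for tag, *_ in ops):
            edited_pairs += 1
    return edited_pairs / (len(revisions) - 1)

# An append-only history (text only ever pasted onto the end) scores 0:
pasted = ["One. ", "One. Two. ", "One. Two. Three."]
print(restructuring_score(pasted))  # 0.0
```

A history with genuine rewording (e.g. "the cat sat on the mat" revised to "a cat sat on a mat") produces `replace` opcodes and scores above zero.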
Essays might not be a great tool for providing consistent grading, but as a tool for learning to think through an idea, to structure arguments coherently, and to research your point and find counterfactuals, they are unmatched. Education __should__ optimize for learning, not grading.
But it wouldn't, so all we're left with is the loss of cheating replacing learning.
[1] https://time.com/7295195/ai-chatgpt-google-learning-school/
[2] https://www.microsoft.com/en-us/research/wp-content/uploads/...
> He then transcribed Claude’s points in his notebook, since his professor ran a screen-free classroom.