I won't ever put my name on something written by an LLM, and I will blacklist any site or person I see doing it. If I want to read LLM output, I can prompt it myself; subjecting me to it and passing it off as your own is disrespectful.
As the author says, there will certainly be a number of people who decide to play with LLM games or whatever, and content farms will get even more generic while having fewer writing errors, but I don't think that the age of communicating thought, person to person, through text is "over".
It's easy to output LLM junk, but I and my colleagues are doing a lot of incredible work that simply isn't possible without LLMs involved. I'm not talking a 10 turn chat to whip out some junk. I'm talking deep research and thinking with Opus to develop ideas. Chats where you've pressure tested every angle, backed it up with data pulled in from a dozen different places, and have intentionally guided it towards an outcome. Opus can take these wildly complex ideas and distill them down into tangible, organized artifacts. It can tune all of that writing to your audience, so they read it in terms they're familiar with.
Reading it isn't the most fun, but let's face it - most professional reading isn't the most fun. You're probably skimming most of the content anyways.
Our customers don't care how we communicate internally. They don't care if we waste a bunch of our time rewriting perfectly suitable AI content. They care that we move quickly on solving their problems - AI lets us do that.
I assume if someone used an LLM to write for them that they must not be comfortably familiar with their subject. Writing about something you know well tends to come easily and is usually enjoyable. Why would you use an LLM for that, and how could you be okay with its output?
Axios got traction because it heavily condensed news into more scannable content for the twitter, insta, Tok crowd.
So AI is this on massive steroids. It is unsettling but it seems a recurring need to point out that across the board many of "it's because of AI" things were already happening. "Post truth" is one I'm most interested in.
AI condenses it all on a surreal and unsettling timeline. But humans are still humans.
And to me, that means that I will continue to seek out and pay for good writing like The Atlantic. btw I've enjoyed listening to articles via their auto-generated NOA AI voice thing.
Additionally, not all writing serves the same purpose. The article makes these sweeping claims about "all of writing". Gets clicks I guess, but to the point, most of why and what people read is toward some immediate and functional need. Like work, like some way to make money, indirectly. Some hack. Some fast-forwarding of "the point". No wonder AI is taking over that job.
And then there's creative expression and connection. And yes I know AI is taking over all the creative industries too. What I'm saying is we've always been separating "the masses" from those that "appreciate real art".
> Additionally, not all writing serves the same purpose.
I think this is a really important point and to add on, there is a lot of writing that is really good, but only in a way that a niche audience can appreciate. Today's AI can basically compete with the low quality stuff that makes up most of social media; it can't really compete with higher quality stuff targeted to a general audience, and it's still nowhere close to some more niche classics.
An interesting thought experiment is whether it's possible that AI tools could write a novel that's better than War and Peace. A quick google shows a lot of (poorly written) articles about how "AI is just a machine, so it can never be creative," which strikes me as a weak argument way too focused on a physical detail instead of the result. War and Peace and/or other great novels are certainly in the training set of some or all models, and there is some real consensus about which ones are great, not just random subjective opinions.
I kind of think... there is still something fundamental that would get in the way, but that it is still totally achievable to overcome that some day? I don't think it's impossible for an AI to be creative in a humanlike way, they don't seem optimized for it because they are completely optimized for the sort of analytical mode of reading and writing, not the creative/immersive one.
I have this theory that the post-truth era began with the invention of the printing press and gained iteratively more traction with each revolution in information technology.
Same. New yorker is the other mag I subscribed to.
Until 3 weeks ago I had a high cortisol inducing morning read: nyt, wsj, axios, politico. I went on a weeklong camping trip with no phone and haven't logged into those yet. It's fine.
"Is Claude Code junk food, though? ... although I have barely written a line of code on my own, the cognitive work of learning the architecture — developing a new epistemological framework for “how developers think” — feels real."
Might this also apply to learning about writing? If I have barely written a line of prose on my own, but spent a year generating a large corpus of it aided by these fabulous machines, might I also come to understand "how writers think"?
I love the later description of writing as a "special, irreplaceable form of thinking forged from solitary perception and [enormous amounts of] labor", where “style isn’t something you apply later; it’s embedded in your perception" (according to Amis). Could such a statement ever apply to something as crass as software development?
Thank you, this sort of insight is exactly why I've felt such kinship with what software engineers like Karpathy and Simon Willison have been writing lately. It seems obvious to me that there is something special and irreplaceable about the thought processes that create good code.
However, I think there is also something qualitatively different about how work is done in these two domains.
Example: refactoring a codebase is not really analogous to revising a nonfiction book, even though they both involve rewriting of a sort. Even before AI, the former used far more tooling and automated processes. There is, e.g., no ESLint for prose which can tell you which sentences are going to fail to "compile" (i.e., fail to make sense to a reader).
The special taste or skillset of a programmer seems to me to involve systems thinking and tool use in a different way than the special taste of a writer, which is more about transmuting personal life experiences and tacit knowledge into words, even if tools (word processor) and systems (editors, informants, primary sources) are used along the way.
Sort of half formed ideas here but I find this a really rich vein of thought to work through. And one of the points of my post is that writing is about thinking in public and with a readership. Many thanks for helping me do that.
I don't have a good answer to your question, but I do think it might be comparable, yes. If you had good taste about what to get Opus 4.6 to write, and kept iterating on it in a way that exposes the results to public view, I think you'd definitely develop a more fine grained sense of the epistemological perspective of a writer. But you wouldn't be one any more than I'm a software developer just because I've had Claude Code make a lot of GitHub commits lately (if anyone's interested: https://github.com/benjaminbreen).
> Could such a statement ever apply to something as crass as software development?
Absolutely. I think like a Python programmer, a very specific kind of Python programmer after a decade of hard lessons from misusing the freedom it gives you in just about every way possible.
I carry that with me in how I approach C++ and other languages. And then I learned some hard lessons in C++ that informed my Python.
The tools you have available definitely inform how you think. As your thinking evolves, so does your own style. It's not just the tool, mind, but also the kinds of things you use it for.
I overheard a conversation between a uni professor and a PhD student the other day. The professor was complaining that 99% of his students use ChatGPT to write essays. He seemed genuinely distressed about the effect this was having on all of them.
I can't wait for the reverse effect to happen, where everyone starts sounding like a large language model... a true singularity where AI colonizes the noosphere instead of Earth.
I don't really remember Claude 3.5 doing this, but it seems increasingly worse, with 4.6 being so bad I don't like using it for brainstorming. My shitty idea isn't "genuinely elegant".
Your sample sounds exactly like an LLM. (If you wrote it yourself, kudos.)
But, it needn't sound like this. For example, I can have Opus rewrite that block of text into something far more elegant (see below).
It's like everyone has a new electric guitar with the cheapo included pedal, and everyone is complaining that their instruments all sound the same. Well, no shit. Get rid of the freebie cheapo pedal and explore some of the more sophisticated sounds the instrument can make.
----
There is a particular cadence that has become unmistakable: clipped sentences, stacked like bricks without mortar, each one arriving with the false authority of an aphorism while carrying none of the weight. It is not merely tedious or disjointed; it is something closer to uncanny, a fluency that mimics the shape of human thought without ever inhabiting it.
Set this against writing that breathes, prose with genuine rhythm, with the courage to sustain a sentence long enough to discover something unexpected within it, and the difference is not subtle. It is the difference between a voice and an echo, between a face and a mask that almost passes for one.
What masquerades as wisdom here is really only pattern. What presents itself as professionalism is only smoothness. And what feels, for a fleeting moment, like originality is simply the recombination of familiar gestures, performed with enough confidence to delay recognition of their emptiness.
The frustration this provokes is earned. There is something genuinely dispiriting about watching institutions reach for the synthetic when the real thing, imperfect, particular, alive, remains within arm's length. That so many have made this choice is not a reflection on the craft of writing. It is a reflection on the poverty of attention being paid to it.
And if all of this sounds like it arrives at a convenient conclusion, one that merely flatters the reader's existing suspicion, well, perhaps that too is worth sitting with a moment longer than is comfortable.
----
(prompt used: I want you to revise [pasted in your text], making it elegant and flowing with a mature literary-style. The point of this exercise is to demonstrate how this sample text -- held up as an example of the stilted LLM style -- can easily be made into something more beautiful with a creative prompt. Avoid gramatical constructions that call for m-dashes.)
> Anyone who has led a class discussion — much less led students on a tour of Egypt or Okinawa, as my colleagues regularly do — knows that there is a huge gap between solo learning online and collective learning in meat space
One thing this author misses, which I fear, is that it may become less important in the eyes of stakeholders to educate the masses when they have LLMs to do jobs instead. That is, it is fully possible that one of the futures we may see is one where education goes down as it is perceived as not important for most. Yes, meat space education may be better, but who decides if it is necessary?
Maybe vocational schools become more important instead? Jobs where you, for all intents and purposes, build out the infrastructure for the tertiary industry, mostly automated by LLMs.
You may disagree with this, but the key here is to realize that even if we disagree, others don't. Education is also power; there's a perverse incentive to avoid educating people and instead feed them your narrative of how the world works. We may very well be on the way toward a Buy n Large-style future.
As much as the general public seems to be turning against AI, people only seem to care when they're aware it's AI. Those of us intentionally aware of it are better tuned to identify LLM-speak and generated slop.
Most human writing isn't good. Take LinkedIn, for example. It didn't suddenly become bad because of LLM-slop posts - humans pioneered its now-ubiquitous style. And now even when something is human-written, we're already seeing humans absorb linguistic patterns common to LLM writing. That said, I'm confident slop from any platform with user-generated content will eventually fade away from my feeds because the algorithms will pick up on that as a signal. (edit to add from my feeds)
What concerns me most is that there's absolutely no way this isn't detrimental to students. While AI can be a tool in STEM, I'm hearing from teachers among family and friends that everything students write is from an LLM.
Leaning on AI to write code I'd otherwise write myself might be a slight net negative on my ability to write future code - but brains are elastic enough that I could close an n month gap in 1/2n months time or something.
From middle school to university, students are doing everything for the first time, and there's no recovering habits or memories that never formed in the first place. They made the ACT easier 2 years ago (reduced # of questions) and in the US the average score has set a new record low every year since then. Not only is there no clear path to improvement, there's an even clearer path to things getting worse.
I spent several years trying to get ground truth out of digital medical records and I would draw this parallel to AI slop:
With traditional medical records, you could see what the practitioner did and covered because only that was in the record.
With computerized records, the intent, thought process, most signal you would use to validate internal consistency, was hidden behind a wall of boilerplate and formality that armored the record against scrutiny.
Bad writing on LinkedIn is self-evident. Everything about it stinks.
AI slop is like a Trojan Horse for weak, undeveloped thoughts. They look finished, so they sneak into your field of view and consume whatever additional attention is required to finally realize that despite the slick packaging, this too is trash.
So “AI slop,” in this worldview, is a complaint that historical signals of quality based simply on form are no longer useful gatekeepers for attention.
Did we lose something when we invented the calculator and stopped teaching the times table in schools? There have been millions of words discussing this, and the general consensus amongst us crusty old folks was that yes, the times table was useful and losing the ability to do mental arithmetic easily would be bad.
Turns out we were wrong. Everyone carries a calculator now on their phone, even me. Doing simple maths is a matter of moments on the calculator app, and it's rare that I find myself doing the mental arithmetic that used to be common.
I can't remember phone numbers any more. I used to have a good 50+ memorised, now I can barely remember my own. But the point is that I don't need to any more. We have machines for that.
Do we need to be able to write an essay? I have never written one outside of an educational context. And no, this post does not count as an essay.
I was expelled from two kindergartens as a kid. I was finally moved to a Montessori school where they taught individually by following our interests, where I thrived. Later, I moved back into a more conventional educational environment and I fucking hated every minute of it. I definitely learned despite my education not because if it. So if LLMs are about to completely disrupt education then I celebrate that. This is a good thing. Giving every kid a personal tutor that can follow their interests and teach them things that they actually want to learn, at the pace they want to learn them, is fucking awesome.
I wonder whether we will see a shift back toward human generated, organic content, writing that is not perfectly polished or exhaustively articulated. For an LLM, it is effortless to smooth every edge and fully flesh out every thought. For humans, it is not.
After two years of reading increasing amounts of LLM generated text, I find myself appreciating something different: concise, slightly rough writing that is not optimized to perfection, but clearly written by another human being
If LLMs presently aren't capable of matching the style quirks you're describing, isn't it likely they'll be able to in the near future? To me this feels like a problem that'll either need to be addressed legally or left to authors to somehow convince their audiences to trust that their work is their own.
About the article that's referenced in the beginning - the sentiment presented in it honestly sounds like the AI version of cryptocurrency euphoria just as the bubble burst. "You are not ready for what's going to happen to the economy", "crypto will replace tradfi, experts agree". The article is sitting at almost 100M views after just a week and has strong FOMO vibes. To be honest, it's very conflicting for me to believe that, because I've been using AI and compared to crypto, it doesn't just feel like magic, it also does magic. However, I can't help but think of this parallel and the possibility that the AI bubble could right now be starting to stall/regress. The only problem is that I just don't see how such a scenario would play out, given how good and useful these tools are.
Something I realised a while ago: everyone can write and that makes it very hard to stand out as a writer, and make a career out of writing. Same thing with singing more or less (although it's harder to sing well than write at all well).
I think I realised that while reading Harry Potter. To be fair the writing in the books is abysmally bad. It's written by an adult woman but it comes across as the writing of a 14 year old child, and that's to be charitable.
And it doesn't matter one bit. It still became the best-selling book in history with 600 million copies sold worldwide (as Wikipedia tells me). That's not to say that there aren't many hundreds, possibly even thousands of better written series, even in the Young Adult space. There are. But they're not that successful.
Why? I guess because good writing doesn't matter so much as what's being written. And I guess that also doesn't matter that much. You just have to connect somehow, be in the right place at the right time, when the need to read a certain piece of writing sort of emerges naturally as a result of whatever forces shape ambient taste.
Who knows. But most people wouldn't know what good writing looks like any more than they could write well themselves, so it's obvious that the ability to write well is overrated.
And so now we have LLMs generating prose and that's what we'll be reading henceforth. I think it will be gradual, but it's unavoidable. One day nobody will read anything anyone else has written anymore. Why do that, when you can just ask an LLM to generate whatever you want to read?
I think people hate AI generated writing more than they like human curated writing. At the same time, I find that people like AI content more than my writing. I write, comment, and blog in many different places and I notice that my AI generated content does much better in terms of engagement. I'm not a writer, I code, so it might be that my writing is not professional. Whereas my code-by-hand still edges out against AI.
We need to value human content more. I find that many real people eventually get banned while the bots are always forced to follow rules. The Dead Internet hypothesis sounds more inevitable under these conditions.
Indeed we all now have a neuron that fires every time we sense AI content. However, maybe we need to train another neuron that activates when content is genuine.
How do you know if your engagement was by real humans or not? I'd also assume bot traffic is way more accepted on platforms like Facebook, Instagram, and Twitter. Especially any Meta owned platform, they have a history of lying to people about numbers and were never punished for it:
I agree with the assessment that pure writing (by a human) is over. Content is going to matter a lot more.
It's going to be tough for fiction authors to break through. Sadly, I don't think the average consumer has sufficiently good taste to tell when something is genuinely novel. People often prefer the carefully formulated familiar garbage over the creative gems; this was true before AI and, IMO, will continue to be true after AI. This is not just about writing, it's about art in general.
There will be a subset of people who can see through the form and see substance and those will be able to identify non-AI work but they will continue to be a minority. The masses will happily consume the slop. The masses have poor taste and they're more interested in "comfort food" ideas than actually novel ideas. Novelty just doesn't do it for them. Most people are not curious, new ideas don't interest them. These people will live and breathe AI slop and they will feel uncomfortable if presented with new material, even if wrapped in a layer of AI (e.g. human-written core ideas, rewritten by AI).
I feel like that about most books, music and pop culture in general; it was slop and it will continue to be slop... It was the same basic ideas about elves, dragons, wizards, orcs, kings, queens, etc... Just reorganized and mashed with different overarching storylines "a difficult journey" or "epic battles" with different wording.
Most people don't understand the difference between pure AI-generated content (seeded by a small human input) and human-generated content which was rewritten by AI (seeded by a large human input) because most people don't care about and never cared about substance. Their entire lives may be about form over substance.
That is a shallow piece of the new genre: I am a concerned academic who nevertheless uses these new tools to create vibe coded slop and has to tell the world about it.
Everything is inevitable but my own job is secure. Have I already told you how concerned I am?
No novelty. No intellectual challenge. No spirit. Just AI advertisements! /s
I had this worry at first, but at this point we have hundreds of years of books written using legacy methods. The best of what was possible already exists; it's time for a change.
In the near future we will not even need to read anyway.
For hundreds of years we've avoided eating rocks, just based on so-called "conventional wisdom". Witness all the problems we now have in the world. Well I, for one, am ready for a change. It's time to do things differently. If you're fed up with the status quo, it's time to start eating rocks.
botusaurus|11 days ago
just like when you go to a restaurant to have a chef cook for you when you can cook yourself
girvo|11 days ago
While the same people in the same comments say it’s fine to replace programming with it
When pressed they talk about creativity, as if software development has none…
raincole|11 days ago
I'm still waiting for a famous person to say this so we can have a name for this psychological phenomenon.
hackit2|11 days ago
I have my own personal reservation about it all.
AstroBen|11 days ago
You know the one.
Choppy. Fast. Saying nothing at all.
It's not just boring and disjointed. It's full-on slop via human-adjacent mimicry.
Let’s get very clear, very grounded, and very unsentimental for a moment.
The contrast to good writing is brutal, and not in a poetic way. In a teeth-on-edge, stomach-dropping way. The dissonance is violent.
Here's the raw truth:
It’s not wisdom. It’s not professional. It’s not even particularly original.
You are very right to be angry. Brands picking soulless drivel over real human creatives.
And now we finish with a pseudo-deep confirmation of your bias.
---
Before long everyone will be used to it and it'll evoke the same eugh response
Sometimes standing out or quality writing doesn't actually matter. Let AI do that part.
getnormality|11 days ago
Does the fact that a machine can ape it so easily somehow reveal its vacuousness in a way that wasn't obvious already?
I keep hearing people with job titles like "SEO growth hacker" saying it's depressing that AI can do their jobs better than they can.
Really? That's the depressing part?
lurquer|11 days ago
Your sample sounds exactly like an LLM. (If you wrote it yourself, kudos.)
But, it needn't sound like this. For example, I can have Opus rewrite that block of text into something far more elegant (see below).
It's like everyone has a new electric guitar with the cheapo included pedal, and everyone is complaining that their instruments all sound the same. Well, no shit. Get rid of the freebie cheapo pedal and explore some of the more sophisticated sounds the instrument can make.
----
There is a particular cadence that has become unmistakable: clipped sentences, stacked like bricks without mortar, each one arriving with the false authority of an aphorism while carrying none of the weight. It is not merely tedious or disjointed; it is something closer to uncanny, a fluency that mimics the shape of human thought without ever inhabiting it.
Set this against writing that breathes, prose with genuine rhythm, with the courage to sustain a sentence long enough to discover something unexpected within it, and the difference is not subtle. It is the difference between a voice and an echo, between a face and a mask that almost passes for one.
What masquerades as wisdom here is really only pattern. What presents itself as professionalism is only smoothness. And what feels, for a fleeting moment, like originality is simply the recombination of familiar gestures, performed with enough confidence to delay recognition of their emptiness.
The frustration this provokes is earned. There is something genuinely dispiriting about watching institutions reach for the synthetic when the real thing, imperfect, particular, alive, remains within arm's length. That so many have made this choice is not a reflection on the craft of writing. It is a reflection on the poverty of attention being paid to it.
And if all of this sounds like it arrives at a convenient conclusion, one that merely flatters the reader's existing suspicion, well, perhaps that too is worth sitting with a moment longer than is comfortable.
----
(prompt used: I want you to revise [pasted in your text], making it elegant and flowing with a mature literary style. The point of this exercise is to demonstrate how this sample text -- held up as an example of the stilted LLM style -- can easily be made into something more beautiful with a creative prompt. Avoid grammatical constructions that call for m-dashes.)
vpribish|11 days ago
and at the same time the chop becomes long-form slop, stretching out a little seed of a human prompt into a sea of inane prose.
petterroea|11 days ago
One thing the author misses, and which I fear, is that educating the masses may become less important in the eyes of stakeholders once they have LLMs to do the jobs instead. That is, it is entirely possible that one of the futures we may see is one where education declines because it is perceived as unimportant for most people. Yes, meatspace education may be better, but who decides whether it is necessary?
Maybe vocational schools become more important instead? Jobs where you, for all intents and purposes, build out the infrastructure for the tertiary industry, mostly automated by LLMs.
You may disagree with this, but the key is to realize that even if we disagree, others don't. Education is also power; there is a perverse incentive to avoid educating people and to feed them your narrative of how the world works instead. We may very well be on the way towards a Buy-n-Large-style future.
ayoung5555|11 days ago
Most human writing isn't good. Take LinkedIn, for example. It didn't suddenly become bad because of LLM-slop posts - humans pioneered its now-ubiquitous style. And now even when something is human-written, we're already seeing humans absorb linguistic patterns common to LLM writing. That said, I'm confident slop from any platform with user-generated content will eventually fade away from my feeds because the algorithms will pick up on that as a signal. (edit to add from my feeds)
What concerns me most is that there's absolutely no way this isn't detrimental to students. While AI can be a tool in STEM, I'm hearing from teachers among family and friends that everything students write is from an LLM.
Leaning on AI to write code I'd otherwise write myself might be a slight net negative on my ability to write future code, but brains are elastic enough that I could close an n-month gap in something like n/2 months.
From middle school to university, students are doing everything for the first time, and there's no recovering habits or memories that never formed in the first place. They made the ACT easier 2 years ago (reduced # of questions) and in the US the average score has set a new record low every year since then. Not only is there no clear path to improvement, there's an even clearer path to things getting worse.
unyttigfjelltol|11 days ago
With traditional medical records, you could see what the practitioner did and covered because only that was in the record.
With computerized records, the intent, the thought process, and most of the signal you would use to validate internal consistency were hidden behind a wall of boilerplate and formality that armored the record against scrutiny.
Bad writing on LinkedIn is self-evident. Everything about it stinks.
AI slop is like a Trojan Horse for weak, undeveloped thoughts. They look finished, so they sneak into your field of view and consume whatever additional attention is required to finally realize that despite the slick packaging, this too is trash.
So "AI slop," in this worldview, is a complaint that historical signals of quality based purely on form are no longer useful gatekeepers for attention.
marcus_holmes|11 days ago
Turns out we were wrong. Everyone carries a calculator now on their phone, even me. Doing simple maths is a matter of moments on the calculator app, and it's rare that I find myself doing the mental arithmetic that used to be common.
I can't remember phone numbers any more. I used to have a good 50+ memorised, now I can barely remember my own. But the point is that I don't need to any more. We have machines for that.
Do we need to be able to write an essay? I have never written one outside of an educational context. And no, this post does not count as an essay.
I was expelled from two kindergartens as a kid. I was finally moved to a Montessori school where they taught individually by following our interests, and I thrived. Later, I moved back into a more conventional educational environment and I fucking hated every minute of it. I definitely learned despite my education, not because of it. So if LLMs are about to completely disrupt education, then I celebrate that. This is a good thing. Giving every kid a personal tutor that can follow their interests and teach them things that they actually want to learn, at the pace they want to learn them, is fucking awesome.
lurquer|11 days ago
the same people telling us that "Finnegans Wake" (written in the style of a fifth-grader with a brain injury) is 'art'...
the same people telling us the poetry of Maya Angelou (written in the style of a fifth-grader with a brain injury and self-esteem issues) is 'art'...
the same people telling us that the works of Jackson Pollock, Mark Rothko, Piet Mondrian, etc., etc. are 'art'...
seem to be the ones complaining the most about AI generated content.
mkehrt|10 days ago
submeta|11 days ago
After two years of reading increasing amounts of LLM-generated text, I find myself appreciating something different: concise, slightly rough writing that is not optimized to perfection but is clearly written by another human being.
xfil|11 days ago
pawelduda|11 days ago
YeGoblynQueenne|10 days ago
I think I realised that while reading Harry Potter. To be fair the writing in the books is abysmally bad. It's written by an adult woman but it comes across as the writing of a 14 year old child, and that's to be charitable.
And it doesn't matter one bit. It still became the best-selling book series in history, with 600 million copies sold worldwide (as Wikipedia tells me). That's not to say that there aren't many hundreds, possibly even thousands, of better-written series, even in the Young Adult space. There are. But they're not that successful.
Why? I guess because good writing doesn't matter so much as what's being written. And I guess that also doesn't matter that much. You just have to connect somehow, be in the right place at the right time, when the need to read a certain piece of writing sort of emerges naturally as a result of whatever forces shape ambient taste.
Who knows. But most people wouldn't know what good writing looks like any more than they could write well themselves, so it's obvious that the ability to write well is overrated.
And so now we have LLMs generating prose, and that's what we'll be reading henceforth. I think it will be gradual, but it's unavoidable. One day nobody will read anything anyone else has written anymore. Why would you, when you can just ask an LLM to generate whatever you want to read?
kittikitti|11 days ago
We need to value human content more. I find that many real people eventually get banned while the bots are always forced to follow rules. The Dead Internet hypothesis sounds more inevitable under these conditions.
Indeed we all now have a neuron that fires every time we sense AI content. However, maybe we need to train another neuron that activates when content is genuine.
shimman|11 days ago
https://en.wikipedia.org/wiki/Pivot_to_video#Facebook_metric...
jongjong|11 days ago
It's going to be tough for fiction authors to break through. Sadly, I don't think the average consumer has sufficiently good taste to tell when something is genuinely novel. People often prefer the carefully formulated familiar garbage over the creative gems; this was true before AI and, IMO, will continue to be true after AI. This is not just about writing, it's about art in general.
There will be a subset of people who can see through the form and see substance and those will be able to identify non-AI work but they will continue to be a minority. The masses will happily consume the slop. The masses have poor taste and they're more interested in "comfort food" ideas than actually novel ideas. Novelty just doesn't do it for them. Most people are not curious, new ideas don't interest them. These people will live and breathe AI slop and they will feel uncomfortable if presented with new material, even if wrapped in a layer of AI (e.g. human-written core ideas, rewritten by AI).
I feel like that about most books, music and pop culture in general; it was slop and it will continue to be slop... It was the same basic ideas about elves, dragons, wizards, orcs, kings, queens, etc... Just reorganized and mashed with different overarching storylines "a difficult journey" or "epic battles" with different wording.
Most people don't understand the difference between pure AI-generated content (seeded by a small human input) and human-generated content which was rewritten by AI (seeded by a large human input) because most people don't care about and never cared about substance. Their entire lives may be about form over substance.
Aldipower|11 days ago
apsurd|11 days ago
rnakle|11 days ago
Everything is inevitable but my own job is secure. Have I already told you how concerned I am?
No novelty. No intellectual challenge. No spirit. Just AI advertisements! /s
piker|11 days ago
sibeliuss|11 days ago
caseyohara|11 days ago
htnthrow11220|11 days ago
rbtprograms|11 days ago
If it isn't, then it has seeped into your writing style, and it's quite a turn-off as a reader; I don't care much to engage.
If it is, then why should I read it? Why come to this website and even bother reading AI bot comments?
What is happening to writing indeed.
mamma_mia|11 days ago
In the near future we will not even need to read anyway.
recursive|11 days ago