It's read them in the same sense as it's written a new one: it's ingested them, had what pass for its mental processes influenced by them, remembered some of their contents, etc.
You might want to say that that isn't "reading", just as we never say that aeroplanes "fly" since they don't do the same thing as birds, never say that computers "play chess" since the calculations they do are very different from those done by human chessplayers, never say that machines "dig ditches" since they don't have the experience of tired muscles and the sun beating down on their backs, etc. As you may gather from the tone of the previous sentence, I am not altogether convinced.
I do agree that the two things aren't equivalent. I can imagine futures in which AI systems have a training process that somewhat resembles the present one, and also do something that corresponds more closely to human reading[1], and then I'd want to reserve the term "read" for the latter. What Claude has done to all the books that helped shape it isn't exactly reading. But it's quite like reading for our present purpose: when someone jokes about an author having written more books than they've read, they mean that the author doesn't have much awareness of other people's work, and that isn't a problem Claude has.
[1] E.g., they might have some sort of awareness of the process, whatever exactly that might mean; they might do it "for pleasure", whatever exactly that might mean; etc. Those things would require the AI systems to resemble humans in ways present AI systems aren't designed to and, so far as I can tell, don't; maybe some future AI systems will be that way, maybe not.
gjm11 | 9 months ago