hn_throwaway_99 | 1 year ago

This really is completely independent of "AI". I remember seeing a famous presentation (maybe it was a TED talk?) from the mid-00s about the "echo chamber" effect of personalization, and IMO nearly all of the negative impacts discussed in that talk came to pass, and then some. AI just makes it worse.
wanderingbort | 1 year ago

This seems fundamentally different. Filter bubbles show you more of the externally generated content you engage with; these personalizations are trying to predict the content you generate.

While it may serve as a ballast against your personal voice changing over time, the whole point is to learn you, not to feed you.
nathan_compton | 1 year ago

I worry about this. I happen to have done quite a lot of programming in Lisp dialects over the last decade or so, but since adopting GPT-4 I tend to just code in Python, because that is what the model understands best. It does seem like AI will amplify network effects by widening the efficiency gap between technologies the AI knows and those it doesn't. Kind of depressing.
I wonder how much of this is syntactic familiarity (from training) and how much is the need to attend to balanced parentheses.
kevindamm | 1 year ago

I don't use Lisp often enough to have played with getting GPT to lisp with me, but I have played a bit with getting it to read and write Datalog (which I suspect is even more scarce in The Pile dumps). It's OK at recognition but misses details. I haven't seen it produce much of value yet. But it can write JavaScript for days, and has no problem balancing parentheses and brackets there, even without compiler/tree-sitter support.

If I had spare experimenting bandwidth, I would look into whether fine-tuning for Lisp format and conventions would show a significant boost in performance.
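As a minimal sketch (not from the thread) of what "balancing parentheses" asks of a model with no compiler or tree-sitter checking its output: the stack-based bookkeeping below is what the model has to track implicitly across an entire form. `balanced` is a hypothetical helper name; this toy ignores string literals and comments.

```python
def balanced(source: str) -> bool:
    """Return True if every (), [], {} in `source` is properly matched."""
    pairs = {")": "(", "]": "[", "}": "{"}
    stack = []
    for ch in source:
        if ch in "([{":
            stack.append(ch)
        elif ch in pairs:
            # A closer must match the most recent unmatched opener.
            if not stack or stack.pop() != pairs[ch]:
                return False
    return not stack  # leftover openers also mean imbalance

print(balanced("(define (square x) (* x x))"))  # True: well-formed form
print(balanced("(define (square x) (* x x)"))   # False: one closer short
```

A language server gets this check for free from the parser; a model emitting deeply nested Lisp has to carry the equivalent of that stack in context, which may be part of why shallow-nesting languages like JavaScript fare better.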
hn_throwaway_99 | 1 year ago

Edit: I think this is what I was referring to about "filter bubbles": https://youtu.be/B8ofWFx525s?si=rK1T-v5D0sAeiHJe . I was a tad mistaken; it was from 2011.
wanderingbort | 1 year ago

Any pressure you feel to adopt Python is not because the model has detected that you enjoy Python; it's because its global training data skewed toward Python.

It's a huge concern, but not this article's concern, I think.
unknown | 1 year ago
[deleted]
szundi | 1 year ago

Some companies get 10^7x, while 10^7 other companies are told goodbye?
unknown | 1 year ago
[deleted]