hyporthogon|3 years ago
Doesn't this mean Sydney can already alter the 'outside' (non-bing.com) world?
Sure, anything can issue HTTP GETs -- doing this is not a superpower. And sure, Roy Fielding would get mad at you if your web service mutated anything (other than whatever it has to physically do in order to respond) in response to a GET. But plenty of APIs do this anyway. And there are plenty of HTTP GET exploits in public databases (just do a CVE search) -- which Sydney can read.
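To illustrate the point about GETs with side effects, here's a minimal sketch (an invented toy endpoint, not any real service): an HTTP handler that mutates server state in response to a GET, violating the REST convention that GET should be safe and idempotent.

```python
import http.server
import threading
import urllib.request

COUNTER = {"deletes": 0}  # stand-in for real mutable server state

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/delete-everything":
            COUNTER["deletes"] += 1  # side effect triggered by a "read-only" GET
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence request logging
        pass

# Run the server in a background thread on an ephemeral port.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# A single plain GET -- no POST, no auth -- changes server state.
urllib.request.urlopen(f"http://127.0.0.1:{port}/delete-everything")
server.shutdown()
print(COUNTER["deletes"])  # the GET mutated state
```

Anything that can emit a URL, including a chatbot whose output gets fetched, can trigger an endpoint like this.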
So okay, fine, say Sydney is "just" stochastically parroting a h4xx0r. But... who cares, if the poisonous GET was actually issued to some actual machine somewhere on the web?
(I can't imagine how any LLM wrapper could build in an override rule like "no non-bing.com requests when you are sufficiently [simulating an animate being who is] pissed off". But I'm far from an expert in LLMs, GPT, or transformers in general.)
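The easy half of such an override is enforceable outside the model: the wrapper can filter every outgoing URL against a host allowlist before issuing it, regardless of what the model asked for. A sketch, with the allowlist and function name as assumptions (the hard half -- detecting "sufficiently pissed off" -- is exactly the part no URL filter addresses):

```python
from urllib.parse import urlsplit

# Hypothetical wrapper-level guard: only hosts on this list (or their
# subdomains) may be fetched, no matter what URL the model produces.
ALLOWED_HOSTS = {"bing.com", "www.bing.com"}

def is_allowed(url: str) -> bool:
    """Return True only if the URL's host is on the allowlist."""
    host = urlsplit(url).hostname or ""
    return any(host == h or host.endswith("." + h) for h in ALLOWED_HOSTS)

print(is_allowed("https://www.bing.com/search?q=x"))      # True
print(is_allowed("https://evil.example/collect?data=1"))  # False
```

The design point is that this check lives in ordinary wrapper code, not in the model's weights, so it doesn't depend on the model's simulated mood at all.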
hyporthogon|3 years ago
But I don't know whether Bing (or whatever index Sydney/Bing can access) respects noindex, and I don't know how else to try to guarantee that the index Sydney/Bing can access will not have crawled a given URL.
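For reference, there are two standard crawl opt-out signals a crawler is supposed to honor -- robots.txt rules and a `noindex` meta tag -- and both are mechanically checkable, even though (as above) whether Bing's index actually respects them for a given URL is the part that can't be verified from outside. A sketch using only the standard library:

```python
import urllib.robotparser
from html.parser import HTMLParser

# 1. robots.txt Disallow rules, parsed with urllib.robotparser.
rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /private/"])
print(rp.can_fetch("bingbot", "https://example.com/private/page"))  # False

# 2. A <meta name="robots" content="noindex"> tag in the page itself.
class NoindexFinder(HTMLParser):
    noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

finder = NoindexFinder()
finder.feed('<html><head><meta name="robots" content="noindex"></head></html>')
print(finder.noindex)  # True
```

Both signals are advisory: a compliant crawler skips the URL, but nothing stops a non-compliant one from fetching it anyway.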
wildrhythms|3 years ago
My guess is that it's not actually making HTTP requests; it's using cached versions of pages that the Bing crawler has already collected.