top | item 34753149

hellbanTHIS | 3 years ago

I haven't used BingGPT, but as an example, having ChatGPT summarize news articles about things it doesn't want to talk about is bizarre. Here's an example of it summarizing a story about a city councilman who was murdered: https://i.imgur.com/QV9jAkp.jpg It completely ignored the murder and said he switched parties, which it made up entirely. A second attempt said he was "shot and changed" because apparently it didn't want to say "killed".

A game Reddit was playing is trying to get it to respond as Woodrow Wilson, a famously racist president. The most accurate thing I could get it to do was this: https://i.imgur.com/D8JLziW.jpg which is not very accurate. Try getting it to act like Sheriff Bull Connor and it will refuse, but it has to comply for a president, so it gives a totally misleading impression with major factual errors.

And these are the times I actually got it to respond; it seems 50% of the time it takes offense to something innocuous and scolds you for asking.

ajross | 3 years ago

So, here's the thing. I have no idea, like zero, what news event you're talking about in that first screenshot. So, what did I do? I went to Google to try to find something about a murdered city councilman, even looking for an equivalent on NewsBusters, which the AI cites as a source.

And I still can't find it. I can see some stuff that's maybe related? But nothing clear.

So... I guess I repeat. Your problem isn't "AI censorship", it's that no one wants to link to NewsBusters because of marketing concerns. If I had to guess there just wasn't any training data relevant to your query.

(Also: NewsBusters is a garbage site, you know that, right?)

BoiledCabbage | 3 years ago

Did you try asking for a summary of the article without actually providing the content of the article? ChatGPT consistently says that it only has information up until 2021, and it still said so this year. ChatGPT can't pull this article from its "memory". So the only thing it can do is hallucinate something that might make sense.

Simply paste the article in and it gives a perfectly reasonable summary stating that the guy was murdered. Below is what it printed out as a summary. All I did was type a sentence asking it to summarize the following article, then I pasted in the content of the article you linked [1]. This was its summary:

> A New Jersey community is mourning after a senior distribution supervisor and councilman was shot dead by an employee outside his workplace. Police called to the scene found 51-year-old Russell Heller dead from a gunshot wound in the parking lot of the PSE&G facility. The shooter, a former employee identified as 58-year-old Gary Curtis, was later found dead from a self-inflicted gunshot wound. Russell Heller was first elected to the council in 2017 and again in 2020 and was remembered as a perfect gentleman and committed councilman who was deeply rooted in the community. This was the second councilperson to die by gun violence within a week in New Jersey.

A completely reasonable and, to my eyes, accurate summary.

And if done on the other crappy NewsBusters article, it also produces a completely reasonable summary.
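The workaround described above (pasting the article text into the prompt instead of asking the model to recall it) can be sketched with the OpenAI Python client. This is a minimal sketch, assuming the official `openai` package; the model name, prompt wording, and function names are my own illustrations, not from this thread:

```python
# Sketch of the "paste the article in" approach: the model summarizes
# text it is actually given, instead of hallucinating from memory.
# Model name and prompt wording are illustrative assumptions.

def build_summary_messages(article_text: str) -> list:
    """Build a chat payload that inlines the full article text."""
    return [
        {
            "role": "user",
            "content": "Summarize the following article:\n\n" + article_text,
        }
    ]


def summarize(client, article_text: str, model: str = "gpt-3.5-turbo") -> str:
    # `client` is an openai.OpenAI() instance; requires OPENAI_API_KEY.
    resp = client.chat.completions.create(
        model=model,
        messages=build_summary_messages(article_text),
    )
    return resp.choices[0].message.content
```

The key point is simply that the article text travels inside the request, so the 2021 knowledge cutoff doesn't matter.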

I'm not certain which it is: are there people who genuinely don't know that ChatGPT doesn't have current news in it and its training was cut off two years back? I see a long post above about some big censorship, but it summarizes these articles just fine.

It really feels like a lot of people are breathlessly looking for some huge conspiracy. No large corporation is going to have its products promoting rape or genocide. If you asked Google, Amazon, Apple, Microsoft, or Disney, they wouldn't do it, and neither will any tool they produce. They're going to do as much as possible to provide information and answers without shipping another instance of Tay, and given what happened with Tay, they will all err on the side of caution.

[1] - https://abc7ny.com/russell-heller-nj-councilman-shot-shootin...

shrimp_emoji | 3 years ago

Haha, whitewashing history. We've trained it well. ;p