item 42913709

autoconfig|1 year ago

Either you care about being correct or you don't. If you don't care then it doesn't matter whether you made it up or the AI did. If you care then you'll fact check before publishing. I don't see why this changes.

azinman2|1 year ago

When things are easy, you’re going to take the easy path even if it means quality goes down. It’s about trade offs. If you had to do it yourself, perhaps quality would have been higher because you had no other choice.

Lots of kids don’t want to do homework. That said, previously many would do it anyway, because there wasn’t another choice. Now they can just ask ChatGPT for the answers and write them down verbatim, with zero learning taking place.

Caring isn’t binary, and it doesn’t work in isolation.

jstummbillig|1 year ago

I don't think it follows that taking an easier path means quality goes down.

hi_hi|1 year ago

Because maybe you want to, but you have a boss breathing down your neck and KPIs to meet, and you haven't slept properly in days and just need a win, so you get the AI to put together some impressive-looking graphs and stats for that client showcase that's due in a few hours.

Things aren't quite so black and white in reality.

dauhak|1 year ago

I mean, those same conditions already lead humans to cut corners and make stuff up themselves. You're describing a problem where bad incentives and conditions lead to sloppy work, and that happens with or without AI.

Catching errors and validating work is obviously a different process when the output comes from an AI rather than a human, but I don't see how it's fundamentally different here. If the outputs are heavily cited, that might go some way toward making slip-ups easier to catch and correct.

spaceywilly|1 year ago

I think a lot about how differentiating facts and quality content is like differentiating signal from noise in electronics. The signal-to-noise ratio on many online platforms was already quite low. Tools like this will absolutely add more noise, and arguably the nature of the tools themselves makes it harder to separate the noise.

I think this is a real problem for these AI tools. If you can’t separate the signal from the noise, it doesn’t provide any real value, like an out of range FM radio station.

WOTERMEON|1 year ago

Not only that: by publishing noise, you’re lowering the signal/noise ratio.

layer8|1 year ago

People are much less scrupulous using LLM output than making up stuff themselves, because then they can blame the LLM.

RainyDayTmrw|1 year ago

It's possible that you care, but the person next to you doesn't, and external pressures force you to keep up with the person who's willing to shovel AI slop. Most of us don't have a complete luxury of the moral high ground at our jobs.

navigate8310|1 year ago

Then it's the higher-ups' fault for not caring about quality. Either you assimilate into that low-quality management culture of using AI slop, or you change jobs.

doomroot|1 year ago

It looks like the moral high ground just became more in demand.

n4r9|1 year ago

It's a bit like saying "my kids are going to hit themselves anyway, so it doesn't matter if I give them foam rods or metal rods".

ctoth|1 year ago

Maybe this would make sense if you saw the whole world as "kids" that you had to protect. As an adult who lives in an adult world, I would like adults to have access to metal tools and not just foam ones.

sbarre|1 year ago

How hard it is to produce credible-looking bullshit makes a really big difference in these scenarios.

Consultants aren't the ones doing the fact-checking; that falls to the client, who ironically tends to assume the consultants did it.

michael_swift|1 year ago

Don't you think the problem of checking for correctness then becomes more insidious? We can now generate hundreds of reports that look very professional on the surface. The usual things that would tip you off that a person was careless aren't there -- typos, poor sentence construction, missing references. Just more noise to pick through for signal.

ADeerAppeared|1 year ago

> If you care then you'll fact check before publishing.

Doing a proper fact check is as much work as doing the research by hand in the first place, and therefore this system is useless to anyone who cares about the result being correct.

> I don't see why this changes.

And because of the above this system should not exist.

mlsu|1 year ago

If 20% of people don't care about being correct, the rest of everyone can deal with that. If 80% of people don't care about being correct, the rest of us will not be able to deal with that.

Same thing as misinformation. A sufficient quantitative difference becomes a qualitative difference at some point.