top | item 36099065


joshjdr | 2 years ago

I actually agree with this comment more than after my initial read. You raise some valid concerns about innovation that regulation could address.

I guess the part I’m unsure about is the assertion about the dissimilarity to Photoshop, or whether the marketing is the issue at hand. (E.g., did Adobe do a more appropriate job of conveying in its marketing that its software is designed for editing, but not for doctoring or falsifying facts?)


majormajor | 2 years ago

I think ChatGPT and Photoshop are both "designed for" the creation of novel things.

In Photoshop, though, the intent is clearly up to the user. If you edit that photo, you know you're editing the photo.

That's fairly different from ChatGPT, where you ask a question and the product has been trained to answer in a highly confident way that makes it sound like it knows more than it actually does.

joshjdr | 2 years ago

If we’re moving past the marketing questions/concerns, I’m not sure I agree.

For me, for now, ChatGPT remains a tool/resource, like Google, Wikipedia, Photoshop, Adaptive Cruise Control, and Tesla FSD (for the record, despite mentioning FSD, I don’t think anyone should ever take a nap while operating a vehicle with any currently available technology).

Did I miss when OpenAI marketed ChatGPT as a truthful resource for legal matters?

Or isn’t this just an appropriate story that deserves retelling, to warn potential users not to misuse this technology?

At the end of the day, for an attorney, an officer of the court, to have done this is absolutely not the fault of the technology, nor of its marketing.