item 46375907

xgulfie | 2 months ago
I hate this. Why not show us the actual photo? Infuriating.

stronglikedan | 2 months ago
Technically they do, just split into chunks interleaved with AI-generated background. Original: https://en.wikipedia.org/wiki/Priscilla_Norman#/media/File:L...

bondarchuk | 2 months ago
> just split into chunks interleaved with AI
That would be relatively benign; the other possibility is that the whole thing was encoded and then decoded through some neural representation.

agumonkey | 2 months ago
I wonder how we'll deal with the inability to tell what's true or not in the coming years. Even without full deepfakes, just gradual, hypothetical restoration introducing subtle hallucinations in many, many places.

hombre_fatal | 2 months ago
Judging by how little people care about the veracity of claims made on social media and YouTube (long before AI), not much will change. The root problem is that we don't have very robust epistemic standards. We mostly go by vibes and what we want to be true.