jongala | 3 years ago
Computer, generate a list of applications where technology like this should absolutely not be used.
This seems terrifying. Wouldn't this, for example, synthesize identifying details about vehicles that were never present in the sample, drawn instead from training data? About people? And so on.
This sounds awesome for making a consumer mapping product feel higher quality. Hell, it could even serve an anonymizing, pro-privacy function ("a roof," not "your roof"). But directing it toward the listed industries feels incredibly reckless.
People in surveillance, forensics, and similar fields should be confronted with the limits of their data's quality. We should not synthesize extra confidence in their analysis by making images seem higher quality than they actually are.