item 33437478


jmagoon | 3 years ago

The real issues around data bias get completely drowned out by the article writers' need to make everything about culture wars. Many computer vision systems have literally nothing to do with humans; one of the most common use cases in CV at this point, for example, is defect detection in manufacturing.

The data bias issue is about having way more samples of class X than of class Y, or class Y sharing an unknown but correlated feature that the developer doesn't identify (labeled medical images have this problem), or any number of other "biases": all the images being too bright, or taken with a camera that isn't identical to the one that's going to be deployed in production, etc., etc.
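The first kind of bias described here, class imbalance, is easy to check for before training. A minimal sketch (the function name and the 5:1 ratio threshold are illustrative choices, not from any particular library):

```python
from collections import Counter

def class_balance_report(labels, imbalance_ratio=5.0):
    """Flag classes that are heavily under-represented
    relative to the most common class."""
    counts = Counter(labels)
    largest = max(counts.values())
    return {cls: n for cls, n in counts.items()
            if largest / n >= imbalance_ratio}

# e.g. 900 images labeled "X" but only 40 labeled "Y"
flagged = class_balance_report(["X"] * 900 + ["Y"] * 40)
print(flagged)  # {'Y': 40}
```

The correlated-feature problem is harder to automate away, since by definition the confounder is something the developer hasn't identified, but per-class checks on measurable properties (brightness, camera metadata, acquisition site) catch a lot of it.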

There are real issues that can be fixed / engineered / understood in terms of producing reliable output, and XAI (explainable AI) absolutely helps with that! But for whatever reason, journalists don't seem to understand that these systems need normal engineering safeguards, like any automated system, and instead bring it all back to one poorly engineered model so they can talk about Big Bad Racist AI denying loans based on race.
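One concrete example of the kind of "normal engineering safeguard" any automated system gets: don't let the model act autonomously on low-confidence predictions. A hypothetical sketch (the function names and 0.9 threshold are illustrative, not from any real system):

```python
def guarded_predict(model_predict, image, confidence_threshold=0.9):
    """Route low-confidence predictions to manual review
    instead of acting on them automatically."""
    label, confidence = model_predict(image)
    if confidence < confidence_threshold:
        return ("manual_review", confidence)
    return (label, confidence)

# stub standing in for a real defect-detection model
stub = lambda img: ("defect", 0.42)
print(guarded_predict(stub, None))  # ('manual_review', 0.42)
```

The same pattern (thresholds, fallbacks, human-in-the-loop escalation) is standard practice in any control or monitoring system; there's nothing AI-specific about it.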
