item 28272930

verygoodname | 4 years ago

This.

The thing is that some of the techniques commonly applied when training NNs are often "good enough" to cope with the presence of corrupted data, as long as the input data is not total trash. For example, optimizing a model with SGD while applying weight decay and dropout adds a regularization effect that somewhat replicates the effect of assuming errors-in-variables. That is what deters people from applying more formalized robust approaches.
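To make the "regularization somewhat replicates errors-in-variables" point concrete, here is a minimal NumPy sketch (all names and numbers are illustrative) of a known equivalence for linear models: least squares on noise-injected inputs approximately matches ridge regression, the batch analogue of weight decay, with a penalty tied to the input-noise variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem; all values are illustrative.
n = 5000
x = rng.normal(size=n)
y = 1.5 * x + rng.normal(scale=0.2, size=n)

sigma = 0.3  # std of the synthetic input noise we inject

# (a) Least squares fit pooled over many noise-injected copies of the
# inputs, approximating the expected loss under input corruption
# (an errors-in-variables-style assumption).
K = 200
x_noisy = x[None, :] + rng.normal(scale=sigma, size=(K, n))
w_noise = (x_noisy * y[None, :]).sum() / (x_noisy ** 2).sum()

# (b) Ridge regression on the clean inputs with the matched penalty
# lam = n * sigma**2.
lam = n * sigma ** 2
w_ridge = x @ y / (x @ x + lam)

# Unregularized fit, for comparison: both (a) and (b) shrink it.
w_ols = x @ y / (x @ x)
```

The two regularized estimates agree closely, and both shrink the unregularized fit, which is the sense in which weight-decay-style regularization stands in for an explicit noisy-inputs model.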

As long as "things kind of work", it is difficult to convince other people to adopt robust methods, particularly because of the "robustness vs. efficiency" trade-off (which can make robust methods seem additionally "unsexy").
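The robustness vs. efficiency trade-off can be sketched with the simplest pair of estimators, the mean vs. the median (numbers illustrative): the median shrugs off gross outliers, but on clean Gaussian data it is the noisier estimator (asymptotic relative efficiency of about 64%).

```python
import numpy as np

rng = np.random.default_rng(1)

true_mu = 5.0

# Robustness: contaminate 10% of a Gaussian sample with gross errors.
sample = rng.normal(loc=true_mu, scale=1.0, size=500)
sample[:50] = rng.normal(loc=50.0, scale=5.0, size=50)
mean_bias = abs(np.mean(sample) - true_mu)      # pulled far off by outliers
median_bias = abs(np.median(sample) - true_mu)  # barely moves

# Efficiency: on clean Gaussian data the median is noisier than the mean
# (its asymptotic variance is pi/2 times the mean's).
means, medians = [], []
for _ in range(2000):
    s = rng.normal(loc=true_mu, scale=1.0, size=200)
    means.append(np.mean(s))
    medians.append(np.median(s))
var_mean = np.var(means)
var_median = np.var(medians)
```

The same pattern carries over to regression with Huber or other M-estimators: bounded influence of outliers, paid for with some efficiency on clean data.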
