P-NP | 3 years ago

They are listed here: https://people.idsia.ch/~juergen/scientific-integrity-turing...

random314 | 3 years ago

These are mostly examples of missing citations in talks (not papers), not plagiarism. I will discuss a representative example here.

> 9. LBH claim ReLUs enabled deep learning to outperform previous methods for object recognition, referring to their GPU-based ImageNet 2012 winner called AlexNet,[GPUCNN4] without mentioning that our earlier groundbreaking deep GPU-based DanNet[GPUCNN1-3,5-8][DAN] did not need ReLUs at all to win 4 earlier object recognition competitions and to achieve superhuman results already in 2011[GPUCNN1-8][R5-6] (see Sec. XIV).

If we click through and look at the details, AlexNet won ImageNet, a general-purpose image recognition benchmark, whereas DanNet won in specific domains: Chinese handwriting recognition, mitosis detection, and so on. So DanNet is not comparable in impact to AlexNet at all. And ReLUs (see the quick sketch below) are in essentially all complex DNNs now; that would not have happened if they were redundant, as Schmidhuber implies.

https://people.idsia.ch/~juergen/computer-vision-contests-wo...
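
For reference, the ReLU under discussion is just the elementwise function max(0, x). A minimal sketch in Python/NumPy (the function name relu is mine, purely illustrative, not taken from any of the cited papers):

    import numpy as np

    def relu(x):
        # Rectified linear unit: zero out negatives, pass positives through unchanged.
        return np.maximum(0.0, x)

    # Applied elementwise to a layer's pre-activations:
    print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]

Its appeal over sigmoid/tanh activations is that the gradient is exactly 1 for positive inputs, which mitigates vanishing gradients in deep nets.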

P-NP | 3 years ago

You conveniently left out the plagiarism part regarding ReLUs:

> 8. LBH devote an extra section to rectified linear units (ReLUs), citing papers of the 2000s by Hinton and his former students, without citing Fukushima who introduced ReLUs in 1969[RELU1-2] (see Sec. XIV).

This is only one of many concrete examples given.

DanNet obviously worked on all kinds of image data; otherwise it would not have won all those competitions before the similar AlexNet. However, the CNN pioneer was Fukushima, who introduced both CNNs and ReLUs.