max_likelihood | 5 years ago

I know very little about CNNs. But I noticed the ReLU activation step is max(0, x), where x is the sum of the pixel intensities from each channel. In this example, it appears x > 0 (for all x), so the activation step isn't really doing much?

EDIT: I'm wrong. x < 0 for some of the pixels, specifically for the more red-ish channels.

blackbear_ | 5 years ago

No, ReLU is applied after convolution: x is the result of applying the kernel at a particular location of the input, so it depends on the kernel weights as well as on the pixel values. Since kernel weights can be negative, x can be negative even when all pixel intensities are non-negative.
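A minimal NumPy sketch of this point (the image and kernel values here are made up for illustration): even though every pixel intensity is non-negative, a kernel with negative weights produces negative pre-activations, which ReLU then clamps to zero.

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

# All-positive 4x4 single-channel "image" (hypothetical values)
image = np.array([
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 8, 7, 6],
    [5, 4, 3, 2],
], dtype=float)

# Kernel with negative entries (vertical-edge style)
kernel = np.array([
    [1, 0, -1],
    [1, 0, -1],
    [1, 0, -1],
], dtype=float)

def conv2d_valid(img, k):
    # "Valid" cross-correlation: slide the kernel over the image
    # and take the elementwise product-sum at each position.
    kh, kw = k.shape
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

pre = conv2d_valid(image, kernel)   # some entries come out negative
post = relu(pre)                    # ReLU zeroes the negative entries
```

Here `pre` contains negative values despite the all-positive input, so `relu` actually changes the output rather than acting as an identity.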