top | item 40963412

t_serpico | 1 year ago

"Topology is all that matters" --> bold statement, especially when you read the paper. The original authors were much more reserved in terms of their conclusions.

griffzhowl | 1 year ago

Yes, on its face it looks like he's saying that you can throw out the weights of any network and still expect the same or similar behaviour, which is obviously false. It's also contradicted in that very section, where he reports from the cited paper that randomized parameters reproduced the desired behaviour in only about 1 in 200 cases. All of those cases share the same network topology, so while that may be a higher-than-expected probability of retaining function with randomized parameters (spanning 2-3 orders of magnitude), it's also a clear demonstration that more than topology is significant.
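The point can be illustrated with a toy sketch (not the cited paper's actual model — the 2-2-1 threshold network, the XOR target, and the weight range are all made up for illustration): fix a topology, show that some weight assignment realizes a target behaviour, then count how often random weights on that same topology do.

```python
import random

def net(x1, x2, w):
    # fixed 2-2-1 threshold-unit topology; only the six weights in w vary
    h1 = 1 if w[0] * x1 + w[1] * x2 > 0.5 else 0
    h2 = 1 if w[2] * x1 + w[3] * x2 > 0.5 else 0
    return 1 if w[4] * h1 + w[5] * h2 > 0.5 else 0

# target behaviour: the XOR truth table (a stand-in for a circuit's function)
target = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

# hand-picked weights show this topology *can* realize the function
good_w = [1.0, 1.0, 0.3, 0.3, 1.0, -1.0]
assert all(net(a, b, good_w) == y for a, b, y in target)

# ...but random weights on the same topology only rarely do
random.seed(0)
trials = 20000
hits = 0
for _ in range(trials):
    w = [random.uniform(-2, 2) for _ in range(6)]
    if all(net(a, b, w) == y for a, b, y in target):
        hits += 1

print(f"{hits}/{trials} random-weight nets reproduced the behaviour")
```

The hit rate is well above zero (topology constrains the function space) but far below one (weights still carry information) — which is the shape of the 1-in-200 result being discussed.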

rdlecler1 | 1 year ago

The topology needs to be information-bearing. Weights of 0.0001 are likely spurious, and if the other weights are relatively large enough, they can effectively render the remaining fan-in weights spurious as well.
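A minimal sketch of that point, with made-up numbers: a near-zero weight contributes almost nothing to a unit's weighted sum, so pruning it (deleting that edge from the topology) barely changes the output.

```python
# toy fan-in for a single unit: one dominating weight, one near-zero weight
inputs  = [0.7, -0.3, 0.9]
weights = [1.5, 0.0001, -2.0]   # the 0.0001 weight carries almost no information

full   = sum(w * x for w, x in zip(weights, inputs))
pruned = sum(w * x for w, x in zip(weights, inputs) if abs(w) > 0.01)

# pruning the tiny weight barely moves the weighted sum
print(abs(full - pruned))
```

In other words, whether an edge "exists" in the topology is itself a statement about weight magnitudes — the two aren't independent.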

rdlecler1 | 1 year ago

The original papers were published in scientific journals, where more assertive claims wouldn't have been kosher.