item 41750086

superluserdo | 1 year ago

It is and has been for a while, but most of the flashier, more exciting developments in ML and AI don't have much applicability to LHC event processing. To state any kind of physics finding based on the scattering of particles in the accelerator and their decays in the detector, you need to take the background of all events and build multivariate discriminants on the data that enrich your signal as much as possible while throwing away as little as possible. This requires a rigorous, verifiable statistical "paper trail" from start to finish, so you can say, with confidence intervals, how much signal and background you ought to have versus how much you actually measure in your data after processing. An overly broad black box doesn't really lend itself to that kind of introspection.
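A toy sketch of what that "paper trail" looks like in the simplest (cut-based) case: the variable shapes, cut value, and sample sizes below are all made up for illustration, not real detector distributions, but the bookkeeping — efficiencies for each sample, expected yields with uncertainties, purity — is the kind of thing every step of a real analysis has to report.

```python
import random
import math

random.seed(0)

# Toy datasets: one discriminating variable per event (hypothetical shapes,
# chosen only so that signal tends to sit higher than background).
signal = [random.gauss(2.0, 0.8) for _ in range(10_000)]
background = [random.expovariate(1.0) for _ in range(100_000)]

def count_passing(events, threshold):
    """Count events surviving a simple one-sided cut."""
    return sum(1 for x in events if x > threshold)

threshold = 1.5  # arbitrary illustrative cut value
n_sig = count_passing(signal, threshold)
n_bkg = count_passing(background, threshold)

# Efficiencies are the auditable part: what fraction of each sample survives.
eff_sig = n_sig / len(signal)
eff_bkg = n_bkg / len(background)

# Expected yield after the cut, with a Poisson uncertainty, so the
# signal+background prediction can be compared against observed data.
total = n_sig + n_bkg
print(f"signal efficiency:     {eff_sig:.3f}")
print(f"background efficiency: {eff_bkg:.3f}")
print(f"expected events: {total} +/- {math.sqrt(total):.0f}")
print(f"signal purity:   {n_sig / total:.3f}")
```

A real analysis replaces the single cut with a multivariate discriminant (likelihood ratio, BDT, shallow NN), but the same accounting applies: every input and every selection has a measurable efficiency, which is exactly what an opaque end-to-end black box makes hard to certify.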
