wespiser_2018|2 years ago
IMO, this is where Bayesian statistics is far superior. There's a Curry-Howard isomorphism to logic which runs extremely deep, and it's possible to introduce the subject using conjugate distributions with nice closed-form analytical solutions. Anything more complex, well, that's what computers are for, and there are great tools (Stan) for fitting models that are far more intricate than what frequentist methods allow.
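A minimal sketch of the conjugate, closed-form case the comment mentions, assuming a Beta prior on a binomial success rate (the prior counts and data here are made up for illustration):

```python
# Conjugate update: Beta(a, b) prior + Binomial(n, theta) likelihood
# gives a Beta(a + k, b + n - k) posterior in closed form -- no sampler needed.
a, b = 2, 2            # prior pseudo-counts (a weak, symmetric Beta(2, 2) prior)
k, n = 70, 100         # observed: 70 successes in 100 trials

a_post, b_post = a + k, b + (n - k)          # Beta(72, 32)
post_mean = a_post / (a_post + b_post)       # analytical posterior mean
```

The posterior mean shrinks the raw frequency 70/100 slightly toward the prior mean of 0.5, which is exactly the behavior the closed form makes easy to teach.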
thefringthing|2 years ago
This is an odd way of putting it. I think it's better to say that, given some mostly uncontroversial assumptions, if one is willing to assign real-numbered degrees of belief to uncertain claims, then Bayesian statistical inference is the only way of reasoning about those claims that's compatible with classical propositional logic.
nextos|2 years ago
Generative models (implemented in, e.g., Stan, PyMC, Pyro, or Turing) separate the model from the inference algorithm, so one can switch from maximum likelihood to variational inference or MCMC quite easily.
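The model/inference split can be sketched in plain Python: one log-posterior is defined once and handed to two interchangeable inference backends. Both backends here (a grid MAP search and a toy Metropolis sampler) are illustrative stand-ins for what Stan or PyMC provide, and the data are made up:

```python
import math
import random

# Generative model: y_i ~ Bernoulli(theta), theta ~ Beta(1, 1) (uniform prior).
# The model is just a log-density; the inference backends below never change it.
def log_posterior(theta, heads, n):
    if not 0.0 < theta < 1.0:
        return -math.inf
    # Uniform prior contributes only a constant, so this is the log-likelihood.
    return heads * math.log(theta) + (n - heads) * math.log(1.0 - theta)

heads, n = 70, 100

# Backend 1: MAP via grid search (equals the MLE under the flat prior).
grid = [i / 1000 for i in range(1, 1000)]
theta_map = max(grid, key=lambda t: log_posterior(t, heads, n))

# Backend 2: Metropolis MCMC, reusing the same log_posterior unchanged.
random.seed(0)
theta, samples = 0.5, []
for _ in range(20_000):
    proposal = theta + random.gauss(0.0, 0.05)
    log_accept = log_posterior(proposal, heads, n) - log_posterior(theta, heads, n)
    if math.log(random.random()) < log_accept:
        theta = proposal
    samples.append(theta)
theta_mean = sum(samples[5_000:]) / len(samples[5_000:])  # discard burn-in
```

Swapping MAP for MCMC changed only which backend consumed `log_posterior`; the analytical posterior mean under Beta(1, 1) is 71/102, and the sampler lands close to it.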
Generative models, beginning with regression, make much more sense to students and yield much more robust inference. Most people I know who publish research articles regularly do not know that p-values are not a measure of effect size, which shows that current statistics education has failed.
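A quick numeric illustration of the p-value/effect-size point (stdlib only, two-sample z-test under a normal approximation with a known sd; the numbers are made up): the standardized effect is held fixed and negligible, yet the p-value goes from clearly non-significant to vanishingly small purely because n grows.

```python
import math

def two_sample_p(diff, sd, n):
    """Two-sided p-value for a two-sample z-test, known sd, equal group size n."""
    z = diff / (sd * math.sqrt(2.0 / n))
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return 2.0 * (1.0 - phi)

effect = 0.02 / 1.0   # Cohen's d = 0.02: a trivially small standardized effect

p_small = two_sample_p(0.02, 1.0, 1_000)        # n = 1,000 per group
p_large = two_sample_p(0.02, 1.0, 10_000_000)   # n = 10,000,000 per group
```

Same effect size both times; only the sample size differs. With n = 1,000 the difference is nowhere near significant, while with n = 10 million the p-value is effectively zero for an effect nobody would care about.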
eutectic|2 years ago