[+] [-] usgroup|2 years ago|reply
The reason Stan, Nimble, and PyMC fit via simulation is that joint distributions generally don't have an analytical form and require global fitting methods to identify.
I'd hazard a guess that if this package fits incrementally without simulation, then either it doesn't aim for a global optimum or it places substantial limits on the form its models can take.
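To make "fit via simulation" concrete, here is a minimal random-walk Metropolis-Hastings sketch (all names and numbers are mine, not from any of the libraries mentioned): it draws approximate samples from a density known only up to a normalizing constant, here a standard normal.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis-Hastings: approximate samples from a
    distribution whose density is known only up to a constant."""
    rng = random.Random(seed)
    x = x0
    log_p = log_target(x)
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step_size)
        log_p_new = log_target(proposal)
        # Accept with probability min(1, p(proposal) / p(x)).
        if rng.random() < math.exp(min(0.0, log_p_new - log_p)):
            x, log_p = proposal, log_p_new
        samples.append(x)
    return samples

# Target: standard normal, log-density up to an additive constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
burned = samples[5000:]                 # discard burn-in
mean = sum(burned) / len(burned)        # should be near 0
```

Note that the estimate only becomes accurate in the limit of many samples, which is exactly the cost usgroup is pointing at.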
[+] [-] mjhay|2 years ago|reply
Belief propagation (i.e., message passing) has a long history and has been pretty successful. It's not always applicable, but when it is, it can have a lot of advantages.
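For readers who haven't seen it: here is a toy sum-product sketch on a three-variable chain (the potentials and evidence are made up for illustration). Two local message passes recover the exact marginal, which a brute-force sum over the joint confirms.

```python
# Sum-product belief propagation on a binary chain x1 - x2 - x3.
# psi is a pairwise "agreement" potential; phi1 is soft evidence on x1.

psi = [[0.8, 0.2],
       [0.2, 0.8]]           # psi[a][b]: compatibility of neighboring states
phi1 = [0.9, 0.1]            # evidence: x1 is probably state 0

def pass_message(incoming, psi):
    """Sum-product message: m(b) = sum_a incoming(a) * psi(a, b)."""
    return [sum(incoming[a] * psi[a][b] for a in range(2)) for b in range(2)]

m12 = pass_message(phi1, psi)    # message x1 -> x2
m23 = pass_message(m12, psi)     # message x2 -> x3

z = sum(m23)
marginal_x3 = [v / z for v in m23]

# Brute-force marginal over all 2^3 joint configurations, for comparison.
brute = [0.0, 0.0]
for a in range(2):
    for b in range(2):
        for c in range(2):
            brute[c] += phi1[a] * psi[a][b] * psi[b][c]
zb = sum(brute)
brute = [v / zb for v in brute]
```

On a tree, this exactness holds in general; on loopy graphs belief propagation becomes approximate, which is one of the "not always applicable" caveats.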
[+] [-] bvdmitri|2 years ago|reply
There are inevitable drawbacks to both approaches. There is nothing particularly wrong with simulation, but it is very slow and is not applicable in real-time or near-real-time applications, where the situation may change rapidly. It also usually requires high-end CPUs, which are not available on edge devices like drones.
RxInfer tries to be as efficient as possible in running inference, but it does indeed place limits on the form of the models it can run inference for. The team has long-term plans and solutions under research, which should expand the set of available models substantially. Message passing has a long history: it is not only belief propagation on simple models; nowadays variational inference is also possible with message passing on sophisticated hierarchical models.
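As a toy illustration of variational inference via local updates (this is a hand-rolled coordinate-ascent sketch for a conjugate normal-gamma model, not RxInfer's actual API — all priors and names are mine): each step updates one factor of the mean-field posterior using moments of the other, which is the same flavor of local computation that variational message passing generalizes.

```python
import random

# Synthetic data from N(2, 1).
rng = random.Random(42)
data = [rng.gauss(2.0, 1.0) for _ in range(200)]
n = len(data)
xbar = sum(data) / n
sum_x2 = sum(x * x for x in data)

# Model: tau ~ Gamma(a0, b0), mu | tau ~ N(mu0, 1/(lambda0*tau)),
# x_i | mu, tau ~ N(mu, 1/tau).  Mean-field posterior q(mu) q(tau).
mu0, lambda0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

a_n = a0 + (n + 1) / 2.0      # this update has no dependence, so it is fixed
e_tau = a0 / b0               # initial guess for E[tau]

for _ in range(100):
    # Update q(mu) = N(mu_n, 1/lam_n) given the current E[tau].
    mu_n = (lambda0 * mu0 + n * xbar) / (lambda0 + n)
    lam_n = (lambda0 + n) * e_tau
    e_mu = mu_n
    e_mu2 = mu_n ** 2 + 1.0 / lam_n
    # Update q(tau) = Gamma(a_n, b_n) given the moments of q(mu).
    b_n = b0 + 0.5 * (sum_x2 - 2 * e_mu * n * xbar + n * e_mu2
                      + lambda0 * (e_mu2 - 2 * mu0 * e_mu + mu0 ** 2))
    e_tau = a_n / b_n

# mu_n should land near the sample mean (~2), e_tau near 1/variance (~1).
```

Each update only touches quantities local to one factor, which is why this style of inference can run fast and incrementally.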
[+] [-] ckrapu|2 years ago|reply
I would use one of those for scenarios where I need proper measures of uncertainty, like credible regions.
The methods listed in this project (the EM algorithm, VI) tend not to provide these, but in my experience they usually produce something that looks like a good posterior mean (the global optimum under mean squared loss).
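For example, a hand-rolled EM sketch for a two-component 1-D Gaussian mixture (synthetic data, illustrative settings) converges to point estimates of the component parameters, with no credible intervals attached:

```python
import math
import random

# Synthetic data: two well-separated Gaussian components.
rng = random.Random(1)
data = [rng.gauss(-2.0, 1.0) for _ in range(150)] + \
       [rng.gauss(3.0, 1.0) for _ in range(150)]

mu = [-1.0, 1.0]        # initial means
sigma = [1.0, 1.0]      # initial standard deviations
pi = [0.5, 0.5]         # initial mixture weights

def normal_pdf(x, m, s):
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

for _ in range(50):
    # E-step: responsibility of each component for each point.
    resp = []
    for x in data:
        w = [pi[k] * normal_pdf(x, mu[k], sigma[k]) for k in range(2)]
        t = sum(w)
        resp.append([wi / t for wi in w])
    # M-step: re-estimate parameters from weighted data.
    for k in range(2):
        nk = sum(r[k] for r in resp)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        sigma[k] = math.sqrt(
            sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk)
        pi[k] = nk / len(data)

# mu ends up near the true means (-2 and 3) -- a point estimate only.
```

The output is a single parameter vector; quantifying how uncertain those estimates are would require something extra (bootstrap, a Laplace approximation, or full MCMC).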
What’s your favourite MCMC method/technique atm? I know this can be quite domain dependent, but curious. From my experience, it can seem more of an art than a science at times.
[+] [-] fulafel|2 years ago|reply
[+] [-] ckrapu|2 years ago|reply
[+] [-] maizeq|2 years ago|reply
[+] [-] estebarb|2 years ago|reply
[+] [-] pdsnk1|2 years ago|reply
[+] [-] philprx|2 years ago|reply
What's the equivalent in Python?
[+] [-] nextos|2 years ago|reply
It includes a really mature compiler that generates very efficient message-passing and variational-inference code, with support for online inference, which is the main focus of RxInfer.
You can call Infer.NET from Python in a number of ways, even though it is not a CPython library.
[+] [-] jruohonen|2 years ago|reply
You might start with Stan, for which Python bindings are available, if I recall correctly.
[+] [-] esafak|2 years ago|reply
[+] [-] gfalcao|2 years ago|reply