top | item 41758635


yldedly | 1 year ago

I think a much closer analogy to function inversion is MCMC (or Bayesian inference in general): we can easily compute the density p(x) of any point x, but finding an x given a p(x) is intractable. Strictly speaking it's about finding a set of x's distributed as p(X), not finding the x for any single density value p(x), but it's close.
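To make the asymmetry concrete, here's a toy sketch in pure Python: evaluating the (unnormalized) density of a standard normal is a one-liner, while going the other way requires an iterative scheme like Metropolis-Hastings. This is an illustrative sketch, not a production sampler; the step size and sample count are arbitrary choices.

```python
import math
import random

def density(x):
    # Forward direction: evaluating an unnormalized N(0, 1) density
    # at any point is trivial.
    return math.exp(-0.5 * x * x)

def metropolis(n_samples, step=1.0, seed=0):
    # Inverse direction: producing x's distributed as p requires an
    # iterative scheme. Toy Metropolis-Hastings with a Gaussian
    # random-walk proposal.
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)); only
        # density *ratios* appear, so the normalizer cancels.
        if rng.random() < density(proposal) / density(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis(20000)
mean = sum(samples) / len(samples)  # should land near 0 for N(0, 1)
```

Note that the sampler never needs the normalizing constant, which is exactly the quantity (the partition function) that is hard to compute.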

Relatedly, probabilistic programming was originally imagined pretty much like your second quote: you define a model, get some data, run both through the built-in inference engine, and out come the parameters of the model most likely to have produced the data. In practice, though, there's no universal inference engine that works for everything (some people disagree, but they're NUTS ;). I guess for pretty much the same reason that P is probably not equal to NP.
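A hedged sketch of that "model + data -> posterior" workflow, using brute-force grid inference as the "engine". The coin-flip model and the `infer` helper here are made up for illustration; grid enumeration only works because there is a single bounded parameter, which is part of why no universal engine exists.

```python
def coin_model(theta, data):
    # Likelihood of observed flips (1 = heads) given bias theta:
    # this generative story is "the model".
    p = 1.0
    for flip in data:
        p *= theta if flip else (1.0 - theta)
    return p

def infer(model, data, grid_size=1000):
    # Toy "universal inference engine": evaluate the unnormalized
    # posterior on a grid under a uniform prior, then normalize.
    # Fine in 1D, hopeless in high dimensions.
    grid = [(i + 0.5) / grid_size for i in range(grid_size)]
    weights = [model(theta, data) for theta in grid]
    total = sum(weights)
    return grid, [w / total for w in weights]

data = [1, 1, 1, 0, 1, 1, 0, 1]  # 6 heads, 2 tails
grid, posterior = infer(coin_model, data)
post_mean = sum(t * w for t, w in zip(grid, posterior))
# With a uniform prior this posterior is Beta(7, 3), so the
# posterior mean is 0.7.
```

Real systems replace the grid with MCMC (e.g. the NUTS sampler the pun refers to), which trades exactness for scalability but still has to be tuned per model.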


vasekrozhon | 1 year ago

> I guess pretty much for the same reason P is probably not equal to NP.

Yep. In particular, there are classes called #P and PP, closely connected to NP, that capture the hardness of problems like computing partition functions, sampling from posterior distributions, and so on.

yldedly | 1 year ago

Didn't know about these, thanks for the pointer! Do you have a good resource for learning about these (specifically about the hardness of sampling from posterior distributions)?