InferenceΒΆ
In the context of probabilistic modeling, learning is usually called inference. In the particular case of Bayesian inference, this often involves computing (approximate) posterior distributions. For parameterized models, this usually involves some sort of optimization. Pyro supports multiple inference algorithms, with support for stochastic variational inference (SVI) being the most extensive. More inference algorithms will appear here in future versions of Pyro.
See the Introductory tutorial for a discussion of inference in Pyro.
- SVI
- ELBO
- Importance
- Reweighted Wake-Sleep
- Sequential Monte Carlo
- Stein Methods
- Likelihood-free methods
- Discrete Inference
- Prediction utilities
- MCMC
- Automatic Guide Generation
  - AutoGuide
  - AutoGuideList
  - AutoCallable
  - AutoNormal
  - AutoDelta
  - AutoContinuous
  - AutoMultivariateNormal
  - AutoDiagonalNormal
  - AutoLowRankMultivariateNormal
  - AutoNormalizingFlow
  - AutoIAFNormal
  - AutoLaplaceApproximation
  - AutoDiscreteParallel
  - AutoStructured
  - AutoGaussian
  - AutoMessenger
  - AutoNormalMessenger
  - AutoHierarchicalNormalMessenger
  - AutoRegressiveMessenger
  - Initialization
- Reparameterizers
  - Automatic Strategies
  - Conjugate Updating
  - Loc-Scale Decentering
  - Gumbel-Softmax
  - Transformed Distributions
  - Discrete Cosine Transform
  - Haar Transform
  - Unit Jacobian Transforms
  - StudentT Distributions
  - Stable Distributions
  - Projected Normal Distributions
  - Hidden Markov Models
  - Site Splitting
  - Neural Transport
  - Structured Preconditioning
- Inference utilities