Inference¶
In the context of probabilistic modeling, learning is usually called inference. In the particular case of Bayesian inference, this often involves computing (approximate) posterior distributions. In the case of parameterized models, this usually involves some sort of optimization. Pyro supports multiple inference algorithms, with support for stochastic variational inference (SVI) being the most extensive. Look for more inference algorithms in future versions of Pyro.
See Intro II for a discussion of inference in Pyro.
SVI¶

class SVI(model, guide, optim, loss, loss_and_grads=None, *args, **kwargs)[source]¶
Bases: object
Parameters:  model – the model (callable containing Pyro primitives)
 guide – the guide (callable containing Pyro primitives)
 optim (pyro.optim.PyroOptim) – a wrapper for a PyTorch optimizer
 loss – either a string specifying the loss function to be used (currently the only supported built-in loss is "ELBO") or a user-provided loss function; if a built-in loss is specified, loss_and_grads will be filled in accordingly
 loss_and_grads – if specified, this user-provided callable computes gradients for use in step() and marks which parameters in the param store are to be optimized
A unified interface for stochastic variational inference in Pyro. Most users will interact with SVI via the argument loss="ELBO". See the tutorial SVI Part I for a discussion.
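As a minimal sketch of this interface, consider a beta–bernoulli model for coin flips (the site names, initial values, and learning rate are illustrative, and the lowercase distribution style assumes the Pyro 0.x API):

    import torch
    from torch.autograd import Variable
    import pyro
    import pyro.distributions as dist
    from pyro.infer import SVI
    from pyro.optim import Adam

    # six coin flips: 1 = heads, 0 = tails
    data = [Variable(torch.Tensor([x])) for x in [1., 1., 1., 1., 0., 0.]]

    def model(data):
        # beta prior over the coin's fairness
        f = pyro.sample("latent_fairness", dist.beta,
                        Variable(torch.Tensor([10.0])),
                        Variable(torch.Tensor([10.0])))
        for i in range(len(data)):
            # condition on each observed flip
            pyro.observe("obs_{}".format(i), dist.bernoulli, data[i], f)

    def guide(data):
        # variational parameters, registered in the param store;
        # kept positive via exp since beta parameters must be positive
        log_alpha_q = pyro.param("log_alpha_q",
                                 Variable(torch.Tensor([2.7]), requires_grad=True))
        log_beta_q = pyro.param("log_beta_q",
                                Variable(torch.Tensor([2.7]), requires_grad=True))
        pyro.sample("latent_fairness", dist.beta,
                    torch.exp(log_alpha_q), torch.exp(log_beta_q))

    svi = SVI(model, guide, Adam({"lr": 0.0005}), loss="ELBO")
    for step in range(2000):
        loss = svi.step(data)  # one gradient step on the variational parameters

Each call to step() runs the model and guide, forms the loss, backpropagates, and updates the parameters the guide registered via pyro.param.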
ELBO¶

class ELBO(num_particles=1, trace_graph=False, enum_discrete=False)[source]¶
Bases: object
Parameters:  num_particles – the number of particles (samples) used to form the ELBO estimator.
 trace_graph (bool) – whether to keep track of dependency information when running the model and guide. This information can be used to form a gradient estimator with lower variance in the case that some of the random variables are non-reparameterized. Note: for a model with many random variables, keeping track of the dependency information can be expensive. See the tutorial SVI Part III for a discussion.
 enum_discrete (bool) – whether to sum over discrete latent variables, rather than sample them.
ELBO is the top-level interface for stochastic variational inference via optimization of the evidence lower bound. Most users will not interact with ELBO directly; instead they will interact with SVI. ELBO dispatches to Trace_ELBO and TraceGraph_ELBO, where the internal implementations live.
Warning
enum_discrete is a bleeding-edge feature. See SSVAE for a discussion.
References
[1] Automated Variational Inference in Probabilistic Programming, David Wingate, Theo Weber
[2] Black Box Variational Inference, Rajesh Ranganath, Sean Gerrish, David M. Blei
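As a sketch of direct construction (SVI with loss="ELBO" builds an equivalent object internally):

    from pyro.infer import ELBO

    # single-sample estimator; dispatches to Trace_ELBO
    elbo = ELBO(num_particles=1)

    # multi-sample estimator that also tracks dependency information,
    # dispatching to TraceGraph_ELBO (can lower gradient variance when
    # some random variables are non-reparameterized)
    elbo_graph = ELBO(num_particles=10, trace_graph=True)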

loss(model, guide, *args, **kwargs)[source]¶
Evaluates the ELBO with an estimator that uses num_particles many samples/particles, where num_particles is specified in the constructor.
Returns: an estimate of the ELBO
Return type: float
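For example, a hedged sketch reusing the model, guide, and data from the SVI example above:

    from pyro.infer import ELBO

    elbo = ELBO(num_particles=100)
    # a float estimate of the ELBO; more particles means lower variance
    elbo_estimate = elbo.loss(model, guide, data)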

loss_and_grads(model, guide, *args, **kwargs)[source]¶
Computes the ELBO as well as the surrogate ELBO that is used to form the gradient estimator, and performs backward on the latter. num_particles many samples are used to form the estimators, where num_particles is specified in the constructor.
Returns: an estimate of the ELBO
Return type: float
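Roughly, this is what SVI.step drives on each iteration; a sketch, again reusing model, guide, and data from above:

    from pyro.infer import ELBO

    elbo = ELBO(num_particles=1)
    # computes the surrogate objective and calls backward on it, populating
    # .grad on the parameters the guide registered via pyro.param; SVI.step
    # then applies the wrapped PyTorch optimizer to those parameters
    loss = elbo.loss_and_grads(model, guide, data)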
Importance¶

class Importance(model, guide=None, num_samples=None)[source]¶
Bases: pyro.infer.abstract_infer.TracePosterior
Parameters:  model – probabilistic model defined as a function
 guide – guide used for sampling, defined as a function
 num_samples – number of samples to draw from the guide (default 10)
This class performs posterior inference by importance sampling, using the guide as the proposal distribution. If no guide is provided, it defaults to proposing from the model’s prior.
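A minimal sketch of drawing approximate posterior samples, assuming pyro.infer.Marginal is available (as used in the Intro II tutorial) and a model whose return value is the quantity of interest:

    from pyro.infer import Importance, Marginal

    # with guide=None the proposal distribution is the model's prior
    posterior = Importance(model, guide=None, num_samples=100)

    # Marginal turns the weighted traces from the importance sampler into
    # samples from the approximate posterior marginal over return values
    marginal = Marginal(posterior)
    posterior_sample = marginal(data)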