Parameters

Parameters in Pyro are basically thin wrappers around PyTorch Tensors that carry unique names. As such, Parameters are the primary stateful objects in Pyro. Users typically interact with parameters via the Pyro primitive pyro.param. Parameters play a central role in stochastic variational inference, where they are used to represent point estimates for the parameters in parameterized families of models and guides.
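
For example, a guide might register its point estimates via pyro.param as follows (a minimal sketch; the parameter and sample site names are illustrative):

import torch
import pyro
import pyro.distributions as dist
from torch.distributions import constraints

def guide():
    # pyro.param registers (or looks up) a named parameter in the global
    # ParamStore; loc and scale are the point estimates learned by SVI
    loc = pyro.param("loc", torch.zeros(2))
    scale = pyro.param("scale", torch.ones(2), constraint=constraints.positive)
    pyro.sample("z", dist.Normal(loc, scale).to_event(1))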

ParamStore

class ParamStoreDict[source]

Bases: object

Global store for parameters in Pyro. This is basically a key-value store. The typical user interacts with the ParamStore primarily through the primitive pyro.param.

See Intro Part II for further discussion and SVI Part I for some examples.

Some things to bear in mind when using parameters in Pyro (a short example follows the list):

  • parameters must be assigned unique names
  • the init_tensor argument to pyro.param is only used the first time that a given (named) parameter is registered with Pyro.
  • for this reason, a user may need to use the clear() method if working in a REPL in order to get the desired behavior; this method can also be invoked with pyro.clear_param_store().
  • the internal name of a parameter within a PyTorch nn.Module that has been registered with Pyro is prefixed with the Pyro name of the module, so nothing prevents the user from having two different modules, each of which contains a parameter named weight. By contrast, a user can only have one top-level parameter named weight (outside of any module).
  • parameters can be saved and loaded from disk using save and load.
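
The following sketch illustrates the first three points above (the parameter name x is illustrative):

import torch
import pyro

pyro.clear_param_store()             # start fresh, e.g. at the top of a REPL session

x = pyro.param("x", torch.zeros(3))  # first registration: the init tensor is used
y = pyro.param("x", torch.ones(3))   # "x" already exists: the init tensor is ignored
assert torch.equal(x, y)

pyro.clear_param_store()             # forget "x" ...
z = pyro.param("x", torch.ones(3))   # ... so the init tensor is used again
assert torch.equal(z, torch.ones(3))
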
clear()[source]

Clear the ParamStore

items()[source]

Iterate over (name, constrained_param) pairs.

keys()[source]

Iterate over param names.

values()[source]

Iterate over constrained parameter values.

setdefault(name, init_constrained_value, constraint=Real())[source]

Retrieve a constrained parameter value from the ParamStore if it exists, otherwise set it to the initial value. Note that this is a little fancier than dict.setdefault().

If the parameter already exists, init_constrained_value will be ignored. To avoid expensive creation of init_constrained_value you can wrap it in a lambda that will only be evaluated if the parameter does not already exist:

param_store.setdefault("foo",
                       lambda: (0.001 * torch.randn(1000, 1000)).exp(),
                       constraint=constraints.positive)
Parameters:
  • name (str) – parameter name
  • init_constrained_value (torch.Tensor or a callable returning a torch.Tensor) – initial constrained value
  • constraint (Constraint) – torch constraint object
Returns: constrained parameter value
Return type: torch.Tensor

named_parameters()[source]

Returns an iterator over (name, unconstrained_value) tuples for each parameter in the ParamStore.
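
To see the difference between the two views, compare items() with named_parameters() (a minimal sketch, assuming a single positive-constrained parameter):

import torch
import pyro
from torch.distributions import constraints

pyro.clear_param_store()
pyro.param("scale", torch.ones(2), constraint=constraints.positive)

store = pyro.get_param_store()
for name, value in store.items():
    print(name, value)   # constrained value (positive)
for name, value in store.named_parameters():
    print(name, value)   # underlying unconstrained leaf tensor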

get_all_param_names()[source]

replace_param(param_name, new_param, old_param)[source]

get_param(name, init_tensor=None, constraint=Real(), event_dim=None)[source]

Get parameter from its name. If it does not yet exist in the ParamStore, it will be created and stored. The Pyro primitive pyro.param dispatches to this method.

Parameters:
  • name (str) – parameter name
  • init_tensor (torch.Tensor) – initial tensor
  • constraint (Constraint) – torch constraint
  • event_dim (int) – (ignored)
Returns: parameter
Return type: torch.Tensor
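
For example, the following is essentially equivalent to calling pyro.param directly (a sketch; the name rate is illustrative):

import torch
import pyro
from torch.distributions import constraints

store = pyro.get_param_store()
rate = store.get_param("rate", torch.tensor(1.0), constraint=constraints.positive)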

match(name)[source]

Get all parameters whose names match the given regular expression. The parameters must already exist.

Parameters: name (str) – regular expression
Returns: dict mapping each matching parameter name to its torch.Tensor value
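
For example (a sketch; the parameter names are illustrative):

import torch
import pyro

pyro.clear_param_store()
pyro.param("decoder.weight", torch.zeros(2, 2))
pyro.param("decoder.bias", torch.zeros(2))
pyro.param("encoder.weight", torch.zeros(2, 2))

store = pyro.get_param_store()
decoder_params = store.match(r"decoder\..*")  # keys: "decoder.weight", "decoder.bias"
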
param_name(p)[source]

Get parameter name from parameter

Parameters: p – parameter
Returns: parameter name

get_state()[source]

Get the ParamStore state.

set_state(state)[source]

Set the ParamStore state using state from a previous get_state() call.
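
A minimal sketch of snapshotting and restoring the store (the parameter name is illustrative):

import copy
import torch
import pyro

pyro.clear_param_store()
pyro.param("loc", torch.zeros(2))
store = pyro.get_param_store()

snapshot = copy.deepcopy(store.get_state())  # snapshot before further optimization
# ... parameters get updated during training ...
store.set_state(snapshot)                    # roll back to the snapshot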

save(filename)[source]

Save parameters to disk

Parameters: filename (str) – file name to save to
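
For example, pairing with the load() call shown below (the filename is illustrative):

pyro.get_param_store().save('saved_params.save')
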
load(filename, map_location=None)[source]

Load parameters from disk

Note

If using pyro.module() on parameters loaded from disk, be sure to set the update_module_params flag:

pyro.get_param_store().load('saved_params.save')
pyro.module('module', nn, update_module_params=True)
Parameters:
  • filename (str) – file name to load from
  • map_location (function, torch.device, string or a dict) – specifies how to remap storage locations
param_with_module_name(pyro_name, param_name)[source]

module_from_param_with_module_name(param_name)[source]

user_param_name(param_name)[source]
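
These module-level helpers compose and decompose the fully qualified names that pyro.module() gives to parameters living inside an nn.Module. A rough sketch of how they relate (the divider shown in the comment reflects Pyro's internal convention at the time of writing and may change):

from pyro.params.param_store import (
    param_with_module_name,
    module_from_param_with_module_name,
    user_param_name,
)

full_name = param_with_module_name("decoder", "weight")  # e.g. "decoder$$$weight"
assert module_from_param_with_module_name(full_name) == "decoder"
assert user_param_name(full_name) == "weight"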