pyzag.stochastic
Tools for converting deterministic models implemented in PyTorch into stochastic models
- class pyzag.stochastic.HierarchicalStatisticalModel(base, parameter_mapper, noise_prior, update_mask=False)
Converts a torch model into a Pyro-based hierarchical statistical model; a minimal construction sketch follows the parameter list below
Eventually the plan is to let the user provide a dictionary instead of a single parameter_mapper
- Parameters:
base (torch.nn.Module) – base torch module
parameter_mapper (MapParameter) – mapper class describing how to convert from Parameter to Distribution
noise_prior (float) – scale prior for white noise
- Keyword Arguments:
update_mask (bool) – if True, update the mask to remove samples that are not valid
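A minimal construction sketch under stated assumptions: the `LinearModel` class, its single parameter, and the numeric values are purely illustrative, and whether a plain `torch.nn.Module` like this is a sufficient base model (rather than a full pyzag recursive model) is an assumption.

```python
import torch

from pyzag import stochastic


# Hypothetical stand-in for a deterministic base model; in practice this would
# be whatever torch.nn.Module you already evaluate deterministically.
class LinearModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.slope = torch.nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        return self.slope * x


# Map each named parameter to a hierarchical normal distribution with a 10%
# coefficient of variation on the priors (see MapNormal below).
mapper = stochastic.MapNormal(0.1)

# Wrap the deterministic model as a Pyro hierarchical statistical model with a
# white-noise scale prior of 0.01 (an illustrative value).
model = stochastic.HierarchicalStatisticalModel(LinearModel(), mapper, 0.01)
```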
- forward(*args, results=None, weights=None, **kwargs)
Calls the base forward with the appropriate arguments; see the inference sketch after the keyword arguments below
- Parameters:
*args – whatever arguments the underlying model needs; at least one must be a tensor so the correct batch shapes can be inferred
- Keyword Arguments:
results (torch.tensor or None) – results to condition on
weights (torch.tensor or None) – weights on the results, default all ones
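A hedged sketch of conditioning on observed results and fitting with Pyro SVI, continuing the construction sketch above; the input tensor, the synthetic observations, and the AutoNormal/Adam/Trace_ELBO choices are illustrative assumptions rather than part of the documented API.

```python
import pyro
import torch

# At least one positional argument must be a tensor so batch shapes can be
# inferred; these data are synthetic placeholders.
x = torch.linspace(0.0, 1.0, 50)
observed = 2.0 * x + 0.05 * torch.randn(50)

# A generic variational guide and optimizer; any Pyro-compatible choice works.
guide = pyro.infer.autoguide.AutoNormal(model)
svi = pyro.infer.SVI(
    model,
    guide,
    pyro.optim.Adam({"lr": 1.0e-2}),
    loss=pyro.infer.Trace_ELBO(),
)

for _ in range(500):
    # Passing results= conditions the model on the observations; weights
    # defaults to all ones when omitted.
    svi.step(x, results=observed)
```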
- class pyzag.stochastic.MapNormal(cov, loc_suffix='_loc', scale_suffix='_scale')
A map between a deterministic torch parameter and a two-scale normal distribution; a short naming sketch follows the keyword arguments below
- Parameters:
cov – coefficient of variation used to define the scale priors
- Keyword Arguments:
sep (str) – separator character in names
loc_suffix (str) – suffix to add to the parameter name to give the upper-level distribution for the location
scale_suffix (str) – suffix to add to the parameter name to give the upper-level distribution for the scale
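A short sketch of the naming convention under an assumption: for a parameter registered as "slope", the default suffixes would presumably yield upper-level variables named "slope_loc" and "slope_scale".

```python
from pyzag import stochastic

# Coefficient of variation of 0.1 sets the scale priors; the suffixes shown
# here are the defaults, spelled out for clarity.
mapper = stochastic.MapNormal(0.1, loc_suffix="_loc", scale_suffix="_scale")

# For a parameter named "slope" this mapping would presumably produce
# upper-level variables named "slope_loc" and "slope_scale".
```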