Standardize#
- class bayesflow.adapters.transforms.Standardize(mean: int | float | ndarray = None, std: int | float | ndarray = None, axis: int = None, momentum: float | None = 0.99)[source]#
Bases: ElementwiseTransform
Transform that, when applied, standardizes data using typical z-score standardization, i.e., for some unstandardized data x the standardized version z would be
>>> z = (x - mean(x)) / std(x)
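For illustration, the same formula with plain NumPy on a small hypothetical array (not part of the library API):

>>> import numpy as np
>>> x = np.array([2.0, 4.0, 6.0, 8.0])
>>> z = (x - np.mean(x)) / np.std(x)
>>> # z is approximately [-1.34, -0.45, 0.45, 1.34], with mean 0 and std 1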
- Parameters:
- mean : int or float, optional
Specify a mean if known; otherwise it will be estimated from the data.
- std : int or float, optional
Specify a standard deviation if known; otherwise it will be estimated from the data.
- axis : int, optional
A specific axis along which standardization should take place. By default, standardization happens individually for each dimension.
- momentum : float in (0, 1)
The momentum used for updating the running estimates of the mean and standard deviation during training.
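To make the role of momentum concrete, here is a minimal NumPy sketch of z-score standardization with momentum-based running statistics. This assumes an exponential-moving-average update similar to batch normalization; it is an illustration of the idea, not the library's actual implementation.

>>> import numpy as np
>>> def standardize_update(x, running_mean, running_std, momentum=0.99):
...     # statistics of the current batch (per dimension)
...     batch_mean = np.mean(x, axis=0)
...     batch_std = np.std(x, axis=0)
...     # exponential moving average; higher momentum means slower updates
...     running_mean = momentum * running_mean + (1 - momentum) * batch_mean
...     running_std = momentum * running_std + (1 - momentum) * batch_std
...     # standardize the current batch with its own statistics
...     z = (x - batch_mean) / batch_std
...     return z, running_mean, running_std
>>> batch = np.random.default_rng(1).normal(5.0, 2.0, size=(32, 3))
>>> z, m, s = standardize_update(batch, running_mean=np.zeros(3), running_std=np.ones(3))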
Examples
1) Standardize all variables using their individually estimated means and stds.
>>> adapter = (
...     bf.adapters.Adapter()
...     .standardize()
... )
2) Standardize all variables with the same known mean and std.
>>> adapter = (
...     bf.adapters.Adapter()
...     .standardize(mean=5, std=10)
... )
3) Mix of fixed and estimated means/stds. Suppose we have priors for “beta” and “sigma” for which we know the means and stds. However, for all other variables the means and stds are unknown. Then standardize should be applied in several stages, specifying which variables to include or exclude in each call.
>>> adapter = (
...     bf.adapters.Adapter()
...     # mean fixed, std estimated
...     .standardize(include="beta", mean=1)
...     # both mean and std fixed
...     .standardize(include="sigma", mean=0.6, std=3)
...     # both means and stds estimated for all other variables
...     .standardize(exclude=["beta", "sigma"])
... )
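The plain-NumPy sketch below illustrates what this staged configuration amounts to, using hypothetical data and variable names (it shows the intended effect, not the adapter's internals):

>>> import numpy as np
>>> rng = np.random.default_rng(0)
>>> data = {
...     "beta": rng.normal(1.0, 2.0, size=1000),
...     "sigma": rng.normal(0.6, 3.0, size=1000),
...     "theta": rng.normal(-4.0, 0.5, size=1000),
... }
>>> # "beta": mean fixed at 1, std estimated from the data
>>> z_beta = (data["beta"] - 1.0) / np.std(data["beta"])
>>> # "sigma": both mean and std fixed
>>> z_sigma = (data["sigma"] - 0.6) / 3.0
>>> # "theta" (all other variables): both mean and std estimated
>>> z_theta = (data["theta"] - np.mean(data["theta"])) / np.std(data["theta"])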
- classmethod from_config(config: dict, custom_objects=None) → Standardize [source]#