bayesflow.wrappers module#

class bayesflow.wrappers.SpectralNormalization(*args, **kwargs)[source]#

Bases: Wrapper

Performs spectral normalization on neural network weights. Adapted from:

This wrapper controls the Lipschitz constant of a layer by constraining its spectral norm, which can stabilize the training of generative networks.

See Spectral Normalization for Generative Adversarial Networks (Miyato et al., 2018, https://arxiv.org/abs/1802.05957) for background.
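The effect of the constraint can be illustrated without Keras: dividing a weight matrix by its largest singular value bounds the spectral norm by 1, so the corresponding linear map is 1-Lipschitz. A minimal NumPy sketch (NumPy is assumed available; this is a stand-in for the TensorFlow ops, not the wrapper's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4))  # kernel of a hypothetical dense layer

# Exact spectral norm (largest singular value) for reference
sigma = np.linalg.svd(w, compute_uv=False)[0]
w_bar = w / sigma  # spectrally normalized kernel

# The normalized map x -> x @ w_bar is 1-Lipschitz:
x1 = rng.normal(size=(1, 8))
x2 = rng.normal(size=(1, 8))
lhs = np.linalg.norm(x1 @ w_bar - x2 @ w_bar)
rhs = np.linalg.norm(x1 - x2)
assert lhs <= rhs  # distances cannot grow through the normalized layer
```

In practice the wrapper approximates sigma with a few power iterations per step instead of a full SVD, which is far cheaper for large kernels.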

__init__(layer, power_iterations=1, **kwargs)[source]#

Creates the wrapper around layer, approximating the spectral norm of its kernel with power_iterations rounds of power iteration per update.

call(inputs, training=False)[source]#

Applies the wrapped layer after normalizing its weights.

inputs : tf.Tensor of shape (None, ..., condition_dim + target_dim)

The inputs to the wrapped layer.


normalize_weights()[source]#

Generate spectrally normalized weights.

This method updates the value of self.w with the spectrally normalized value, so that the layer is ready for call().
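The update described above can be sketched as a power iteration followed by rescaling. This is a NumPy illustration under stated assumptions: the function name mirrors the method, but the variables u, v and the matrix orientation are simplifications of the wrapper's internal state, not its actual implementation:

```python
import numpy as np

def normalize_weights(w, u, power_iterations=1):
    """Approximate the spectral norm of `w` by power iteration and
    return (w / sigma, updated u) -- the analogue of updating self.w."""
    for _ in range(power_iterations):
        v = w.T @ u
        v /= np.linalg.norm(v)
        u = w @ v
        u /= np.linalg.norm(u)
    sigma = u @ w @ v  # estimate of the largest singular value
    return w / sigma, u

rng = np.random.default_rng(1)
w = rng.normal(size=(6, 3))
u = rng.normal(size=(6,))  # persistent vector, reused across updates
w_bar, u = normalize_weights(w, u, power_iterations=100)
# With enough iterations the normalized kernel has spectral norm close to 1
```

Keeping u between calls is what makes a single power iteration per training step sufficient in practice: the estimate is refined incrementally as the weights change slowly.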


get_config()[source]#

Returns the config of the layer.

A layer config is a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Network (one layer of abstraction above).

Note that get_config() is not guaranteed to return a fresh copy of the dict every time it is called. Callers should make a copy of the returned dict if they want to modify it.


Returns: Python dictionary.