bayesflow.wrappers module#

class bayesflow.wrappers.SpectralNormalization(*args, **kwargs)[source]#

Bases: Wrapper

Performs spectral normalization on neural network weights. Adapted from:

https://www.tensorflow.org/addons/api_docs/python/tfa/layers/SpectralNormalization

This wrapper controls the Lipschitz constant of a layer by constraining its spectral norm, which can stabilize the training of generative networks.

See [Spectral Normalization for Generative Adversarial Networks](https://arxiv.org/abs/1802.05957).
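The idea behind the wrapper can be sketched in a few lines of NumPy (an illustrative sketch only, not the library's implementation): dividing a weight matrix by its largest singular value yields a matrix with spectral norm 1, so the linear map it defines is 1-Lipschitz.

```python
import numpy as np

# Illustrative sketch: spectral normalization rescales a weight matrix
# by its largest singular value, constraining its spectral norm to 1.
rng = np.random.default_rng(seed=42)
W = rng.normal(size=(64, 32))          # example weight matrix

sigma = np.linalg.norm(W, ord=2)       # largest singular value of W
W_sn = W / sigma                       # spectrally normalized weights

# The normalized matrix has spectral norm 1 (up to floating point error).
print(np.linalg.norm(W_sn, ord=2))
```

Because the spectral norm bounds how much the layer can stretch its inputs, keeping it at 1 limits the layer's Lipschitz constant, which is the stabilizing effect the wrapper exploits.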

__init__(layer, power_iterations=1, **kwargs)[source]#
build(input_shape)[source]#

Builds the layer.

call(inputs, training=False)[source]#

Calls the layer on the given inputs.

Parameters:
inputs : tf.Tensor of shape (None, ..., condition_dim + target_dim)

The inputs to the corresponding layer.

normalize_weights()[source]#

Generate spectral normalized weights.

This method updates self.w with the spectrally normalized weights, so that the layer is ready for call().
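The largest singular value is estimated cheaply via power iteration rather than a full SVD. A minimal NumPy sketch of this step, mirroring the tfa.layers.SpectralNormalization reference the wrapper is adapted from (variable names `W`, `u`, `v` are illustrative, not the layer's attributes):

```python
import numpy as np

def spectral_normalize(W, power_iterations=1, eps=1e-12):
    """Rescale W by an estimate of its largest singular value.

    The estimate is obtained with `power_iterations` rounds of power
    iteration; more iterations give a tighter estimate at extra cost.
    """
    u = np.random.default_rng(0).normal(size=(W.shape[0],))
    for _ in range(power_iterations):
        v = W.T @ u
        v /= np.linalg.norm(v) + eps   # right singular vector estimate
        u = W @ v
        u /= np.linalg.norm(u) + eps   # left singular vector estimate
    sigma = u @ W @ v                  # approximates the top singular value
    return W / sigma

W = np.random.default_rng(1).normal(size=(16, 8))
W_sn = spectral_normalize(W, power_iterations=50)
# With enough iterations, the spectral norm of W_sn approaches 1.
print(np.linalg.norm(W_sn, ord=2))
```

A single iteration per call (the wrapper's default `power_iterations=1`) is usually sufficient in practice, since the persistent vector carries over between training steps and keeps refining the estimate.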

get_config()[source]#

Returns the config of the layer.

A layer config is a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Network (one layer of abstraction above).

Note that get_config() is not guaranteed to return a fresh copy of the dict each time it is called. Callers should make a copy of the returned dict if they want to modify it.

Returns:

Python dictionary.