summary
Neural networks for learning maximally informative compressions of data modalities such as images, time series, sets, and combinations thereof.
Classes

- A convolutional summary network with residual blocks.
- (SN) Implements a deep set encoder, introduced in [1], for learning permutation-invariant representations of set-based data, as generated by exchangeable models.
- (SN) Wraps multiple summary networks (backbones) to learn summary statistics from multi-modal data.
- (SN) Implements a more flexible version of the TimeSeriesTransformer that applies a series of self-attention layers, followed by cross-attention between the representation and a learnable template summarized via a recurrent network.
- (SN) Implements the set transformer architecture from [1], which ultimately represents a learnable permutation-invariant function.
- Abstract base class for all summary networks in BayesFlow.
- (SN) Implements an LSTNet architecture as described in [1].
- (SN) Creates a regular transformer coupled with Time2Vec embeddings of time, used to flexibly compress time series.
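Several of the classes above (the deep set encoder and the set transformer) rely on the same core idea: embedding each set element independently and then pooling with an order-insensitive operation, so the learned summary is invariant to permutations of exchangeable data. A minimal NumPy sketch of this structure, with hypothetical weights `W_phi` and `W_rho` standing in for trained layers (not part of the BayesFlow API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a tiny deep set encoder: f(X) = rho(mean_i phi(x_i)).
W_phi = rng.normal(size=(4, 8))   # per-element embedding phi
W_rho = rng.normal(size=(8, 3))   # readout rho applied to the pooled embedding

def deep_set_summary(x):
    """x: (set_size, 4) array of exchangeable elements -> (3,) summary vector."""
    h = np.tanh(x @ W_phi)        # embed each set element independently
    pooled = h.mean(axis=0)       # permutation-invariant pooling (mean over elements)
    return pooled @ W_rho         # map pooled embedding to summary statistics

x = rng.normal(size=(10, 4))
perm = rng.permutation(10)

# Shuffling the set elements leaves the summary unchanged.
assert np.allclose(deep_set_summary(x), deep_set_summary(x[perm]))
```

The actual classes replace the fixed matrices with trainable layers (or attention blocks, in the set transformer case), but the invariance argument is the same: any permutation of the rows of `x` commutes with element-wise embedding and is absorbed by the symmetric pooling step.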