networks#

A rich collection of neural network architectures for use in Approximators.

Classes

ConsistencyModel(*args, **kwargs)

Implements a Consistency Model with Consistency Training (CT) as described in [1-2].

CouplingFlow(*args, **kwargs)

Implements a coupling flow as a sequence of dual couplings with permutations and activation normalization.
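As a conceptual sketch (not the library's implementation), a single affine coupling splits the input, transforms one half conditioned on the other, and remains exactly invertible. The constant `shift` and `log_scale` stand in for outputs of a conditioner network and are purely illustrative:

```python
import numpy as np

def affine_coupling_forward(x, shift, log_scale):
    """One affine coupling: transform the second half of x, leave the first intact.

    In a real coupling flow, `shift` and `log_scale` are produced by a small
    network applied to x1; constants keep this sketch self-contained.
    """
    x1, x2 = np.split(x, 2)
    z2 = x2 * np.exp(log_scale) + shift
    return np.concatenate([x1, z2])

def affine_coupling_inverse(z, shift, log_scale):
    """Exact inverse of the forward coupling."""
    z1, z2 = np.split(z, 2)
    x2 = (z2 - shift) * np.exp(-log_scale)
    return np.concatenate([z1, x2])

x = np.array([0.5, -1.0, 2.0, 0.1])
z = affine_coupling_forward(x, shift=0.3, log_scale=0.7)
x_rec = affine_coupling_inverse(z, shift=0.3, log_scale=0.7)
```

Stacking such couplings with permutations between them (as the class summary describes) lets every dimension eventually be transformed.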

DeepSet(*args, **kwargs)

Implements a deep set encoder introduced in [1] for learning permutation-invariant representations of set-based data, as generated by exchangeable models.
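The deep set idea can be illustrated with a minimal numpy sketch (hypothetical weights, not the library's architecture): embed each set element with a shared function, sum-pool, then decode. Because pooling is a sum, the output is invariant to the order of the elements:

```python
import numpy as np

rng = np.random.default_rng(0)
W_phi = rng.normal(size=(3, 8))   # shared per-element encoder weights (illustrative)
W_rho = rng.normal(size=(8, 4))   # post-pooling decoder weights (illustrative)

def deep_set(x):
    """f(X) = rho(sum_i phi(x_i)) -- a permutation-invariant set summary."""
    h = np.tanh(x @ W_phi)        # phi applied independently to each element
    pooled = h.sum(axis=0)        # order-independent pooling
    return np.tanh(pooled @ W_rho)

x = rng.normal(size=(5, 3))                 # a set of 5 elements, 3 features each
out = deep_set(x)
out_perm = deep_set(x[[4, 2, 0, 3, 1]])     # same set, shuffled order
```

Shuffling the set leaves the summary unchanged, which is exactly the property needed for exchangeable data.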

FlowMatching(*args, **kwargs)

Implements Optimal Transport Flow Matching, originally introduced as Rectified Flow, with ideas incorporated from [1-3].
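The core training target of rectified flow can be sketched in a few lines (a conceptual illustration, not the class's training loop): points are interpolated linearly between noise and data, and a network would regress the constant velocity along that straight path:

```python
import numpy as np

rng = np.random.default_rng(1)
x0 = rng.normal(size=(16, 2))          # samples from the noise distribution
x1 = rng.normal(size=(16, 2)) + 3.0    # stand-in "data" samples (illustrative)
t = rng.uniform(size=(16, 1))          # random interpolation times in [0, 1]

# Straight-line (optimal-transport / rectified-flow) interpolation path
x_t = (1.0 - t) * x0 + t * x1

# Regression target for the learned velocity field v(x_t, t)
v_target = x1 - x0
```

Following the target velocity from any `x_t` for the remaining time `1 - t` lands exactly on the data point, which is what the assertion below checks.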

FusionTransformer(*args, **kwargs)

Implements a more flexible version of the TimeSeriesTransformer that applies a series of self-attention layers followed by cross-attention between the representation and a learnable template summarized via a recurrent net.

InferenceNetwork(*args, **kwargs)

MLP(*args, **kwargs)

Implements a simple configurable MLP with optional residual connections and dropout.
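A minimal sketch of the residual-connection idea (hypothetical weights and class, not the library's `MLP`): each layer's output is added back to its input, so depth can be increased without losing the identity path:

```python
import numpy as np

rng = np.random.default_rng(2)

class ResidualMLP:
    """Toy MLP with optional residual connections (conceptual sketch only)."""

    def __init__(self, width, depth, residual=True):
        self.weights = [rng.normal(scale=0.1, size=(width, width))
                        for _ in range(depth)]
        self.residual = residual

    def __call__(self, x):
        for w in self.weights:
            h = np.tanh(x @ w)           # one hidden layer
            x = x + h if self.residual else h   # skip connection if enabled
        return x

mlp = ResidualMLP(width=4, depth=3)
x = rng.normal(size=(2, 4))
y = mlp(x)
```

Residual connections require matching input and output widths per block, which is why the sketch keeps a fixed `width` throughout.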

PointInferenceNetwork(*args, **kwargs)

Implements point estimation for user-specified scoring rules via a shared feed-forward architecture with a separate head for each scoring rule.

SetTransformer(*args, **kwargs)

Implements the set transformer architecture from [1] which ultimately represents a learnable permutation-invariant function.

SummaryNetwork(*args, **kwargs)

TimeSeriesNetwork(*args, **kwargs)

Implements an LSTNet architecture as described in [1].

TimeSeriesTransformer(*args, **kwargs)

Implements a standard transformer coupled with Time2Vec embeddings of time, used to flexibly compress time series.
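The Time2Vec embedding used here maps a scalar time to one linear component plus several periodic ones. A self-contained sketch (the frequencies `w` and phases `b` are learnable in practice; fixed values are assumed here for illustration):

```python
import numpy as np

def time2vec(t, w, b):
    """Time2Vec: [w0*t + b0, sin(w1*t + b1), ..., sin(wk*t + bk)].

    The first component captures trend; the sine components capture
    periodic structure at learned frequencies.
    """
    linear = w[0] * t + b[0]
    periodic = np.sin(w[1:] * t + b[1:])
    return np.concatenate([[linear], periodic])

w = np.array([0.5, 1.0, 2.0, 4.0])   # illustrative frequencies
b = np.array([0.0, 0.1, 0.2, 0.3])   # illustrative phases
emb = time2vec(1.5, w, b)
```

The resulting embedding is concatenated with (or added to) each time step's features before the transformer layers, giving the attention mechanism an explicit notion of time.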