TorchLean API

NN.Spec.Module.Dropout

Dropout as NNModuleSpecs (deterministic spec variants)

PyTorch's dropout is stochastic during training and becomes the identity during evaluation. In the spec layer we often want a deterministic, pure semantics that can be composed into models and used in proofs without introducing randomness.

This file wraps the deterministic dropout specs from NN/Spec/Layers/Dropout.lean as NNModuleSpecs so they can be used in SpecChain pipelines and carry export metadata.

Two variants are provided:

Deterministic inference-time dropout wrapper.

This is a pure scaling op (x ↦ (1 - p) * x). It is a deterministic surrogate used in some spec models, rather than a replication of PyTorch's train/eval mode switching.
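As a minimal sketch, the inference-time forward is just pointwise scaling of the input. The `Tensor.map` helper and the type-class assumptions below are illustrative, not necessarily this repository's actual API:

```lean
-- Sketch only: assumes a `Tensor α s` type with a pointwise `map`;
-- names are illustrative, not taken from this repo.
def dropoutInferenceForward {α : Type} [Mul α] [Sub α] [OfNat α 1]
    {s : Shape} (p : α) (x : Tensor α s) : Tensor α s :=
  -- x ↦ (1 - p) * x : deterministic, no mask, no randomness
  Tensor.map (fun v => (1 - p) * v) x
```

Because the op is a pure function of `p` and `x`, it composes into SpecChain pipelines and proof goals without any RNG state.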

    def Spec.DropoutMaskedModuleSpec {α : Type} [Context α] {s : Shape} (p : α) (mask : Tensor Bool s) :

    Deterministic masked dropout wrapper (mask is captured as data).

    This matches the usual training-time dropout structure, with the mask made explicit as data instead of sampled. The forward uses the same scaling and epsilon-protection as dropout_masked_spec.
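    A sketch of the masked forward, under the same illustrative assumptions as above (`Tensor.zipWith` and the `ε` parameter are hypothetical names; the source only says the scaling is epsilon-protected):

    ```lean
    -- Sketch only: keep entries where the mask is true, zero the rest,
    -- and rescale by 1 / (1 - p), with a small ε guarding the division
    -- when p is close to 1. Names are illustrative, not this repo's API.
    def dropoutMaskedForward {α : Type} [Mul α] [Sub α] [Add α] [Div α]
        [OfNat α 0] [OfNat α 1] {s : Shape} (p ε : α)
        (mask : Tensor Bool s) (x : Tensor α s) : Tensor α s :=
      Tensor.zipWith (fun keep v => if keep then v / (1 - p + ε) else 0) mask x
    ```

    Capturing the mask as data means two runs with the same mask are definitionally equal, which is what makes this variant usable in proofs where PyTorch's sampled mask would not be.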

    Instances For