Dropout as NNModuleSpecs (deterministic spec variants) #
PyTorch's dropout is stochastic during training and becomes identity during evaluation. In the spec layer we often want a deterministic, pure meaning that can be composed into models and used in proofs without introducing randomness.
This file wraps the deterministic dropout specs from NN/Spec/Layers/Dropout.lean as NNModuleSpecs so they can be used in SpecChain pipelines and carry export metadata.
Two variants are provided:
- DropoutInferenceModuleSpec p: a deterministic "scale by keep probability" variant.
- DropoutMaskedModuleSpec p mask: a deterministic "training-style" dropout that takes the mask explicitly (useful when you want to model a particular dropout pattern).
Deterministic inference-time dropout wrapper.
This is a pure scaling op (x ↦ (1 - p) * x). It is a deterministic surrogate used in some spec models, rather than a model of PyTorch's stateful training/eval mode switching.
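As a rough illustration of the intended meaning, the inference-time variant is just elementwise scaling by the keep probability. The definition below is a hypothetical sketch, not the file's actual NNModuleSpec wrapper; the name and the use of Array Float as the tensor type are illustrative assumptions.

```lean
-- Hypothetical sketch of the inference-time dropout meaning:
-- pure scaling of every entry by the keep probability (1 - p).
def dropoutInferenceForward (p : Float) (x : Array Float) : Array Float :=
  x.map (fun xi => (1 - p) * xi)
```

Because the function is total and deterministic, it composes into larger spec pipelines and can appear in proofs without any randomness bookkeeping.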
Deterministic masked dropout wrapper (mask is captured as data).
This matches the usual training-time dropout structure with the mask made explicit instead of
sampled. The forward uses the same scaling and epsilon-protection as dropout_masked_spec.
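A sketch of that forward shape, under the same caveats as above (illustrative names and tensor type, not the actual definitions): the mask is applied elementwise and the result is rescaled by 1 / (1 - p), with a small ε in the denominator so the expression stays well-defined as p approaches 1.

```lean
-- Hypothetical sketch of masked dropout with an explicit mask:
-- apply the mask elementwise, then rescale by 1 / (1 - p + ε),
-- where ε guards against division by zero when p ≈ 1.
def dropoutMaskedForward (p ε : Float) (mask x : Array Float) : Array Float :=
  (mask.zip x).map (fun (m, xi) => (m * xi) / (1 - p + ε))
```

Capturing the mask as data means a particular dropout pattern becomes an ordinary argument, so two runs with the same mask are provably equal rather than merely equal in distribution.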