Linear module wrappers #
NN/Spec/Layers/Linear.lean defines the math-level linear layer spec (parameters + gradients).
This file provides NNModuleSpec wrappers so linear layers can be:
- composed in a `SpecChain` with compile-time shape checking, and
- annotated with lightweight export metadata (PyTorch pretty-printing).
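For orientation, the math-level spec is the standard affine map with the usual gradients. Assuming the spec follows the common convention, with upstream gradient $\delta = \partial L / \partial y$:

$$
y = Wx + b, \qquad
\frac{\partial L}{\partial W} = \delta\, x^{\top}, \qquad
\frac{\partial L}{\partial b} = \delta, \qquad
\frac{\partial L}{\partial x} = W^{\top} \delta.
$$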
These wrappers are deliberately thin: the semantics are still given by the underlying `linear_spec`; the extra fields are metadata only.
Most readers can map these directly to PyTorch: `LinearModuleSpec` is `nn.Linear`, and the sequence helpers implement the common pattern "apply a linear layer at each timestep, then optionally select the last step for classification".
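As a rough sketch of that correspondence (all names below are illustrative stand-ins, not the actual definitions in NN/Spec/Layers/Linear.lean): a linear layer is just a weight matrix and a bias, and the wrapper adds nothing to the math.

```lean
-- Illustrative only: hypothetical stand-ins for the real spec types.
structure LinearSketch (inDim outDim : Nat) where
  weight : Fin outDim → Fin inDim → Float  -- W
  bias   : Fin outDim → Float              -- b

-- Forward pass y = W x + b, the same map as PyTorch's nn.Linear.
def LinearSketch.forward {inDim outDim : Nat}
    (l : LinearSketch inDim outDim) (x : Fin inDim → Float) :
    Fin outDim → Float :=
  fun o => Fin.foldl inDim (fun acc i => acc + l.weight o i * x i) (l.bias o)
```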
Wrap the `LinearSpec` forward pass as an `NNModuleSpec`.
Apply a linear layer independently at each timestep (sequence lift).
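The sequence lift is just pointwise application over timesteps. A minimal sketch, assuming sequences are modeled as index functions (the library's actual representation may differ):

```lean
-- Illustrative: apply the same per-timestep map f at every position t.
def seqLift {seqLen inDim outDim : Nat}
    (f : (Fin inDim → Float) → (Fin outDim → Float))
    (seq : Fin seqLen → Fin inDim → Float) :
    Fin seqLen → Fin outDim → Float :=
  fun t => f (seq t)
```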
Extract the last timestep of a sequence (requires `seqLen ≠ 0`).
In PyTorch terms: `seq[-1]` when `seq` has shape `(seqLen, hiddenSize)`.
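The `seqLen ≠ 0` side condition is exactly what makes the index `seqLen - 1` well-typed. A hedged sketch, reusing the index-function encoding of sequences from above (not the real API):

```lean
-- Illustrative: the nonemptiness proof h bounds the index seqLen - 1.
def lastStep {seqLen hiddenSize : Nat} (h : seqLen ≠ 0)
    (seq : Fin seqLen → Fin hiddenSize → Float) :
    Fin hiddenSize → Float :=
  seq ⟨seqLen - 1, Nat.pred_lt h⟩
```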
Sequence classifier: take the last timestep, then apply a linear projection to `numClasses`.
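Composed, the classifier is "last timestep, then project". An illustrative sketch with hypothetical names, continuing the encoding above:

```lean
-- Illustrative: select the final hidden state, then project it to
-- numClasses logits.
def seqClassify {seqLen hidden numClasses : Nat} (h : seqLen ≠ 0)
    (proj : (Fin hidden → Float) → (Fin numClasses → Float))
    (seq : Fin seqLen → Fin hidden → Float) :
    Fin numClasses → Float :=
  proj (seq ⟨seqLen - 1, Nat.pred_lt h⟩)
```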
Example: `Linear → Linear` composition via `SpecChain`.
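The compile-time shape checking the chain provides can be illustrated in plain Lean: composing two maps only elaborates when the middle dimension lines up. This shows the idea only, not the actual `SpecChain` API:

```lean
-- Illustrative: g ∘ f typechecks only if f's output dimension b
-- matches g's input dimension.
def compose2 {a b c : Nat}
    (f : (Fin a → Float) → (Fin b → Float))
    (g : (Fin b → Float) → (Fin c → Float)) :
    (Fin a → Float) → (Fin c → Float) :=
  g ∘ f
```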