TorchLean API

NN.Spec.Module.PositionalEncoding

PositionalEncoding

Module wrappers for spec-layer positional encodings.

This is the simplest learnable variant: add a (seqLen, embedDim) parameter tensor.

PyTorch equivalent: a "learnable positional embedding" that is added to the token embeddings. In practice this is often implemented via nn.Embedding(seqLen, embedDim) indexed with an arange over positions; here we treat the positional tensor itself as the parameter.
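The behaviour described above can be sketched numerically. This is an illustrative NumPy sketch, not the TorchLean API: the names `pe`, `x`, and the shapes are assumptions chosen to match the (seqLen, embedDim) parameter described here, and the broadcast addition stands in for the module's forward pass.

```python
import numpy as np

# Assumed illustrative dimensions (not from the TorchLean source).
seq_len, embed_dim, batch = 4, 8, 2

rng = np.random.default_rng(0)

# The learnable parameter: one positional vector per sequence position.
pe = rng.normal(size=(seq_len, embed_dim))

# Token embeddings for a batch of sequences.
x = rng.normal(size=(batch, seq_len, embed_dim))

# The module's effect: add the positional tensor to every sequence
# in the batch (broadcast over the leading batch dimension).
out = x + pe

print(out.shape)  # (2, 4, 8)
```

Because `pe` has shape (seqLen, embedDim) and `x` has shape (batch, seqLen, embedDim), NumPy-style broadcasting aligns the trailing axes, so the same positional vectors are added to each sequence in the batch.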

def Spec.PositionalEncodingModuleSpec {α : Type} [Context α] {seqLen embedDim : ℕ} (pe : PositionalEncodingSpec seqLen embedDim α) :

Learnable positional encoding wrapper (adds a (seqLen,embedDim) parameter tensor).
