TorchLean API

NN.API.Models.TrainFixed

Fixed-Sample Training Helpers (API)

Many runnable examples in NN/Examples/Models/* follow the same pattern:

  1. build a model with nn.withModel,
  2. wrap it as a ScalarModuleDef (model + supervised loss),
  3. load or synthesize one supervised sample (x, y),
  4. run the requested number of optimizer updates (the steps argument) on that fixed sample, and
  5. either print loss0 -> loss1 or write a TrainLog curve.

This module keeps that loop in one place so examples stay short and consistent.
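As a rough sketch of the loop being factored out (every name below is an illustrative stand-in, not the real TorchLean API; the real helpers delegate the loss and the update to the ScalarModuleDef and Optimizer supplied by the caller):

```lean
-- Hypothetical stand-in types: one supervised sample and flat parameters.
structure Sample where
  x : List Float
  y : List Float

abbrev Params := List Float

-- Stand-in loss and optimizer step, illustrative only.
def lossOf (p : Params) (s : Sample) : Float := 0.0
def trainStep (p : Params) (s : Sample) : Params := p

-- The shared loop: loss before, `steps` updates on one fixed sample, loss after.
def runFixed (steps : Nat) (p : Params) (s : Sample) : Float × Float :=
  let loss0 := lossOf p s
  let final := (List.range steps).foldl (fun acc _ => trainStep acc s) p
  (loss0, lossOf final s)
```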

What this is (and is not):

Before/after scalar losses for a fixed-sample training run.

  • loss0 : α
  • loss1 : α
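A minimal sketch of the record described by the field list above, polymorphic in the scalar type α (the name Losses is illustrative; the real structure lives in NN.API.Models.TrainFixed):

```lean
structure Losses (α : Type) where
  loss0 : α  -- loss on the fixed sample before training
  loss1 : α  -- loss on the same sample after the optimizer updates
```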
Instances For
      @[implicit_reducible]
      def NN.API.Models.TrainFixed.steps {α : Type} [Semantics.Scalar α]
          [DecidableEq Shape] [ToString α] [Runtime.Scalar α]
          [Runtime.Autograd.Torch.Internal.CudaBridge.TensorConv α] {σ τ : Shape}
          (mkModel : nn.M (nn.Sequential σ τ))
          (mkModuleDef : (model : nn.Sequential σ τ) →
            Runtime.Autograd.TorchLean.ScalarModuleDef
              (TorchLean.NN.Seq.paramShapes model) [σ, τ])
          (mkOptim : (Float → α) → (paramShapes : List Shape) →
            Runtime.Autograd.TorchLean.Optimizer α paramShapes)
          (cast : Float → α) (opts : Runtime.Autograd.Torch.Options)
          (sample : sample.Supervised α σ τ) (steps : ℕ) :

      One fixed-sample run for an arbitrary scalar backend.
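A hedged usage sketch for α := Float (so cast can be id). Every binding here (runDemo, myModel, mseDef, sgd, defaultOpts, mySample) is a hypothetical stand-in rather than confirmed API, and the result is assumed to carry the before/after losses described above:

```lean
-- Hypothetical call shape; none of these bindings are real API names.
def runDemo : IO Unit := do
  let r ← NN.API.Models.TrainFixed.steps
    myModel     -- mkModel : built with nn.withModel
    mseDef      -- mkModuleDef : model + supervised loss
    sgd         -- mkOptim : (Float → α) → param shapes → optimizer
    id          -- cast : Float → Float
    defaultOpts -- Runtime.Autograd.Torch.Options
    mySample    -- the one fixed (x, y) sample
    100         -- steps
  IO.println s!"loss0 = {r.loss0} -> loss1 = {r.loss1}"
```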


        Fixed-sample run specialized to Float, returning a full per-step curve.
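For the Float-specialized variant the docstring describes, a call might look like the following sketch; the names stepsFloatCurve and TrainLog.save (and the bindings reused from above) are assumptions, not confirmed API:

```lean
-- Hypothetical: run on a fixed sample and persist the per-step loss curve.
def curveDemo : IO Unit := do
  let log ← NN.API.Models.TrainFixed.stepsFloatCurve  -- assumed name
    myModel mseDef sgd defaultOpts mySample 200
  TrainLog.save "train_curve.log" log                 -- assumed helper
```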
