TorchLean API

NN.Runtime.Autograd.Compiled.Core

Compiled Core #

Proof-compiled execution path.

This module exposes the "approach (a)" workflow:

  1. Build an executable SSA/DAG graph (Proofs.Autograd.Algebra.GraphData).
  2. Compile it to a runtime Tape with compileAuxData.
  3. Run Tape.backwardDenseFrom / Tape.backwardDenseAll.
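The three steps can be sketched end to end as follows. Apart from compile's signature (documented further down this page) and the Tape.backwardDenseAll name, every identifier and arity here is an assumption, not the module's actual API:

```lean
open Runtime.Autograd.Compiled in
-- Hedged sketch: `g` stands in for a graph built in step 1.
def demo (g : GraphData Float Γ ss) (xs : TList Float Γ) :=
  -- Step 2: compile to a runtime tape plus the forward context over Γ ++ ss.
  let (tape, _ctx) := compile g xs
  -- Step 3: dense reverse-mode sweep (exact arity assumed).
  tape.backwardDenseAll
```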


@[reducible, inline]

Executable SSA/DAG graph for the proof-compiled pipeline.

This is Proofs.Autograd.Algebra.GraphData specialized to:

  • Δ := Unit (no extra opaque environment threaded through evaluation), and
  • the Runtime.Autograd.Compiled namespace.
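Concretely, the specialization could be spelled roughly like this; the parameter order of Proofs.Autograd.Algebra.GraphData is an assumption, not taken from the source:

```lean
-- Plausible shape only; `Δ := Unit` per the bullet above.
@[reducible, inline]
abbrev GraphData (α : Type) (Γ ss : List Spec.Shape) :=
  Proofs.Autograd.Algebra.GraphData Unit α Γ ss
```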
@[reducible, inline]

Typed list of tensors whose shapes are tracked in a type-level List Shape.

This is the primary "context" representation for the compiled path: graph evaluation produces a TList α (Γ ++ ss) containing all intermediate values.

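Illustrative only: a shape-indexed heterogeneous list in this style is typically defined as below. Tensor is a stand-in for the runtime tensor type, and the module's real definition may differ:

```lean
-- Hypothetical sketch of a shape-indexed tensor list.
inductive TList (α : Type) : List Spec.Shape → Type where
  | nil  : TList α []
  | cons : Tensor α s → TList α ss → TList α (s :: ss)
```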
def Runtime.Autograd.Compiled.compile {α : Type} [DecidableEq Spec.Shape] {Γ ss : List Spec.Shape} (g : GraphData α Γ ss) (x : TList α Γ) :
    Tape α × TList α (Γ ++ ss)

Compile an executable GraphData into a runtime eager tape.

This is the bridge from the proof-compiled SSA representation to the runtime tape engine: Graph.compileAuxData emits a Runtime.Autograd.Tape whose nodes replay the graph and whose backward closures implement the graph's VJP rules.

PyTorch comparison: conceptually similar to the front half of torch.compile / TorchDynamo (tracing a computation to an IR), except our target is an explicit autograd tape for which we also maintain proof links.

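A minimal wrapper grounded directly in the signature above (the helper name is a placeholder): compiling yields both the tape and the forward context over Γ ++ ss.

```lean
-- Placeholder name; the body is exactly the documented `compile` call.
def forwardAndTape (g : GraphData Float Γ ss) (xs : TList Float Γ) :
    Tape Float × TList Float (Γ ++ ss) :=
  Runtime.Autograd.Compiled.compile g xs
```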

Convention for the output node id in a compiled tape.

compile places the original inputs first (Γ.length nodes), then appends one node per element of ss. The final output is therefore at index Γ.length + ss.length - 1.

Invariant: callers should only use this when ss is nonempty. For defensive code, prefer checkedOutId, which reports an error instead of relying on Nat's saturating subtraction.

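Both the index convention and the saturation hazard it warns about can be checked by hand:

```lean
-- Γ.length = 2, ss.length = 3 ⇒ output id 4 (inputs occupy ids 0 and 1).
example : 2 + 3 - 1 = 4 := rfl
-- With ss = [], Nat subtraction saturates and the formula silently
-- yields 0 — a bogus node id — which is why the nonemptiness invariant matters.
example : (0 + 0 - 1 : Nat) = 0 := rfl
```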

Checked version of outId.

Compiled scalar/output programs should always have at least one produced node. This helper makes that precondition executable, which is friendlier for user-facing compiled APIs and tests.

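One plausible shape for the checked variant; this signature (returning Except) is an assumption, not taken from the module:

```lean
-- Hypothetical sketch: make the "ss is nonempty" precondition executable.
def checkedOutIdSketch (Γ ss : List Spec.Shape) : Except String Nat :=
  if ss.isEmpty then
    .error "compiled program produced no output nodes"
  else
    .ok (Γ.length + ss.length - 1)
```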

Run reverse-mode backprop on a compiled tape, returning gradients for all node ids.

This uses the "total/dense" variant Tape.backwardDenseAll, seeding the output gradient with 1 (the usual scalar-loss convention).

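Hedged sketch (the exact arity of Tape.backwardDenseAll is assumed): the dense-all variant is the one-liner for the common scalar-loss case, since no seed argument is needed — the output gradient is fixed to 1.

```lean
-- Placeholder name; compile, drop the forward context, sweep.
def scalarGrads (g : GraphData Float Γ ss) (xs : TList Float Γ) :=
  (Runtime.Autograd.Compiled.compile g xs).1.backwardDenseAll
```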

Run reverse-mode backprop starting from an explicit seed gradient context.

This is the most general entry point: callers provide a TList of initial gradients for every value in the compiled context (Γ ++ ss), and we run the proof-friendly dense loop Tape.backwardDenseFrom.

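Hedged sketch (argument order is assumed): a caller-supplied seed context generalizes the scalar case — seeding zero everywhere except a 1 at the output recovers the convention used by the dense-all variant.

```lean
-- Placeholder name; only `Tape.backwardDenseFrom` and the seed type
-- `TList α (Γ ++ ss)` come from this page.
def gradsFrom (tape : Tape Float) (seeds : TList Float (Γ ++ ss)) :=
  tape.backwardDenseFrom seeds
```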