Quickstart: Autograd Basics
Tour of the public autograd APIs beyond .backward():
- API.autograd.model.* for model-level VJP / Jacobian / loss gradients.
- API.autograd.model.OutputLoss.* for reusable scalar losses on model outputs.
- API.autograd.fn1.* for Jacobian / Hessian helpers on single-input tensor functions.
- API.nn.functional.detach for stop-gradient behavior.
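A minimal sketch of how these facades might be combined in one function. The signatures below (`vjp`, `jacobian`, and the `Model`/`Tensor` types) are assumptions for illustration, not taken from this file:

```lean
-- Hypothetical sketch; the exact facade signatures are assumed, not verified.
def autogradTour (model : Model Float) (x v : Tensor Float) : IO Unit := do
  -- model-level VJP: pull back the cotangent `v` through the model at `x`
  let pullback := API.autograd.model.vjp model x v
  -- Jacobian of a single-input tensor function
  let jac := API.autograd.fn1.jacobian (fun t => t * t) x
  -- stop-gradient: `xDet` participates in the forward pass but not the backward pass
  let xDet := API.nn.functional.detach x
  IO.println s!"{pullback} {jac} {xDet}"
```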
Run:
lake exe torchlean quickstart_autograd --dtype float --backend eager
lake exe torchlean quickstart_autograd --dtype float32 --backend compiled
This file is a curated "autodiff API tour". It intentionally avoids:
- low-level runtime tape/session code,
- hand-written parameter-shape bookkeeping,
- noisy castTensor helpers.
Instead, it uses the public API.autograd.* facade and tensorF! cast ... to build deterministic
constants for any runtime scalar α.
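As a sketch of the pattern, a deterministic constant can be built from a Float literal through the supplied `cast` function; the concrete literal and helper name here are illustrative assumptions, only the `tensorF! cast ...` pattern comes from the text above:

```lean
-- Hypothetical sketch: a scalar constant that works for any runtime scalar α.
def halfConst {α : Type} [API.Semantics.Scalar α] (cast : Float → α) :=
  tensorF! cast 0.5  -- the literal 0.5 is an example value, not from the source
```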
```lean
def NN.Examples.Quickstart.AutogradBasics.runOnce
    {α : Type} [API.Semantics.Scalar α] [DecidableEq Spec.Shape] [ToString α]
    (cast : Float → α) :
```