# Torch-style runtime front-end
This is the public umbrella for the low-level PyTorch-style runtime layer.
The split is intentional:
- `Torch.Core` defines imperative tensor references, parameters, eager sessions, operation wrappers, compiled scalar/output wrappers, and simple scalar trainers.
- `Torch.LinkedSession` records the same style of imperative computation into the proved `GraphData` IR and exposes the theorem connecting compiled runtime backprop to proved graph backprop.
- `Torch.Utils` contains small demo/training conveniences such as deterministic initializers, small sample builders, and trainer loops.
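To make the split concrete, here is a minimal sketch of how the two session styles might be used side by side. All names below (`Core.Session`, `Core.TensorRef`, `LinkedSession.mul`, etc.) are illustrative assumptions, not the actual API; the point is only the shape of the layering, where the same imperative call either executes eagerly or is recorded into the `GraphData` IR.

```lean
open Torch

-- Hypothetical eager step via Torch.Core: the op wrapper runs immediately
-- and returns the updated session plus a reference to the result tensor.
def eagerStep (s : Core.Session) (w x : Core.TensorRef) :
    Core.Session × Core.TensorRef :=
  Core.mul s w x

-- Hypothetical recorded step via Torch.LinkedSession: the same call shape,
-- but the multiplication is appended to the proved GraphData IR instead of
-- being executed, so backprop over the recording can be related to the
-- proved graph backprop by the theorem this module exposes.
def recordedStep (s : LinkedSession) (w x : LinkedSession.Ref) :
    LinkedSession × LinkedSession.Ref :=
  LinkedSession.mul s w x
```

Under this reading, switching a demo from eager execution to verified training is a matter of swapping the session type, not rewriting the model code.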
`TorchLean/*` builds the higher-level model/program API on top of this layer: `Torch` is the
low-level session/ref bridge, and `TorchLean` is the user-facing model stack.