Quickstart: Proving Small TorchLean Facts
TorchLean examples are not only executable scripts. Many guarantees are ordinary Lean theorems: shape round-trips, typed tensor construction, activation identities, and later full verification statements.
This file is intentionally small. It shows the two proof styles a new user should recognize first:
- compile-time guarantees from shape-indexed tensor types, and
- ordinary mathematical lemmas about the public API.
The deeper proof libraries live under NN.Proofs.*, NN.Verification.*, and NN.MLTheory.*.
A tensor's shape is part of its type.
When a typed tensor definition compiles, Lean has already checked that the literal has exactly two entries and therefore really is a Vec 2. The commented shape mismatch below is the kind of bug Lean catches before runtime:
-- def badVector : Tensor Float (shape![3]) := tensor! [1.0, 2.0]
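The guarantee behind this can be sketched with a minimal self-contained model. The `Vec` structure, its array-backed representation, and the `rfl` proofs below are illustrative assumptions, not TorchLean's actual implementation of `Tensor`, `shape!`, or `tensor!`:

```lean
-- Toy model: a vector whose length is carried in its type.
-- TorchLean's real shape-indexed tensors are richer than this sketch.
structure Vec (n : Nat) where
  data    : Array Float
  size_eq : data.size = n

-- Compiles: the obligation `#[1.0, 2.0].size = 2` closes by `rfl`.
def goodVector : Vec 2 := ⟨#[1.0, 2.0], rfl⟩

-- Rejected at compile time: `#[1.0, 2.0].size = 3` is false, so no
-- proof term exists and elaboration fails before anything runs.
-- def badVector : Vec 3 := ⟨#[1.0, 2.0], rfl⟩
```

The point is that the "test" for the shape bug is the type checker itself; there is no runtime assertion to forget.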
Runtime dimension lists can still be related back to static TorchLean shapes.
This is the small theorem behind many JSON/CLI/data-loader paths: parse dimensions dynamically,
then recover the precise Shape used by the typed tensor API.
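A minimal sketch of that dynamic-to-static round-trip, assuming a hypothetical `Shape` modeled as a bare `List Nat`; TorchLean's real `Shape` type and its conversion functions may well differ:

```lean
-- Toy model: a shape is just its dimension list, so conversion in
-- both directions is definitional and the round-trip is `rfl`.
abbrev Shape := List Nat

def Shape.dims (s : Shape) : List Nat := s

def Shape.ofDims (ds : List Nat) : Shape := ds

-- The lemma shape of the JSON/CLI/data-loader paths: parsing runtime
-- dimensions and rebuilding the static Shape loses no information.
theorem Shape.ofDims_dims (s : Shape) : Shape.ofDims s.dims = s := rfl
```

In the real library the two types are distinct, so the theorem is not `rfl`, but its statement has exactly this form.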
The host-side public ReLU acts as the identity on nonnegative real inputs.
This is deliberately a small theorem, but it has the same form as larger library facts: state the semantic contract once, prove it in Lean, and use it downstream without trusting comments or tests.
ReLU clamps nonpositive real inputs to zero.
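Both ReLU facts can be stated and proved in a few lines, assuming the standard `max x 0` definition over Mathlib's reals; TorchLean's host-side `relu` may be implemented differently, but the contract has the same shape:

```lean
import Mathlib

/-- Sketch definition, assuming ReLU is `max x 0`; not necessarily
TorchLean's actual host-side implementation. -/
noncomputable def relu (x : ℝ) : ℝ := max x 0

/-- On nonnegative inputs, ReLU is the identity. -/
theorem relu_of_nonneg {x : ℝ} (h : 0 ≤ x) : relu x = x :=
  max_eq_left h

/-- Nonpositive inputs are clamped to zero. -/
theorem relu_of_nonpos {x : ℝ} (h : x ≤ 0) : relu x = 0 :=
  max_eq_right h
```

Downstream code can then rewrite with `relu_of_nonneg` or `relu_of_nonpos` instead of trusting a comment or a unit test.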