NF Runtime Approximation Backend #
Proof-relevant rounded tensor/operator approximation for NF.
NF wraps real values and inserts an explicit `neural_round` step after primitive arithmetic. The
modules collected here prove local bounds for elementwise ops, reductions, shape-only ops, linear
algebra, Conv2D forward/backward, and graph-level end-to-end execution.
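As an illustration of the rounding model (a hedged sketch with hypothetical names such as `RoundSpec`, `NeuralRound`, and `raddAdd` — the actual NF definitions may differ), a rounded primitive op pairs an exact computation with a proved local error bound:

```lean
import Mathlib

/-- Hypothetical: the unit roundoff `ε` of the target number format. -/
structure RoundSpec where
  ε : ℝ
  ε_nonneg : 0 ≤ ε

/-- Hypothetical: a rounding map together with its relative-error bound,
    standing in for NF's `neural_round`. -/
structure NeuralRound (spec : RoundSpec) where
  round : ℝ → ℝ
  bound : ∀ x : ℝ, |round x - x| ≤ spec.ε * |x|

/-- Rounded addition: compute exactly, then apply the rounding step. -/
def raddAdd (spec : RoundSpec) (r : NeuralRound spec) (a b : ℝ) : ℝ :=
  r.round (a + b)

/-- The local forward bound follows directly from the rounding spec. -/
theorem raddAdd_err (spec : RoundSpec) (r : NeuralRound spec) (a b : ℝ) :
    |raddAdd spec r a b - (a + b)| ≤ spec.ε * |a + b| :=
  r.bound (a + b)
```

The point of the pattern is that each primitive carries its own bound, so graph-level theorems never need to reopen the arithmetic.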
File roles:

- `Ops`: scalar and elementwise tensor bounds, plus primitive `FwdNode` constructors.
- `Linalg`: matrix/vector and matrix/matrix multiplication bounds.
- `ReductionOps`: row/column reductions used by normalization and attention.
- `ShapeOps`: value-preserving tensor rearrangements such as replication/broadcasting.
- `BackwardOps`: VJP bounds and `RevNode` constructors for reverse-mode composition.
- `Conv`, `ConvForward`, `ConvBackward`: Conv2D shared error algebra plus forward/backward bounds.
- `EndToEnd`: bridge from proof-level `RevGraph` theorems to executable `GraphData`.
- `SoftmaxAxis`: documentation separating scalar softmax/logistic bounds from vector-valued axis softmax bounds.
- `Utils`: shared list-fold and tensor approximation helpers.
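To give a flavor of how local bounds like the ones above compose into graph-level statements (a hedged sketch; `comp_err_bound` and its hypotheses are hypothetical and only mirror the style of composition used for end-to-end theorems, not NF's actual lemmas), two per-node error bounds chain through the triangle inequality and a Lipschitz assumption on the downstream op:

```lean
import Mathlib

/-- Hypothetical composition lemma: if `f'` approximates `f` within `δf`,
    `g'` approximates `g` within `δg`, and `g` is `L`-Lipschitz, then the
    composed approximation error is at most `δg + L * δf`. -/
theorem comp_err_bound {f g f' g' : ℝ → ℝ} {δf δg L : ℝ}
    (hL : 0 ≤ L)
    (hf : ∀ x, |f' x - f x| ≤ δf)
    (hg : ∀ y, |g' y - g y| ≤ δg)
    (hLip : ∀ y z, |g y - g z| ≤ L * |y - z|) :
    ∀ x, |g' (f' x) - g (f x)| ≤ δg + L * δf := by
  intro x
  -- Split at the intermediate point `g (f' x)`.
  have step : |g' (f' x) - g (f x)|
      ≤ |g' (f' x) - g (f' x)| + |g (f' x) - g (f x)| :=
    abs_sub_le _ _ _
  have h1 : |g' (f' x) - g (f' x)| ≤ δg := hg (f' x)
  have h2 : |g (f' x) - g (f x)| ≤ L * |f' x - f x| := hLip (f' x) (f x)
  have h3 : L * |f' x - f x| ≤ L * δf :=
    mul_le_mul_of_nonneg_left (hf x) hL
  linarith
```

Chaining this lemma along a graph's topological order is the basic shape of an end-to-end bound: each node contributes its local rounding error, amplified by the Lipschitz constants of everything downstream.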
This is the backend we can reason about inside Lean. Actual hardware execution (CUDA kernels, IEEE-754 floating point) remains an implementation trust boundary unless it is connected to this model by a separately proved or certified semantics.