TorchLean API

NN.MLTheory.Proofs.Approximation.Universal.UniversalApproximationIEEE32ExecMlp2

IEEE32Exec two-layer ReLU approximation bound #

This file proves the reusable three-term error decomposition for executing a single-hidden-layer ReLU MLP under concrete IEEE binary32 semantics.

The theorem separates three mathematically distinct sources of error: the approximation error of an ideal real-valued MLP against the target function (εApprox), the quantization error from replacing the real parameters by their IEEE binary32 counterparts (εQ), and the arithmetic rounding error incurred by executing under IEEE semantics (εR).

This is the finite-dimensional analogue of the hinge-network executable bound in UniversalApproximationIEEE32Exec. The decomposition follows the standard numerical-analysis pattern for floating-point algorithms: prove the real algorithm correct, bound data/parameter rounding, and bound arithmetic rounding separately. For background, see IEEE Std 754-2019, Goldberg (1991), Higham (2002), and the ReLU density literature of Cybenko, Hornik, Leshno, and Pinkus.
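Schematically, writing F for the ideal real MLP, F_q for the real interpretation of the quantized (binary32) parameters, and F̂ for the IEEE execution composed with toReal, the pattern is a single triangle inequality (these symbols are illustrative shorthand, not the file's identifiers):

```latex
|\hat F(x_I) - f(x_R)|
  \;\le\; \underbrace{|F(x_R) - f(x_R)|}_{\le\, \varepsilon_{\mathrm{Approx}}}
  \;+\; \underbrace{|F_q(x_R) - F(x_R)|}_{\le\, \varepsilon_{Q}}
  \;+\; \underbrace{|\hat F(x_I) - F_q(x_R)|}_{\le\, \varepsilon_{R}}
```

Each term is bounded by exactly one of the three hypotheses, which is what lets the real-algorithm proof, the quantization bound, and the rounding bound be developed independently.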

Three-term bound #

Read this as:

Given an IEEE32Exec input xI, let xR be its real interpretation (toReal elementwise). If:

  1. the target f is approximated by a real 2-layer ReLU MLP (error ≤ εApprox),
  2. the real MLP is close to the real interpretation of the IEEE parameters (error ≤ εQ),
  3. executing the IEEE MLP and then mapping to reals is close to the real interpretation of those IEEE parameters (error ≤ εR),

then the IEEE32Exec execution at xI approximates f at xR within εApprox + εQ + εR.
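The way the three hypotheses combine is pure triangle-inequality bookkeeping over ℝ. A minimal Lean sketch of that combination step (the names `three_term_bound`, `a`, `b`, `c`, `d` are illustrative, not this file's API; here `a` plays the role of f xR, `b` the ideal real MLP output, `c` the real interpretation of the quantized MLP, and `d` the toReal of the IEEE execution result):

```lean
import Mathlib

/-- Combining the three error hypotheses: a triangle inequality over ℝ.
`hA`, `hQ`, `hR` correspond to the approximation, quantization, and
rounding bounds respectively. -/
theorem three_term_bound {a b c d εApprox εQ εR : ℝ}
    (hA : |b - a| ≤ εApprox) (hQ : |c - b| ≤ εQ) (hR : |d - c| ≤ εR) :
    |d - a| ≤ εApprox + εQ + εR := by
  calc |d - a| = |(b - a) + (c - b) + (d - c)| := by congr 1; ring
    _ ≤ |b - a| + |c - b| + |d - c| := abs_add_three _ _ _
    _ ≤ εApprox + εQ + εR := by gcongr
```

The full theorem in this file is the same shape, with the abstract reals replaced by the MLP evaluations at a given input.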