MLP robustness: basic analytic lemmas #
This file collects small real-analysis facts used by robustness statements for spec-level MLPs.
Contents:
- ReLU (written as max 0 x) is 1-Lipschitz on ℝ.
- Lipschitz bounds compose under function composition.
- A few tensor-norm helper lemmas that connect to the more general library in NN.Proofs.Analysis.Lipschitz.
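The first two facts can be sketched in Lean 4 against Mathlib. The lemma names used below (`LipschitzWith.of_dist_le_mul`, `abs_max_sub_max_le_abs`, `LipschitzWith.comp`) are Mathlib's; the statements in this file itself may be phrased differently.

```lean
import Mathlib

open scoped NNReal

-- ReLU, written as `max 0 x`, is 1-Lipschitz on ℝ.
-- Sketch assuming Mathlib's `abs_max_sub_max_le_abs : |max a c - max b c| ≤ |a - b|`.
example : LipschitzWith 1 (fun x : ℝ => max 0 x) :=
  LipschitzWith.of_dist_le_mul fun x y => by
    simpa [Real.dist_eq, max_comm] using abs_max_sub_max_le_abs x y 0

-- Lipschitz constants multiply under composition (`LipschitzWith.comp`).
example (f g : ℝ → ℝ) (K L : ℝ≥0)
    (hf : LipschitzWith K f) (hg : LipschitzWith L g) :
    LipschitzWith (K * L) (f ∘ g) :=
  hf.comp hg
```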
Basic Lipschitz lemmas #
Tensor helpers #
Definition: A non-zero tensor has at least one non-zero entry
A non-zero tensor has positive L2 norm
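The abstract content of this lemma is Mathlib's `norm_pos_iff`. As a hedged sketch, here it is stated over `EuclideanSpace ℝ (Fin n)`, used purely as a stand-in for this repository's tensor type:

```lean
import Mathlib

-- Sketch: a non-zero vector has positive (L2) norm.
-- `EuclideanSpace ℝ (Fin n)` is an assumed stand-in for the repo's tensor type.
example (n : ℕ) (v : EuclideanSpace ℝ (Fin n)) (hv : v ≠ 0) : 0 < ‖v‖ :=
  norm_pos_iff.mpr hv
```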
Get the operator norm of the weight matrix in a linear layer
Theorem: Linear layers are Lipschitz continuous with constant matrix_op_norm.
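For the purely linear part, the Mathlib counterpart of this theorem is `ContinuousLinearMap.lipschitz`: a continuous linear map is Lipschitz with constant its operator norm. A hedged sketch (the concrete matrix_op_norm statement in this file may be phrased over matrices rather than `→L[ℝ]` maps):

```lean
import Mathlib

-- Sketch: a continuous linear map `A` is Lipschitz with constant `‖A‖₊`,
-- i.e. its operator norm. `A.lipschitz` is the Mathlib lemma.
example (n m : ℕ)
    (A : EuclideanSpace ℝ (Fin n) →L[ℝ] EuclideanSpace ℝ (Fin m)) :
    LipschitzWith ‖A‖₊ A :=
  A.lipschitz
```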
Repackage mlp_lipschitz_complete_analysis as a robustness-spec is_lipschitz_continuous fact.
This is the form expected by the certified-robustness lemmas in
NN.MLTheory.Proofs.Verification.Robustness.LipschitzCertified.
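The downstream certified-robustness pattern this repackaging feeds is the standard one: a K-Lipschitz map moves points within an ε-ball by at most K · ε. A hedged Lean sketch using Mathlib's `LipschitzWith.dist_le_mul` (the repository's `is_lipschitz_continuous` predicate may be stated differently):

```lean
import Mathlib

open scoped NNReal

-- Sketch: if `f` is `K`-Lipschitz and `dist x y ≤ ε`,
-- then outputs differ by at most `K * ε`.
example (f : ℝ → ℝ) (K : ℝ≥0) (hf : LipschitzWith K f)
    (x y : ℝ) (ε : ℝ) (h : dist x y ≤ ε) :
    dist (f x) (f y) ≤ K * ε :=
  (hf.dist_le_mul x y).trans (mul_le_mul_of_nonneg_left h K.coe_nonneg)
```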
Robustness statements for MLPs (ML theory layer) #
This file collects robustness definitions and theorems specialized to multi-layer perceptrons, used as a bridge between learning-theory specifications and concrete model classes.