α-CROWN transfer step (graph dialect) #
This file defines a pure, per-node transfer rule for affine bound propagation in the
NN.MLTheory.CROWN.Graph dialect, extended with an α-parameter for the ReLU lower relaxation
(α-CROWN).
The step function is shared by:
- the certificate checker (recompute each node from its parents and compare to a claimed bound), and
- soundness theorems of the form: "if the checker accepts, then the claimed enclosure holds".
This module does not implement the outer dual-parameter optimization loop used by α/β-CROWN; it only defines the local transfer rule for a fixed set of α-parameters.
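The intended checker/soundness contract can be pictured with the following Lean-style statement shape. Every name here is an opaque placeholder standing in for the repo's actual definitions, not its API:

```lean
-- Illustrative shape only: `Graph`, `Cert`, `checkCert`, and `Encloses`
-- are hypothetical placeholders, not the module's actual declarations.
axiom Graph : Type
axiom Cert : Type
-- Executable checker: recompute each node from its parents and compare
-- against the claimed bound.
axiom checkCert : Graph → Cert → Bool
-- Semantic claim: the certified affine bounds enclose the network outputs.
axiom Encloses : Graph → Cert → Prop

/-- Soundness shape: if the checker accepts, the claimed enclosure holds. -/
axiom checker_sound :
    ∀ (g : Graph) (c : Cert), checkCert g c = true → Encloses g c
```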
References #
- CROWN: Zhang et al., Efficient Neural Network Robustness Certification with General Activation Functions, NeurIPS 2018. (arXiv:1811.00866)
- β-CROWN (and the α/β-CROWN toolchain): Wang et al., Beta-CROWN: Efficient Bound Propagation with Per-neuron Split Constraints for Neural Network Robustness Verification, NeurIPS 2021. (arXiv:2103.06624)
Small affine helpers (non-private) #
Affine bounds in the graph dialect #
A FlatAffineBounds α stores two affine maps (lower/upper) of the flattened input vector:
$$\ell(x) = A_\ell\, x + c_\ell, \qquad u(x) = A_u\, x + c_u.$$
These are propagated through the graph using local transfer rules.
The certificate checker recomputes these affine maps node-by-node, so the functions below are
written in an executable style (using the repo’s Tensor operations), while still being usable
in theorem statements over ℝ.
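A minimal sketch of such a structure, assuming Mathlib-style matrices; the structure name, field names, and the soundness predicate are illustrative, not the repo's actual definitions:

```lean
import Mathlib

/-- Illustrative sketch: affine lower/upper bounds on an `m`-dimensional node
output as functions of the flattened `n`-dimensional input. -/
structure FlatAffineBoundsSketch (α : Type*) (m n : ℕ) where
  Al : Matrix (Fin m) (Fin n) α  -- lower coefficients `A_ℓ`
  cl : Fin m → α                 -- lower offset `c_ℓ`
  Au : Matrix (Fin m) (Fin n) α  -- upper coefficients `A_u`
  cu : Fin m → α                 -- upper offset `c_u`

/-- The enclosure such bounds certify for a node function `f`. -/
def FlatAffineBoundsSketch.Sound {m n : ℕ} (b : FlatAffineBoundsSketch ℝ m n)
    (f : (Fin n → ℝ) → Fin m → ℝ) : Prop :=
  ∀ x i, (b.Al.mulVec x + b.cl) i ≤ f x i ∧ f x i ≤ (b.Au.mulVec x + b.cu) i
```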
Identity affine bounds: both lower and upper are `aff_identity`.
Constant affine bounds for a node output.
Both maps have A = 0; the offsets are the provided endpoint vectors.
α-ReLU lower relaxation #
α-CROWN lower relaxation for ReLU #
For a pre-activation scalar $z \in [l, u]$, CROWN/DeepPoly uses:
- an upper linear envelope (the usual triangular relaxation), and
- a lower linear envelope.
In the unstable crossing case ($l < 0 < u$), the lower relaxation can be parameterized by $\alpha \in [0, 1]$ to interpolate between the sound choices $y \ge 0$ and $y \ge z$.
We encode this by using `alphaRelaxLowerScalar` for the lower bound and `Runtime.Ops.ReLU.relax_scalar` / `relax_vector` for the upper bound.
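Concretely, in the crossing case $l < 0 < u$ the two envelopes are:

$$y \;\le\; \frac{u}{u - l}\,(z - l), \qquad y \;\ge\; \alpha z \quad (\alpha \in [0, 1]).$$

The lower line is sound for every $\alpha \in [0, 1]$: for $z \le 0$ we have $\mathrm{ReLU}(z) = 0 \ge \alpha z$, and for $z \ge 0$ we have $\mathrm{ReLU}(z) = z \ge \alpha z$; the endpoint $\alpha = 0$ recovers $y \ge 0$ and $\alpha = 1$ recovers $y \ge z$.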
Vectorized α-CROWN lower relaxation for ReLU, applied componentwise.
Node step function #
Safe lookup of the optional α vector at node id pid.
Default α vector used when the certificate omits α values.
This matches TorchLean's default lower relaxation: pick slope 1 when u > -l, otherwise 0.
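A minimal sketch of that rule (the definition name is illustrative; the repo's actual helper may differ):

```lean
/-- Illustrative default α: slope 1 when `u > -l`, otherwise 0.  In the
crossing case this picks, among the two extreme sound lower lines `y ≥ z`
(gap area `l²/2` below ReLU) and `y ≥ 0` (gap area `u²/2`), the one with
the smaller gap over `[l, u]`. -/
def defaultAlphaSketch (l u : ℚ) : ℚ :=
  if u > -l then 1 else 0
```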
Transfer rule for affine bounds through a linear layer y = W x + b (sign-splitting).
The parent node provides affine lower/upper bounds in terms of the global input; we compose with
W using the standard W = W⁺ + W⁻ decomposition (as in IBP/CROWN).
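Writing the parent bounds as $A_\ell x + c_\ell \le h(x) \le A_u x + c_u$ and splitting $W = W^+ + W^-$ with $W^+ = \max(W, 0)$ and $W^- = \min(W, 0)$ entrywise, the composed bounds for $y = W\,h(x) + b$ are:

$$A'_\ell = W^+ A_\ell + W^- A_u, \qquad c'_\ell = W^+ c_\ell + W^- c_u + b,$$
$$A'_u = W^+ A_u + W^- A_\ell, \qquad c'_u = W^+ c_u + W^- c_\ell + b.$$

Each output row uses the parent's lower bound where its weight is positive and the upper bound where it is negative, which is exactly the standard IBP/CROWN composition.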
One-node α-CROWN step function for a supported subset of IR ops.
This is a safe (Option-returning) step: it returns none when required parent bounds or
parameters are missing, or when dimensions mismatch.
It is intended to be used for:
- executable per-node certificate checking (recompute node `id` from certificate parents), and
- proof-level soundness theorems about the checker.
Supported node kinds #
This step function handles the verifier core of the IR:
- `.input`, `.const`, `.detach`
- `.linear`, `.matmul` (ParamStore-driven linear operators in the verifier dialect)
- `.relu` (CROWN upper + α-CROWN lower)
- `.sum` (treated as a $1 \times n$ linear layer)
- `.reshape`, `.flatten` (shape-only, guarded by dimensional consistency)
All other node kinds fall back to a conservative constant affine enclosure derived from the IBP box at the same node id (if present). This keeps the checker total over graphs that contain operators outside this affine-transfer subset; end-to-end theorems then account for those nodes through the soundness assumptions attached to their IBP boxes.
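A self-contained sketch of that fallback, assuming Mathlib's `Matrix` and an IBP box `[lo, hi]` available at the node (the definition name is illustrative):

```lean
import Mathlib

/-- Illustrative fallback: from an IBP box `lo ≤ y ≤ hi` at a node, build the
constant affine enclosure `((A_ℓ, c_ℓ), (A_u, c_u))` with `A_ℓ = A_u = 0`,
which is sound for every input whenever the box itself is. -/
def constFallbackSketch {α : Type*} [Zero α] (m n : ℕ) (lo hi : Fin m → α) :
    (Matrix (Fin m) (Fin n) α × (Fin m → α)) ×
    (Matrix (Fin m) (Fin n) α × (Fin m → α)) :=
  ((0, lo), (0, hi))  -- zero coefficient matrices; offsets are the endpoints
```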