Supplemental Activation Operators #
This file defines additional activation functions and transfer rules intended for CROWN/IBP-style bound propagation:
- Leaky ReLU: f(x) = max(αx, x) (typically 0 < α < 1)
- ELU: f(x) = x if x > 0, else α(exp(x) - 1)
- Softplus: f(x) = log(1 + exp(x))
- Mish: f(x) = x * tanh(softplus(x))
- SiLU/Swish: f(x) = x * sigmoid(x)
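As a rough, non-normative illustration, these definitions can be written executably over `Float` (the file itself works over its bound-propagation numeric type and routes exp/log/tanh through the approximation helpers documented below; `Float.exp` and friends stand in for those here, and all names are illustrative):

```lean
-- Illustrative-only Float versions of the five activations.
def leakyRelu (α x : Float) : Float := max (α * x) x

def elu (α x : Float) : Float :=
  if x > 0 then x else α * (Float.exp x - 1)

def softplus (x : Float) : Float := Float.log (1 + Float.exp x)

def sigmoid (x : Float) : Float := 1 / (1 + Float.exp (-x))

def silu (x : Float) : Float := x * sigmoid x

def mish (x : Float) : Float := x * Float.tanh (softplus x)
```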
Soundness status #
A few helpers in this file use low-order Taylor approximations (expApprox, logApprox,
tanhApprox, sigmoidApprox). These are executable approximations, not theorem-backed
enclosures of the true transcendental functions. Bounds derived from them should only be used under
an explicit checker/oracle assumption, not as part of the unconditional CROWN soundness core.
If you need enclosure-sound bounds for transcendentals, prefer validated enclosure backends (e.g.
Arb-backed intervals in NN/Floats) or restrict to the operator subset covered by the main
soundness theorems.
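To make the caveat concrete, here is a hypothetical sketch of the oracle pattern over ℝ (requires Mathlib; `EnclosesExp` and `exp_le_of_approx_le` are illustrative names, not declarations in this file). A bound computed from an approximation transfers to the true function only under the explicit enclosure hypothesis, which an external checker or validated backend is expected to discharge:

```lean
import Mathlib

/-- Hypothetical oracle predicate: `f` tracks `Real.exp` to within `ε`
on `[lo, hi]`. Nothing here proves it for any concrete `f`. -/
def EnclosesExp (f : ℝ → ℝ) (lo hi ε : ℝ) : Prop :=
  ∀ x ∈ Set.Icc lo hi, |f x - Real.exp x| ≤ ε

/-- A bound on the approximation yields a padded bound on the true
function, but only conditionally on the oracle hypothesis `henc`. -/
theorem exp_le_of_approx_le {f : ℝ → ℝ} {lo hi ε b x : ℝ}
    (hx : x ∈ Set.Icc lo hi) (henc : EnclosesExp f lo hi ε)
    (hb : f x ≤ b) : Real.exp x ≤ b + ε := by
  have h := abs_le.mp (henc x hx)
  linarith [h.1]
```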
References #
Activations:
- ELU: Clevert, Unterthiner, Hochreiter, "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)", ICLR 2016: https://arxiv.org/abs/1511.07289
- Swish/SiLU: Ramachandran, Zoph, Le, "Searching for Activation Functions", 2017: https://arxiv.org/abs/1710.05941
- Mish: Misra, "Mish: A Self Regularized Non-Monotonic Activation Function", 2019: https://arxiv.org/abs/1908.08681
Bound propagation context:
- CROWN: Zhang et al., "Efficient Neural Network Robustness Certification with General Activation Functions", NeurIPS 2018: https://arxiv.org/abs/1811.00866
- auto_LiRPA: Xu et al., "Automatic Perturbation Analysis for Scalable Certified Robustness and Beyond", NeurIPS 2020: https://arxiv.org/abs/2002.12920
Approximation Helpers #
Low-order Taylor approximation of exp. This is executable, not an enclosure theorem.
Low-order approximation of log(1 + y) around y = 0; not an enclosure theorem.
Executable sigmoid approximation computed via expApprox.
Low-order approximation of tanh; not an enclosure theorem.
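A minimal sketch of what such helpers can look like, written over `Float` for brevity (the file's versions run over an exact numeric type, and the truncation degrees and routing choices below are illustrative guesses):

```lean
-- Degree-4 Taylor polynomial of exp around 0: executable, but it
-- claims no enclosure of the true exponential.
def expApprox (x : Float) : Float :=
  1 + x + x*x/2 + x*x*x/6 + x*x*x*x/24

-- Truncated alternating series for log(1 + y) around y = 0.
def logApprox (y : Float) : Float :=
  y - y*y/2 + y*y*y/3 - y*y*y*y/4

-- Sigmoid routed through the exp approximation: 1 / (1 + exp(-x)).
def sigmoidApprox (x : Float) : Float :=
  1 / (1 + expApprox (-x))

-- tanh via the identity tanh(x) = 2·sigmoid(2x) - 1 (one possible route).
def tanhApprox (x : Float) : Float :=
  2 * sigmoidApprox (2 * x) - 1

#eval expApprox 0.5  -- 1.6484375, vs. Float.exp 0.5 ≈ 1.6487
```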
Leaky ReLU #
Leaky ReLU: f(x) = x if x ≥ 0, αx if x < 0.
IBP for Leaky ReLU on scalars.
IBP for Leaky ReLU on boxes.
Derivative bounds for Leaky ReLU (used in Lyapunov-style analysis).
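Since Leaky ReLU is monotone increasing whenever 0 < α ≤ 1, its interval rule is endpoint evaluation, the box rule maps that over coordinates, and its derivative only takes the values α and 1. A hedged sketch reusing `leakyRelu` from the overview above (names illustrative):

```lean
-- Scalar IBP: monotone for 0 < α ≤ 1, so the image of [lo, hi]
-- is exactly [f(lo), f(hi)].
def leakyReluIBP (α lo hi : Float) : Float × Float :=
  (leakyRelu α lo, leakyRelu α hi)

-- Box IBP: a box as a list of coordinate intervals; apply the
-- scalar rule in each coordinate.
def leakyReluIBPBox (α : Float) (box : List (Float × Float)) :
    List (Float × Float) :=
  box.map fun lu => leakyReluIBP α lu.1 lu.2

-- Coarse global derivative bounds: f' is α on x < 0 and 1 on x > 0,
-- so it always lies in [min α 1, max α 1].
def leakyReluDerivBounds (α : Float) : Float × Float :=
  (min α 1, max α 1)
```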
ELU (Exponential Linear Unit) #
ELU: f(x) = x if x > 0, α(exp(x) - 1) if x ≤ 0.
IBP for ELU on scalars.
IBP for ELU on boxes.
Derivative bounds for ELU.
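ELU is likewise monotone increasing for α > 0 (the derivative is 1 on x > 0 and α·exp(x) > 0 on x ≤ 0), so endpoint evaluation again suffices, up to the quality of the exp routine. A sketch reusing `elu` from the overview (with `Float.exp` standing in for the file's approximation):

```lean
-- Scalar IBP for ELU: monotone increasing for α > 0, so evaluate the
-- endpoints. The box rule maps this over coordinates as for leaky ReLU.
def eluIBP (α lo hi : Float) : Float × Float :=
  (elu α lo, elu α hi)

-- Coarse derivative bounds on [lo, hi]: the derivative is α·exp(x)
-- below 0 (increasing in x) and exactly 1 above 0.
def eluDerivBounds (α lo hi : Float) : Float × Float :=
  if 0 ≤ lo then (1, 1)
  else (min (α * Float.exp lo) 1, max α 1)
```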
Softplus #
Softplus: f(x) = log(1 + exp(x)), computed via the approximation helpers since Numbers.log may not be available.
IBP for Softplus. Softplus is monotone increasing.
IBP for Softplus on boxes.
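Softplus is monotone increasing, so its interval rule is endpoint evaluation; the interesting part is computing softplus itself through the helpers. A sketch reusing the hypothetical `expApprox`/`logApprox` from above (the composition is only crudely accurate, and only near 0):

```lean
-- softplus(x) = log(1 + exp(x)) ≈ logApprox (expApprox x), since
-- logApprox y targets log(1 + y). A crude, purely executable stand-in.
def softplusApprox (x : Float) : Float :=
  logApprox (expApprox x)

-- Scalar IBP: softplus is monotone increasing, so map the endpoints.
-- The box rule maps this over coordinates.
def softplusIBP (lo hi : Float) : Float × Float :=
  (softplusApprox lo, softplusApprox hi)
```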
SiLU/Swish #
SiLU/Swish: f(x) = x · sigmoid(x).
IBP for SiLU. SiLU is not monotone; it has a single global minimum near x ≈ -1.28.
IBP for SiLU on boxes.
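SiLU decreases up to that single interior minimum (minimizer x* ≈ -1.278, minimum value ≈ -0.2785) and increases afterwards, so the interval maximum always sits at an endpoint while a sound minimum needs an interior-containment check. A conservative sketch reusing `silu` from the overview, with constants slightly widened so both the guard and the fallback stay safe:

```lean
-- Interval rule for the non-monotone SiLU. The guard tests whether
-- [lo, hi] can contain the minimizer x* ≈ -1.278; if so, fall back to
-- a constant strictly below the global minimum value (≈ -0.2785).
def siluIBP (lo hi : Float) : Float × Float :=
  let fl := silu lo
  let fh := silu hi
  (if lo ≤ -1.278 ∧ -1.279 ≤ hi then (-0.279 : Float) else min fl fh,
   max fl fh)
```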
Mish #
Mish: f(x) = x · tanh(softplus(x)).
IBP for Mish. Similar behavior to SiLU.
IBP for Mish on boxes.
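Mish has the same unimodal shape, with its minimizer near x ≈ -1.192 and minimum value ≈ -0.3089, so the same endpoint-plus-guard scheme applies. A sketch reusing `mish` from the overview:

```lean
-- Same scheme as siluIBP, with Mish's minimizer bracketed by
-- [-1.20, -1.19] and -0.309 as a safe constant below its minimum value.
def mishIBP (lo hi : Float) : Float × Float :=
  let fl := mish lo
  let fh := mish hi
  (if lo ≤ -1.19 ∧ -1.20 ≤ hi then (-0.309 : Float) else min fl fh,
   max fl fh)
```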
ELU definition structure.
Softplus definition structure.
IBP for leaky ReLU returns a pair.
IBP for ELU returns a pair.