TransformerEncoderVerify #
LiRPA/IBP certificate checker for a transformer-encoder-like graph.

This transformer encoder block includes:
- an attention-like softmax flow,
- residual additions,
- layernorm, and
- a 2-layer feed-forward network with relu.
It exists primarily to exercise certificate checking across a wider set of nonlinear ops than the MLP/CNN workflows.
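To make the "wider set of nonlinear ops" concrete, here is a minimal sketch of how IBP propagates interval bounds through two of the ops in this graph, relu and a residual addition. The function names are illustrative only and do not mirror the Lean implementation.

```python
# Minimal IBP sketch for two ops exercised by this graph. Each op maps an
# input interval [lo, hi] (elementwise) to an output interval.

def ibp_relu(lo, hi):
    # relu is monotone, so bounds map through elementwise.
    return [max(l, 0.0) for l in lo], [max(h, 0.0) for h in hi]

def ibp_add(lo_a, hi_a, lo_b, hi_b):
    # Residual addition: lower bounds add, upper bounds add.
    return ([a + b for a, b in zip(lo_a, lo_b)],
            [a + b for a, b in zip(hi_a, hi_b)])

# Propagate through relu, then add the pre-activation interval back on,
# mimicking a residual connection.
lo, hi = ibp_relu([-1.0, 0.5], [2.0, 1.5])
lo2, hi2 = ibp_add(lo, hi, [-1.0, 0.5], [2.0, 1.5])
```

Non-monotone ops such as softmax and layernorm need per-op bounding rules rather than this elementwise treatment, which is exactly what makes them worth exercising here.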
References:
- IBP: https://arxiv.org/abs/1810.12715
- CROWN background: https://arxiv.org/abs/1811.00866
- auto_LiRPA (reference implementation / exporter inspiration):
  https://github.com/Verified-Intelligence/auto_LiRPA
Export (Python):
  python3.12 scripts/verification/lirpa/export_crown_cert.py

Run (Lean):
  lake exe verify -- lirpa-encoder [NN/Examples/Verification/LiRPA/transformer_encoder_cert.json]
Small fixed graph with residual + layernorm + FFN (see module doc).
Seed deterministic parameters for the .linear / .matmul nodes in buildGraph.
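A deterministic seed means the same parameters are produced on every run, so the exported certificate stays valid against the rebuilt graph. A minimal sketch of one such scheme, assuming a simple counter-based generator; the actual Lean seeding may differ:

```python
# Deterministic parameter seeding sketch: a linear congruential generator
# gives reproducible values with no external RNG state. Illustrative only.

def seed_params(shape, seed=0):
    vals, x = [], seed
    for _ in range(shape[0] * shape[1]):
        x = (1103515245 * x + 12345) % (1 << 31)
        vals.append(x / (1 << 31) - 0.5)  # roughly in [-0.5, 0.5)
    return vals
```

The key property is not the distribution but reproducibility: exporter and checker must agree on every weight bit-for-bit.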
def NN.Examples.Verification.LiRPA.TransformerEncoderVerify.seedInputFloat
    (ps : MLTheory.CROWN.Graph.ParamStore Float) (eps : Float) :
Insert an L∞ input box of radius eps around a fixed center point.
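The L∞ input box is the elementwise interval [c - eps, c + eps] around the center c. A minimal sketch (names are illustrative, not the Lean API):

```python
# Build an L-infinity box of radius eps around a center point: each
# coordinate gets the interval [c - eps, c + eps].

def linf_box(center, eps):
    lo = [c - eps for c in center]
    hi = [c + eps for c in center]
    return lo, hi
```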
Check an IBP certificate JSON against this transformer-encoder graph.
This is wired into lake exe verify -- lirpa-encoder [path].
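The soundness condition behind such a check can be sketched as follows: recompute output bounds and confirm the certificate's claimed interval encloses them (so the certified bounds are valid, up to a tolerance). The JSON field names below are assumptions, not the exporter's actual schema.

```python
import json

# Certificate-check sketch: a certified interval is sound if it contains
# the recomputed IBP interval, i.e. cert lower <= computed lower and
# computed upper <= cert upper, elementwise.

def check_certificate(cert, computed_lo, computed_hi, tol=1e-9):
    ok = all(cl >= l - tol for cl, l in zip(computed_lo, cert["lower"]))
    return ok and all(ch <= u + tol
                      for ch, u in zip(computed_hi, cert["upper"]))

cert = json.loads('{"lower": [-1.0, 0.0], "upper": [1.0, 2.0]}')
```

The Lean checker's exact comparison direction and tolerance handling may differ; this only illustrates the containment check.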