IBPCert #
IBP “output bounds” certificate checking.
This is the simplest certificate checker in the repo: given a graph `g` and parameters `ps`, we:

- recompute IBP bounds in Lean (`runIBP`), and
- compare the final output interval `[lo, hi]` against a Python-exported JSON certificate.

This is primarily used by the small LiRPA-style fixtures under `NN/Examples/Verification/LiRPA/*`.
Certificate shape:

```json
{ "result": { "lo": [...], "hi": [...] } }
```
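For illustration, a certificate for a two-output network might look like the fragment below. The numeric values are hypothetical; only the `result.lo` and `result.hi` arrays are consumed by the checker.

```json
{
  "result": {
    "lo": [-0.25, 0.10],
    "hi": [0.75, 0.90]
  }
}
```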
Notes on trust boundaries:
- The JSON is an untrusted artifact; we only accept it if Lean recomputation agrees.
- Agreement is checked with a tolerance because the JSON uses decimal serialization.
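The tolerance-based agreement check can be sketched as follows. This is a minimal illustration, not the repo's implementation: the function name `boxAgrees` and its signature are assumptions, but it shows the intended componentwise comparison, where `tol` absorbs decimal round-trip error from JSON serialization.

```lean
-- Sketch (hypothetical name and signature): accept the JSON bounds only if
-- every component agrees with the Lean-recomputed bounds within `tol`.
def boxAgrees (tol : Float) (leanLo leanHi jsonLo jsonHi : Array Float) : Bool :=
  leanLo.size == jsonLo.size && leanHi.size == jsonHi.size &&
  (leanLo.zip jsonLo).all (fun (a, b) => (a - b).abs ≤ tol) &&
  (leanHi.zip jsonHi).all (fun (a, b) => (a - b).abs ≤ tol)
```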
- This checker validates an exported artifact against Lean execution. The theorem-backed path is separate: use `NN.Entrypoint.Verification` when you need a Lean theorem connecting checker hypotheses to semantic enclosure.
References (informal):
- IBP: Gowal et al. (2018).
- CROWN/LiRPA param-store model: see `NN.MLTheory.CROWN.*`.
Run IBP on `(g, ps)` and compare the output box at `outId` against the JSON certificate at `path`. Returns `true` iff the bounds match componentwise (within tolerance). On mismatch, prints both the Lean and JSON bounds for debugging.
Convenience wrapper for examples: run the check and raise a readable error on mismatch.

This keeps the verification examples focused on their fixture graphs and parameter stores, rather than repeating the same `if !ok then throw` boilerplate everywhere.
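A usage sketch of such a wrapper inside a fixture example might look like this. All names here (`checkOrFail`, `myFixtureGraph`, `myParamStore`, the output id, and the certificate path) are hypothetical placeholders, not identifiers from the repo.

```lean
-- Sketch (hypothetical names): an example delegates the "run and throw on
-- mismatch" pattern to the wrapper instead of spelling it out inline.
def runLiRPAFixture : IO Unit := do
  let g  := myFixtureGraph   -- assumed fixture graph
  let ps := myParamStore     -- assumed parameter store
  -- throws a readable IO error if the Lean bounds disagree with the JSON
  checkOrFail g ps (outId := 0) (path := "certs/lirpa_fixture.json")
```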