TorchLean API

NN.MLTheory.CROWN.Operators.Activations

Supplemental Activation Operators #

This file defines additional activation functions and transfer rules intended for CROWN/IBP-style bound propagation: Leaky ReLU, ELU, Softplus, SiLU/Swish, and Mish, together with executable approximation helpers for the transcendental functions they use.

Soundness status #

A few helpers in this file use low-order Taylor approximations (expApprox, logApprox, tanhApprox, sigmoidApprox). These are executable approximations, not theorem-backed enclosures of the true transcendental functions. Bounds derived from them should only be used under an explicit checker/oracle assumption, not as part of the unconditional CROWN soundness core.

If you need enclosure-sound bounds for transcendentals, prefer validated enclosure backends (e.g. Arb-backed intervals in NN/Floats) or restrict to the operator subset covered by the main soundness theorems.


Approximation Helpers #

Low-order Taylor approximation of exp. This is executable, not an enclosure theorem.


Low-order approximation of log(1 + y) around y = 0; not an enclosure theorem.

Executable softplus approximation computed via logApprox (1 + expApprox x).

Executable sigmoid approximation computed via expApprox.

Low-order approximation of tanh; not an enclosure theorem.
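The helper definitions themselves are elided on this page. As a rough illustration only, here is a minimal executable sketch of what such low-order Taylor approximations can look like, written over Float; the file's versions work over the abstract Context α carrier, and the names and polynomial degrees below are assumptions. Per the soundness note above, these are only accurate near the expansion point.

-- Hypothetical Float sketch; names, degrees, and the use of Float
-- (instead of the abstract `Context α`) are assumptions.
def expApprox (x : Float) : Float :=
  1 + x + x*x/2 + x*x*x/6 + x*x*x*x/24      -- exp x ≈ Σ xᵏ/k!, k ≤ 4

def logApprox1p (y : Float) : Float :=
  y - y*y/2 + y*y*y/3 - y*y*y*y/4           -- log (1 + y) ≈ Σ (−1)ᵏ⁺¹ yᵏ/k

def sigmoidApprox (x : Float) : Float :=
  1 / (1 + expApprox (-x))                  -- sigmoid x = 1 / (1 + exp (−x))

def tanhApprox (x : Float) : Float :=
  x - x*x*x/3 + 2*x*x*x*x*x/15              -- tanh x ≈ x − x³/3 + 2x⁵/15

#eval expApprox 0.5  -- ≈ 1.6484; true exp 0.5 ≈ 1.6487, error grows with |x|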

Leaky ReLU #

def NN.MLTheory.CROWN.Operators.Activations.leakyRelu {α : Type} [Context α] (negSlope x : α) : α

Leaky ReLU: f(x) = x if x ≥ 0, negSlope · x if x < 0.

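A minimal Float sketch matching the definitional theorem leaky_relu_def further down (Float standing in for the abstract carrier is an assumption):

def leakyRelu (negSlope x : Float) : Float :=
  if x > 0 then x else negSlope * x

#eval leakyRelu 0.1 (-2.0)  -- -0.2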

IBP for Leaky ReLU on scalars.
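For 0 ≤ negSlope the function is monotone increasing, so a plausible scalar IBP rule (a sketch, not necessarily the file's exact definition) just maps the interval endpoints through:

def ibpLeakyReluScalar (negSlope l u : Float) : Float × Float :=
  (leakyRelu negSlope l, leakyRelu negSlope u)  -- sound when 0 ≤ negSlope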

IBP for Leaky ReLU on boxes.
def NN.MLTheory.CROWN.Operators.Activations.affLeakyRelu {α : Type} [Context α] (negSlope l u : α) : α × α × α × α

Affine bounds for Leaky ReLU. Unlike ReLU, both branches are linear, so the relaxation is simpler.
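As a hedged sketch of what such a relaxation can look like for 0 ≤ negSlope ≤ 1 (the (slope, bias, slope, bias) output layout and the lower-slope heuristic are assumptions, not the file's definition): leaky ReLU equals max(x, negSlope·x) there, so either linear piece is a valid lower line, and the chord through (l, negSlope·l) and (u, u) is a valid upper line when l < 0 < u.

def affLeakyRelu (negSlope l u : Float) : Float × Float × Float × Float :=
  if 0 ≤ l then (1, 0, 1, 0)                     -- only the x branch is active
  else if u ≤ 0 then (negSlope, 0, negSlope, 0)  -- only the negSlope·x branch
  else
    -- straddles 0 (so l < 0 < u): upper line is the chord between endpoints
    let s := (u - negSlope * l) / (u - l)
    let b := negSlope * l - s * l
    -- lower line: pick the piece covering the larger side (a common heuristic)
    let sLo := if u ≥ -l then (1 : Float) else negSlope
    (sLo, 0, s, b)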
def NN.MLTheory.CROWN.Operators.Activations.derivLeakyRelu {α : Type} [Context α] (negSlope l u : α) : α × α

Derivative bounds for Leaky ReLU (used for Lyapunov analysis).
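The derivative is piecewise constant (negSlope on x < 0, 1 on x > 0), so bounding it over [l, u] is a case split; a Float sketch under the same assumptions as above:

def derivLeakyRelu (negSlope l u : Float) : Float × Float :=
  if 0 ≤ l then (1, 1)
  else if u ≤ 0 then (negSlope, negSlope)
  else if negSlope ≤ 1 then (negSlope, 1)  -- interval straddles the kink
  else (1, negSlope)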

ELU (Exponential Linear Unit) #

def NN.MLTheory.CROWN.Operators.Activations.elu {α : Type} [Context α] (scale x : α) : α

ELU: f(x) = x if x > 0, scale · (exp(x) − 1) if x ≤ 0.
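A Float sketch; whether the file's executable version uses the exact exponential or expApprox is not shown here, so the use of expApprox below is an assumption (and it inherits the soundness caveat):

def elu (scale x : Float) : Float :=
  if x > 0 then x else scale * (expApprox x - 1)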

IBP for ELU on scalars.
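ELU is monotone increasing for scale ≥ 0, so a plausible scalar rule again maps endpoints (a sketch, not necessarily the file's definition):

def ibpEluScalar (scale l u : Float) : Float × Float :=
  (elu scale l, elu scale u)  -- sound for 0 ≤ scale, up to the expApprox caveat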

IBP for ELU on boxes.
def NN.MLTheory.CROWN.Operators.Activations.affElu {α : Type} [Context α] (scale l u : α) : α × α × α × α

Affine bounds for ELU. The positive branch is linear; the negative branch is nonlinear (exp).
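With the exact exponential, ELU is convex for scale > 0, so the chord over [l, u] is a valid upper line and any tangent line is a valid lower line. A hedged Float sketch using that structure (the output layout, the midpoint-tangent choice, and the use of expApprox are all assumptions; it also assumes l < u):

def affElu (scale l u : Float) : Float × Float × Float × Float :=
  if 0 ≤ l then (1, 0, 1, 0)        -- purely in the linear branch
  else
    let fl := elu scale l
    let fu := elu scale u
    let s  := (fu - fl) / (u - l)   -- chord slope: upper line for convex f
    let b  := fl - s * l
    let m  := (l + u) / 2           -- tangent at the midpoint: lower line
    let sLo := if m > 0 then (1 : Float) else scale * expApprox m
    let bLo := elu scale m - sLo * m
    (sLo, bLo, s, b)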
def NN.MLTheory.CROWN.Operators.Activations.derivElu {α : Type} [Context α] (scale l u : α) : α × α

Derivative bounds for ELU.
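For 0 < scale ≤ 1 the derivative (scale·exp(x) on x < 0, 1 on x > 0) is monotone increasing, so evaluating it at the endpoints brackets it; the restriction to scale ≤ 1 and the use of expApprox are assumptions in this sketch:

def derivElu (scale l u : Float) : Float × Float :=
  let d (x : Float) : Float := if x > 0 then 1 else scale * expApprox x
  (d l, d u)  -- endpoints bracket a monotone derivative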

Softplus #

Softplus: f(x) = log(1 + exp(x)). Computed via the approximation helpers, since Numbers.log may not be available.

IBP for Softplus. Softplus is monotone increasing.
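Continuing the Float sketch: one plausible composition (assuming logApprox1p takes the y in log(1 + y) directly), plus the endpoint rule that monotonicity licenses:

def softplus (x : Float) : Float :=
  logApprox1p (expApprox x)  -- ≈ log (1 + exp x), accurate only near x = 0

def ibpSoftplus (l u : Float) : Float × Float :=
  (softplus l, softplus u)   -- monotone increasing ⇒ endpoints suffice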

SiLU/Swish #

SiLU/Swish: f(x) = x · sigmoid(x).
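Float sketch, reusing the sigmoidApprox sketch above (an assumption about how the file computes it):

def silu (x : Float) : Float :=
  x * sigmoidApprox x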

IBP for SiLU. SiLU is not monotone; it has a global minimum near x ≈ -1.28.
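With the exact sigmoid, SiLU decreases up to its global minimum (x* ≈ -1.2785, value ≈ -0.2785) and increases afterwards, so the maximum over [l, u] sits at an endpoint while the minimum may sit at x*. A sketch of that case split (the constant and the endpoint rule are assumptions; with sigmoidApprox the bounds inherit the approximation caveat):

def ibpSilu (l u : Float) : Float × Float :=
  let xStar : Float := -1.2785  -- approximate argmin of x · sigmoid x
  let lo :=
    if l ≤ xStar ∧ xStar ≤ u then silu xStar  -- interval contains the minimum
    else if silu l ≤ silu u then silu l else silu u
  let hi := if silu l ≤ silu u then silu u else silu l
  (lo, hi)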

Mish #

Mish: f(x) = x · tanh(softplus(x)).
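Float sketch composed from the helpers above (again an assumption about the file's computation; tanhApprox degrades for large softplus values):

def mish (x : Float) : Float :=
  x * tanhApprox (softplus x)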

IBP for Mish. Similar non-monotone behavior to SiLU, with a global minimum near x ≈ -1.19.
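The same unimodal case split as for SiLU applies, with the minimizer shifted (x* ≈ -1.192 for the exact functions; the constant here is an assumption):

def ibpMish (l u : Float) : Float × Float :=
  let xStar : Float := -1.192  -- approximate argmin of x · tanh (softplus x)
  let lo :=
    if l ≤ xStar ∧ xStar ≤ u then mish xStar
    else if mish l ≤ mish u then mish l else mish u
  let hi := if mish l ≤ mish u then mish u else mish l
  (lo, hi)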
theorem NN.MLTheory.CROWN.Operators.Activations.Theorems.leaky_relu_def {α : Type} [Context α] (negSlope x : α) :
    leakyRelu negSlope x = if x > Numbers.zero then x else negSlope * x

Leaky ReLU definition structure.

ELU definition structure.

Softplus definition structure.

theorem NN.MLTheory.CROWN.Operators.Activations.Theorems.ibp_leaky_relu_scalar_pair {α : Type} [Context α] (negSlope l u : α) :
    ∃ (lo : α) (hi : α), ibpLeakyReluScalar negSlope l u = (lo, hi)

IBP for leaky ReLU returns a pair.

theorem NN.MLTheory.CROWN.Operators.Activations.Theorems.ibp_elu_scalar_pair {α : Type} [Context α] (scale l u : α) :
    ∃ (lo : α) (hi : α), ibpEluScalar scale l u = (lo, hi)

IBP for ELU returns a pair.
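Both pair theorems are instances of the same observation: any product-valued result equals the pair of its projections, by structure eta. A generic Float sketch of that proof shape (the name ibp_pair is hypothetical):

theorem ibp_pair (f : Float → Float × Float) (x : Float) :
    ∃ lo hi, f x = (lo, hi) :=
  ⟨(f x).1, (f x).2, rfl⟩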