TorchLean API

NN.Proofs.Autograd.FDeriv.Elementwise

Elementwise #

Elementwise (map) Fréchet-derivative facts for Euclidean vectors.

This is the missing bridge for turning the scalar calculus lemmas in NN/Proofs/Gradients/Activation.lean into OpSpecFDerivCorrect instances for vector-valued ops (sigmoid/tanh/softplus/…).

noncomputable def Proofs.Autograd.elemwiseVec {n : ℕ} (f : ℝ → ℝ) :
Vec n → Vec n

Apply a scalar function f : ℝ → ℝ coordinatewise to a vector.

This is the Euclidean-space analogue of the tensor-level map_spec.
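The intended behavior can be sketched in Lean. The `WithLp.toLp` coercion and the reading of `Vec n` as `EuclideanSpace ℝ (Fin n)` are assumptions inferred from the `x.ofLp i` notation used below, not confirmed by this page:

```lean
-- Hypothetical sketch: apply `f` in every coordinate.
-- Assumes `Vec n := EuclideanSpace ℝ (Fin n)` and Mathlib's `WithLp` coercions.
noncomputable def elemwiseVec {n : ℕ} (f : ℝ → ℝ) : Vec n → Vec n :=
  fun x => WithLp.toLp 2 fun i => f (x.ofLp i)
```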

Instances For

    Coordinate evaluation as a continuous linear map on Vec n.

    Instances For
      @[simp]
      theorem Proofs.Autograd.evalCLM_apply {n : ℕ} (i : Fin n) (x : Vec n) :
      (evalCLM i) x = x.ofLp i
      noncomputable def Proofs.Autograd.elemwiseDerivCLM {n : ℕ} (f' : ℝ → ℝ) (x : Vec n) :

      The derivative candidate for elemwiseVec f at a point x, built from a proposed scalar derivative f'.

      Concretely: (elemwiseDerivCLM f' x) dx has coordinates i ↦ f'(xᵢ) * dxᵢ.
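      In matrix terms this is multiplication by the diagonal Jacobian of the elementwise map; the following is a restatement of the coordinate formula above in standard notation, not an additional fact from this page:

```latex
J_{ij} = \partial_{x_j} f(x_i) = f'(x_i)\,\delta_{ij},
\qquad
(\mathrm{elemwiseDerivCLM}\; f'\; x)\, dx
  = \mathrm{diag}\bigl(f'(x_1),\dots,f'(x_n)\bigr)\, dx .
```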

      Instances For
        theorem Proofs.Autograd.hasFDerivAt_elemwiseVec {n : ℕ} {f f' : ℝ → ℝ} (x : Vec n) (hf : ∀ (z : ℝ), HasDerivAt f (f' z) z) :

        If f is differentiable everywhere with derivative f', then elemwiseVec f is Fréchet differentiable everywhere with derivative elemwiseDerivCLM f'.
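        A minimal usage sketch for the exp case, assuming the theorem's conclusion (truncated in this rendering) is `HasFDerivAt (elemwiseVec f) (elemwiseDerivCLM f' x) x`, and using Mathlib's `Real.hasDerivAt_exp`:

```lean
-- Sketch: exp is its own derivative everywhere, so the global
-- version of the lemma applies directly.
example {n : ℕ} (x : Vec n) :
    HasFDerivAt (elemwiseVec Real.exp) (elemwiseDerivCLM Real.exp x) x :=
  hasFDerivAt_elemwiseVec x fun z => Real.hasDerivAt_exp z
```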

        theorem Proofs.Autograd.hasFDerivAt_elemwiseVec_at {n : ℕ} {f f' : ℝ → ℝ} (x : Vec n) (hf : ∀ (i : Fin n), HasDerivAt f (f' (x.ofLp i)) (x.ofLp i)) :

        Pointwise (at x) version of hasFDerivAt_elemwiseVec.

        This is useful when the scalar HasDerivAt facts are only available at the coordinates of x.
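        A sketch of such a situation, assuming the same conclusion shape as `hasFDerivAt_elemwiseVec` and Mathlib's `Real.hasDerivAt_log`: `Real.log` has a derivative only away from 0, so the global hypothesis is unavailable, but the pointwise variant applies when every coordinate of `x` is nonzero.

```lean
-- Sketch: log is differentiable only at nonzero points, so we supply
-- HasDerivAt facts just at the coordinates of `x`.
example {n : ℕ} (x : Vec n) (hx : ∀ i, x.ofLp i ≠ 0) :
    HasFDerivAt (elemwiseVec Real.log) (elemwiseDerivCLM (fun z => z⁻¹) x) x :=
  hasFDerivAt_elemwiseVec_at x fun i => Real.hasDerivAt_log (hx i)
```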

        @[simp]
        theorem Proofs.Autograd.toVec_map_spec_ofVecE {n : ℕ} (f : ℝ → ℝ) (xV : Vec n) (i : Fin n) :

        Evaluation lemma: converting an elementwise-mapped tensor back to coordinates agrees with applying f to the corresponding Euclidean coordinate.

        exp as an OpSpecFDerivCorrect instance (elementwise Real.exp).

        Instances For

          square as an OpSpecFDerivCorrect instance (elementwise x ↦ x^2).

          Instances For

            sinh as an OpSpecFDerivCorrect instance (elementwise).

            Instances For

              cosh as an OpSpecFDerivCorrect instance (elementwise).

              Instances For

                tanh as an OpSpecFDerivCorrect instance (elementwise).

                Instances For

                  sigmoid as an OpSpecFDerivCorrect instance (elementwise).

                  Instances For

                    softplus as an OpSpecFDerivCorrect instance (elementwise).

                    Instances For

                      SiLU as an OpSpecFDerivCorrect instance (elementwise).

                      Instances For

                        Tanh-approximate GELU as an OpSpecFDerivCorrect instance (elementwise).

                        Instances For
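                        For reference, the scalar derivative identities underlying the instances above (standard calculus facts; the corresponding TorchLean lemma names live in NN/Proofs/Gradients/Activation.lean, and softplus is taken in its standard form softplus(x) = log(1 + eˣ)):

```latex
\frac{d}{dx} e^x = e^x, \qquad \frac{d}{dx} x^2 = 2x, \qquad
\sinh' = \cosh, \qquad \cosh' = \sinh,
\\[4pt]
\tanh'(x) = 1 - \tanh^2(x), \qquad
\sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr),
\\[4pt]
\mathrm{softplus}'(x) = \frac{d}{dx} \log(1 + e^x) = \sigma(x), \qquad
\mathrm{SiLU}'(x) = \sigma(x) + x\,\sigma(x)\,\bigl(1 - \sigma(x)\bigr).
```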
                          noncomputable def Proofs.Autograd.OpSpecFDerivCorrect.safeLog {n : ℕ} (ε : ℝ) (hε : 0 < ε) :

                          safe_log as an OpSpecFDerivCorrect instance (elementwise), assuming ε > 0.

                          This is the differentiable calculus fact; the corresponding dot-level VJP correctness lives in NN.Proofs.Autograd.Core.RealCorrectness.

                          Instances For
                            noncomputable def Proofs.Autograd.OpSpecFDerivCorrect.smoothAbs {n : ℕ} (ε : ℝ) (hε : 0 < ε) :

                            smooth_abs as an OpSpecFDerivCorrect instance (elementwise), assuming ε > 0.

                            This is a differentiable approximation to abs.

                            Instances For