NN.MLTheory.Stability.Spec #
Scalar-polymorphic stability definitions for discrete-time dynamical systems
x_{t+1} = f x_t over shape-indexed tensors.
Stability specifications (discrete-time dynamical systems) #
This module defines standard stability notions for iterated maps f : Tensor α s → Tensor α s,
phrased over TorchLean's shape-indexed tensors:
- Lyapunov stability,
- asymptotic/exponential stability,
- global stability, and
- input-to-state stability (ISS).
The definitions are polymorphic in the scalar type α via [Context α]; for noncomputable
quantities (e.g. the supremum defining a stability margin on ℝ), we expose the notion via a type
class StabilityMarginComputable. TorchLean installs the real supremum instance globally and keeps
the conservative 0 lower-bound instance behind an explicit opt-in scope for examples and tests.
References #
These are standard definitions in control theory / dynamical systems. Useful entry points include:
- H. K. Khalil, Nonlinear Systems (Lyapunov stability, exponential stability, ISS).
- E. D. Sontag, Input-to-State Stability: Basic Concepts and Results (ISS).
Iterate f for n steps: iterate f n x = f^[n] x.
Instances For
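As a point of reference, a minimal standalone sketch of the iterate helper over an arbitrary state type, using Mathlib's `Function.iterate` notation `f^[n]` (the name `iterateSteps` is illustrative, not the library's):

```lean
import Mathlib

/-- Illustrative sketch: iterate `f` for `n` steps via `Function.iterate`. -/
def iterateSteps {E : Type*} (f : E → E) (n : ℕ) (x : E) : E :=
  f^[n] x

-- Two steps of iteration unfold definitionally.
example (f : ℝ → ℝ) (x : ℝ) : iterateSteps f 2 x = f (f x) := rfl
```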
A computable notion of stability margin, matching the "largest invariant ball" intuition:
the largest r ≥ 0 such that the closed ball {x | dist(eq, x) ≤ r} is forward-invariant.
TorchLean installs a supremum-based instance globally only for ℝ. Other scalar backends can explicitly opt into the conservative lower-bound instance below; this avoids silently reporting 0 as the semantic stability margin for arbitrary scalar types.
- compute_stability_margin {s : Spec.Shape} : (Spec.Tensor α s → Spec.Tensor α s) → ({s : Spec.Shape} → Spec.Tensor α s → α) → Spec.Tensor α s → α
Instances
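A minimal sketch of this shape of type class, simplified to a plain state type instead of `Spec.Tensor α s` (the names `MarginComputable` and `computeMargin` are hypothetical stand-ins):

```lean
import Mathlib

/-- Hypothetical simplification of the stability-margin class: given a step map,
a norm-like function, and an equilibrium, produce a margin value. -/
class MarginComputable (E α : Type*) where
  computeMargin : (E → E) → (E → α) → E → α

/-- Sketch of a supremum-based instance for ℝ: the margin is the supremum of the
radii `r ≥ 0` whose closed ball around the equilibrium is forward-invariant
(by ℝ convention, `sSup` of an empty or unbounded set is 0). -/
noncomputable instance : MarginComputable ℝ ℝ where
  computeMargin f normFn eq :=
    sSup {r : ℝ | 0 ≤ r ∧ ∀ x, normFn (x - eq) ≤ r → normFn (f x - eq) ≤ r}
```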
Named opt-in scope for the conservative stability-margin lower bound.
Use open scoped NN.MLTheory.Stability.ConservativeMargin only in examples/tests that deliberately want
a total fallback. Production theorem statements should either use the ℝ instance or require an
explicit StabilityMarginComputable α hypothesis.
Instances For
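The opt-in pattern can be sketched with a `scoped instance`, again over hypothetical simplified names; nothing here is the library's actual declaration:

```lean
import Mathlib

class MarginComputable (E α : Type*) where
  computeMargin : (E → E) → (E → α) → E → α

namespace ConservativeMargin

/-- Conservative fallback sketch: always report `0` as the margin. Scoped, so it is
only visible after an explicit `open scoped`. -/
scoped instance {E α : Type*} [Zero α] : MarginComputable E α where
  computeMargin _f _normFn _eq := 0

end ConservativeMargin

-- Opt in deliberately, e.g. in a test file:
-- open scoped ConservativeMargin
```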
Lyapunov stability of equilibrium for the discrete-time system x_{t+1} = f x_t.
This is the usual ε/δ definition using the distance induced by norm.
Instances For
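A minimal sketch of the ε/δ condition over a normed state space standing in for `Tensor α s` (names are illustrative):

```lean
import Mathlib

variable {E : Type*} [NormedAddCommGroup E]

/-- Sketch: `eq` is Lyapunov stable for `x_{t+1} = f x_t` when trajectories that
start within `δ` of `eq` stay within `ε` of `eq` for all times. -/
def LyapunovStable (f : E → E) (eq : E) : Prop :=
  ∀ ε > (0 : ℝ), ∃ δ > 0, ∀ x : E, ‖x - eq‖ < δ → ∀ n : ℕ, ‖f^[n] x - eq‖ < ε
```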
Asymptotic stability: Lyapunov stability plus convergence to equilibrium for nearby initial
conditions.
Instances For
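A sketch of the same notion over a plain normed space; asymptotic stability adds local attraction on top of the Lyapunov condition:

```lean
import Mathlib

variable {E : Type*} [NormedAddCommGroup E]

def LyapunovStable (f : E → E) (eq : E) : Prop :=
  ∀ ε > (0 : ℝ), ∃ δ > 0, ∀ x : E, ‖x - eq‖ < δ → ∀ n : ℕ, ‖f^[n] x - eq‖ < ε

/-- Sketch: Lyapunov stability plus convergence of nearby trajectories to `eq`. -/
def AsymptoticallyStable (f : E → E) (eq : E) : Prop :=
  LyapunovStable f eq ∧
    ∃ δ > (0 : ℝ), ∀ x : E, ‖x - eq‖ < δ →
      Filter.Tendsto (fun n : ℕ => f^[n] x) Filter.atTop (nhds eq)
```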
Exponential stability with decay parameters decay_rate and M.
This is a quantitative strengthening of asymptotic stability.
Instances For
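A sketch of the quantitative bound, with `ρ` standing in for `decay_rate` and `M` for the overshoot constant (the exact statement in the library may differ):

```lean
import Mathlib

variable {E : Type*} [NormedAddCommGroup E]

/-- Sketch: the distance to `eq` decays geometrically with rate `ρ < 1`,
up to an overshoot factor `M`. -/
def ExponentiallyStable (f : E → E) (eq : E) (M ρ : ℝ) : Prop :=
  0 ≤ ρ ∧ ρ < 1 ∧ 0 ≤ M ∧
    ∀ x : E, ∀ n : ℕ, ‖f^[n] x - eq‖ ≤ M * ρ ^ n * ‖x - eq‖
```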
Global stability: every initial condition converges to equilibrium.
This is stated as convergence in the distance induced by norm.
Instances For
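A sketch of global stability as convergence of the norm-induced distance for every initial condition:

```lean
import Mathlib

variable {E : Type*} [NormedAddCommGroup E]

/-- Sketch: every trajectory converges to `eq` in the norm-induced distance. -/
def GloballyStable (f : E → E) (eq : E) : Prop :=
  ∀ x : E, Filter.Tendsto (fun n : ℕ => ‖f^[n] x - eq‖) Filter.atTop (nhds 0)
```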
Input-to-state stability (ISS) for an input-driven system x_{t+1} = f x_t u_t.
This is the standard bound ‖x_t‖ ≤ β(‖x_0‖, t) + γ(sup_{k ≤ t} ‖u_k‖), packaged as a Prop.
Instances For
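A sketch of the ISS bound for an input-driven step map. In the textbook definition β is a class-KL function and γ a class-K function; here they are plain data, and `trajectory` is an illustrative helper, not the library's:

```lean
import Mathlib

/-- Trajectory of `x_{k+1} = f x_k (u k)` (illustrative helper). -/
def trajectory {E U : Type*} (f : E → U → E) (x₀ : E) (u : ℕ → U) : ℕ → E
  | 0 => x₀
  | k + 1 => f (trajectory f x₀ u k) (u k)

/-- Sketch of ISS: the state is bounded by a time-decaying term in the initial
condition plus a gain on the peak input seen so far. -/
def ISS {E U : Type*} [NormedAddCommGroup E] [NormedAddCommGroup U]
    (f : E → U → E) (β : ℝ → ℕ → ℝ) (γ : ℝ → ℝ) : Prop :=
  ∀ (x₀ : E) (u : ℕ → U) (t : ℕ),
    ‖trajectory f x₀ u t‖ ≤
      β ‖x₀‖ t +
        γ ((Finset.range (t + 1)).sup'
            (Finset.nonempty_range_iff.mpr (by omega)) fun k => ‖u k‖)
```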
Bounded-input bounded-output (BIBO) stability with respect to norm₁ and norm₂.
Instances For
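A sketch with the input and output norms passed as plain functions, mirroring the `norm₁`/`norm₂` parameters (the signal-space types and the name `BIBOStable` are illustrative):

```lean
import Mathlib

/-- Sketch of BIBO stability: every input signal bounded in `norm₁` is mapped to an
output signal bounded in `norm₂`. -/
def BIBOStable {U Y : Type*} (G : (ℕ → U) → (ℕ → Y))
    (norm₁ : U → ℝ) (norm₂ : Y → ℝ) : Prop :=
  ∀ u : ℕ → U, (∃ B, ∀ t, norm₁ (u t) ≤ B) → ∃ C, ∀ t, norm₂ (G u t) ≤ C
```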
Incremental stability: distances between trajectories contract by contraction_factor.
This is a discrete-time contraction condition phrased using tensor_distance.
Instances For
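A sketch over a metric space, with `dist` standing in for `tensor_distance` and `c` for `contraction_factor`; the example afterwards shows the geometric decay of the gap between two trajectories that this one-step condition implies:

```lean
import Mathlib

/-- Sketch: one-step contraction of the distance between any two states. -/
def IncrementallyStable {E : Type*} [MetricSpace E] (f : E → E) (c : ℝ) : Prop :=
  0 ≤ c ∧ c < 1 ∧ ∀ x y : E, dist (f x) (f y) ≤ c * dist x y

/-- The one-step contraction implies geometric decay along trajectories. -/
example {E : Type*} [MetricSpace E] (f : E → E) (c : ℝ) (hc0 : 0 ≤ c)
    (hstep : ∀ x y : E, dist (f x) (f y) ≤ c * dist x y) (x y : E) (n : ℕ) :
    dist (f^[n] x) (f^[n] y) ≤ c ^ n * dist x y := by
  induction n with
  | zero => simp
  | succ n ih =>
      rw [Function.iterate_succ_apply' f n x, Function.iterate_succ_apply' f n y]
      calc dist (f (f^[n] x)) (f (f^[n] y)) ≤ c * dist (f^[n] x) (f^[n] y) := hstep _ _
        _ ≤ c * (c ^ n * dist x y) := mul_le_mul_of_nonneg_left ih hc0
        _ = c ^ (n + 1) * dist x y := by ring
```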
Stability margin: the largest forward-invariant ball radius around equilibrium (when computable).
Instances For
Finite-time stability: trajectories reach the equilibrium exactly (not just asymptotically) within a fixed step budget.
Instances For
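A sketch phrased globally, with `N` as the step budget (the library statement may instead restrict to a neighbourhood of the equilibrium):

```lean
import Mathlib

/-- Sketch: after at most `N` steps every trajectory sits exactly at `eq`. -/
def FiniteTimeStable {E : Type*} (f : E → E) (eq : E) (N : ℕ) : Prop :=
  ∀ x : E, ∀ n : ℕ, N ≤ n → f^[n] x = eq
```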
Practical stability: trajectories eventually enter and remain in a fixed ultimate_bound ball.
Instances For
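A sketch over a normed state space, with `ultimateBound` naming the radius of the ball that trajectories eventually enter and never leave:

```lean
import Mathlib

variable {E : Type*} [NormedAddCommGroup E]

/-- Sketch: every trajectory eventually enters and stays in the closed ball of
radius `ultimateBound` around `eq`. -/
def PracticallyStable (f : E → E) (eq : E) (ultimateBound : ℝ) : Prop :=
  ∀ x : E, ∃ T : ℕ, ∀ n : ℕ, T ≤ n → ‖f^[n] x - eq‖ ≤ ultimateBound
```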
One-step monotonicity of a training loss under an update rule.
This is the “training stability” predicate used as a spec for decreasing-loss update rules.
Instances For
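A sketch of the one-step monotonicity predicate, with `Θ` standing in for the parameter space and `update`/`loss` as abstract functions; the example shows that the one-step condition makes the loss monotone along the whole training run:

```lean
import Mathlib

/-- Sketch: one update step never increases the training loss. -/
def TrainingStable {Θ : Type*} (update : Θ → Θ) (loss : Θ → ℝ) : Prop :=
  ∀ θ : Θ, loss (update θ) ≤ loss θ

/-- Consequence of the sketch: the loss never rises over any number of steps. -/
example {Θ : Type*} (update : Θ → Θ) (loss : Θ → ℝ)
    (h : ∀ θ : Θ, loss (update θ) ≤ loss θ) (θ : Θ) (n : ℕ) :
    loss (update^[n] θ) ≤ loss θ := by
  induction n with
  | zero => simp
  | succ n ih =>
      calc loss (update^[n + 1] θ) = loss (update (update^[n] θ)) := by
            rw [Function.iterate_succ_apply']
        _ ≤ loss (update^[n] θ) := h _
        _ ≤ loss θ := ih
```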
Generalization stability of a learning algorithm: small dataset changes produce small prediction changes.
This is a generic stability-style specification; concrete instances typically choose a specific dataset metric and output norm.
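A sketch with the dataset metric and output distance left abstract, matching the note that concrete instances pick them; all type and function names here are illustrative:

```lean
import Mathlib

/-- Sketch: datasets that are `δ`-close in `datasetDist` yield predictors whose
outputs are `ε`-close in `outputDist`, uniformly over inputs. -/
def GeneralizationStable {D X Y : Type*}
    (learn : D → X → Y) (datasetDist : D → D → ℝ) (outputDist : Y → Y → ℝ) : Prop :=
  ∀ ε > (0 : ℝ), ∃ δ > 0, ∀ d₁ d₂ : D, datasetDist d₁ d₂ ≤ δ →
    ∀ x : X, outputDist (learn d₁ x) (learn d₂ x) ≤ ε
```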