TorchLean API

NN.Spec.Models.LogisticRegression

Logistic regression (spec model)

This file implements a small, deterministic logistic regression baseline.

Model (binary classification):

  p(y = 1 ∣ x) = σ(wᵀx + b), where σ(z) = 1 / (1 + e⁻ᶻ)

PyTorch mental model: roughly a single linear layer (nn.Linear(p, 1)) followed by a sigmoid, trained with binary cross-entropy by plain gradient descent.

Notes:

  - Numerical note: PyTorch often uses BCEWithLogitsLoss for stability (it works directly on logits without forming the sigmoid explicitly). Here we keep the math explicit.
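For reference, the stability trick alluded to above is the standard log-sum-exp rewriting of the binary cross-entropy on a logit z (this is how BCEWithLogitsLoss avoids overflow; it is equivalent to the explicit sigmoid form used in this file):

```latex
% Naive binary cross-entropy on a logit z = w^T x + b, label y ∈ {0, 1}:
%   \ell(z, y) = -\left[ y \log \sigma(z) + (1 - y) \log(1 - \sigma(z)) \right]
% Stable rewriting, valid for all real z (no overflow in the exponential):
\ell(z, y) = \max(z, 0) - y\,z + \log\left(1 + e^{-|z|}\right)
```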

structure LogisticRegression (p n : ℕ) (α : Type) :

Parameters for logistic regression: a weight vector w and scalar intercept b.

We store intercept : α separately rather than folding it into weights, but fitLogistic internally learns (p + 1) parameters by augmenting the input with a trailing column of ones.


Augment an n × p design matrix with a final column of ones.

This lets us represent the affine model X w + b as a single matrix-vector product with a (p + 1)-vector of parameters.
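Concretely, for n = 2 and p = 2 the augmentation acts as follows, with the intercept absorbed as the last entry of the parameter vector:

```latex
% Appending a ones column turns X w + b into a single product X' w':
\begin{pmatrix} x_{11} & x_{12} \\ x_{21} & x_{22} \end{pmatrix}
\;\longmapsto\;
\begin{pmatrix} x_{11} & x_{12} & 1 \\ x_{21} & x_{22} & 1 \end{pmatrix},
\qquad
w' = (w_1,\, w_2,\, b)^{\top}
```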


Gradient of the logistic negative log-likelihood, expressed as Xᵀ (σ(Xw) − y).

This is the standard expression for (unregularized) logistic regression with labels y ∈ {0, 1}. We do not divide by n here; callers can rescale if they want the mean loss.
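To see where this expression comes from, differentiate the summed negative log-likelihood (the same loss as above, written over all n rows):

```latex
% Negative log-likelihood with z = X w and labels y_i ∈ {0, 1}:
%   L(w) = -\sum_{i=1}^{n} \left[ y_i \log \sigma(z_i) + (1 - y_i) \log(1 - \sigma(z_i)) \right]
% Using \sigma'(z) = \sigma(z)(1 - \sigma(z)), the per-row derivative in z_i
% simplifies to \sigma(z_i) - y_i, so the chain rule through z = X w gives:
\nabla_w L(w) = X^{\top} \left( \sigma(X w) - y \right)
```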

def fitLogistic {α : Type} [Context α] {n p : ℕ} (X : Spec.Tensor α (Spec.Shape.dim n (Spec.Shape.dim p Spec.Shape.scalar))) (y : Spec.Tensor α (Spec.Shape.dim n Spec.Shape.scalar)) (learning_rate : α) (iterations : ℕ) :

Fit logistic regression by plain gradient descent (structural recursion).

This is a simple deterministic baseline that is easy to reason about. It does not attempt to match optimized solvers (L-BFGS / Newton / IRLS); it is a small reference implementation that can be instantiated over different scalar backends.
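Each recursion step is an ordinary full-batch gradient-descent update on the summed loss, using the gradient expression above applied to the ones-augmented design matrix (η denotes learning_rate):

```latex
% One step on the augmented (p + 1)-parameter vector w, repeated `iterations` times:
w_{k+1} = w_k - \eta \, X_{\mathrm{aug}}^{\top} \left( \sigma(X_{\mathrm{aug}}\, w_k) - y \right)
```

Because the gradient is not divided by n, the effective step size scales with the number of rows; a learning rate that works for one dataset size may need rescaling for another.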


Predict probabilities σ(Xw + b) for each row in X.


Convert probabilities to hard labels using a threshold (default 0.5).
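Written out, the labeling rule for a probability pᵢ and threshold τ is the following (assuming ties at the threshold map to 1; the spec's actual tie-breaking convention may differ):

```latex
\hat{y}_i =
\begin{cases}
1 & \text{if } p_i \ge \tau \\
0 & \text{otherwise}
\end{cases}
```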
