TorchLean API

NN.Spec.Core.TensorBridge

TensorBridge #

Computable, row-major conversions between TensorArray.Tensor and Spec.Tensor.

Why we have this file: the bridge gives explicit, easy-to-audit conversions between the two tensor representations, TensorArray.Tensor and Spec.Tensor, rather than hiding the layout convention inside ad-hoc code.

Why “row-major”? #

Our spec tensor datatype is a nested dim tree: outer dimensions are modeled as functions Fin n → .... Our flatten recursion iterates outer indices first and then recurses inward, so the last axis varies fastest (the conventional row-major / C-order layout for matrices and images when you think of the outer dimension as “rows”).
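As a concrete sanity check on that order, here is a minimal sketch on a 2 × 3 matrix, using plain Fin-functions in place of the actual Spec.Tensor type (the names m and flatten2 are illustrative, not from this file; List.finRange is the same helper used by flatMap_drop_take_eq below):

```lean
-- A 2×3 "matrix" as a nested function, mirroring the spec's dim tree.
def m : Fin 2 → Fin 3 → Nat := fun i j => 10 * i.val + j.val

-- Row-major flattening: iterate the outer index first, so the last axis
-- varies fastest.
def flatten2 (t : Fin 2 → Fin 3 → Nat) : List Nat :=
  (List.finRange 2).flatMap fun i => (List.finRange 3).map fun j => t i j

#eval flatten2 m  -- expected: [0, 1, 2, 10, 11, 12]
```

The rows [0, 1, 2] and [10, 11, 12] appear contiguously, which is exactly the C-order layout described above.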

We could have picked a different order, but we need one consistent convention for interop (CSV/NPY, torch tensor serialization, etc.). Row-major matches the contiguous layout assumptions used in many toolchains, so it’s a practical default.

Notes for proofs #

We don’t want “conversion code” to be a place where meaning quietly changes. So we prove the basic round-trip facts: the shape conversions are involutive, flatten_unflatten recovers the original list, and unflatten_flatten recovers the original tensor.

Those lemmas let us move between the functional spec view and the array/list view without handwaving.

Example file: NN.Examples.Advanced.Tensor.TensorBridge.


Convert a Shape to a runtime list of dimensions.


Convert a runtime list of dimensions to a Shape.

Shape conversion is involutive (applying it twice gives the original).

List-to-shape conversion is involutive.

def TensorBridge.flatten {α : Type} {shape : List ℕ} :
    Spec.Tensor α (listToShape shape) → List α

Flatten a Spec.Tensor of list-shape to a row-major list.


Length of flatten is exactly the product of dimensions.
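The lemma described here plausibly has the following shape, reconstructed from the hypothesis of unflatten_flatten below; the declaration name length_flatten is an assumption and the proof is omitted:

```lean
-- Hypothetical statement of the length lemma; the name and exact form are
-- assumptions, inferred from the hypothesis used by unflatten_flatten.
theorem length_flatten {α : Type} {shape : List ℕ}
    (t : Spec.Tensor α (listToShape shape)) :
    (flatten t).length = TensorArray.shapeProd shape := by
  sorry
```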

theorem TensorBridge.length_take_drop_mul {α : Type} (xs : List α) {n k : ℕ} (h : xs.length = n * k) (i : Fin n) :
    (List.take k (List.drop (i * k) xs)).length = k

A block slice of size k has length k when the full list has length n * k.
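A concrete instance checks out by computation: take the middle block (i = 1) of size k = 2 from a list of length 3 * 2.

```lean
-- The middle block of [0, 1, 2, 3, 4, 5], at offset 1 * 2 with size 2,
-- is [2, 3], which indeed has length 2.
example : (List.take 2 (List.drop (1 * 2) [0, 1, 2, 3, 4, 5])).length = 2 := rfl
```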

def TensorBridge.unflatten {α : Type} (shape : List ℕ) (xs : List α) (h : xs.length = TensorArray.shapeProd shape) :
    Spec.Tensor α (listToShape shape)

Unflatten a row-major list into a Spec.Tensor of list-shape.

theorem TensorBridge.flatMap_drop_take_eq {α : Type} (n k : ℕ) (xs : List α) (h : xs.length = n * k) :
    List.flatMap (fun (i : Fin n) => List.take k (List.drop (i * k) xs)) (List.finRange n) = xs

Reconstruct a list by concatenating its k-sized blocks.
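A small concrete instance of the reconstruction, with n = 3 blocks of size k = 2, mirroring the statement above:

```lean
-- Concatenating the three size-2 blocks of [0, 1, 2, 3, 4, 5]
-- recovers the original list.
example :
    List.flatMap
      (fun (i : Fin 3) => List.take 2 (List.drop (i.val * 2) [0, 1, 2, 3, 4, 5]))
      (List.finRange 3) = [0, 1, 2, 3, 4, 5] := by
  decide
```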

theorem TensorBridge.flatten_unflatten {α : Type} {shape : List ℕ} (xs : List α) (h : xs.length = TensorArray.shapeProd shape) :
    flatten (unflatten shape xs h) = xs

Flatten after unflatten returns the original list.

theorem TensorBridge.unflatten_flatten {α : Type} {shape : List ℕ} (t : Spec.Tensor α (listToShape shape)) {h : (flatten t).length = TensorArray.shapeProd shape} :
    unflatten shape (flatten t) h = t

Unflatten after flatten returns the original tensor (for any valid length proof).

def TensorBridge.toTensor {α : Type} {shape : List ℕ} :
    TensorArray.Tensor α shape → Spec.Tensor α (listToShape shape)

Convert from TensorArray.Tensor to Spec.Tensor.

def TensorBridge.toTensorArray {α : Type} {shape : List ℕ} :
    Spec.Tensor α (listToShape shape) → TensorArray.Tensor α shape

Convert from Spec.Tensor to TensorArray.Tensor.


Converting to tensor and back preserves the original tensor array.

Converting to tensor array and back preserves the original tensor.

A definable equivalence between the two tensor representations.
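Packaged as an Equiv, the pieces above presumably fit together as follows; the name tensorEquiv and the exact field proofs are assumptions, sketched rather than taken from the file:

```lean
-- Hypothetical packaging of the equivalence. toTensor, toTensorArray, and
-- the two round-trip facts are the real ingredients; the rest is a sketch.
def tensorEquiv {α : Type} {shape : List ℕ} :
    TensorArray.Tensor α shape ≃ Spec.Tensor α (listToShape shape) where
  toFun := toTensor
  invFun := toTensorArray
  left_inv := sorry   -- "converting to tensor and back" round-trip
  right_inv := sorry  -- "converting to tensor array and back" round-trip
```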
