TensorBridge #
Computable, row-major conversions between TensorArray.Tensor and Spec.Tensor.
Why we have this file:
- `Spec.Tensor` is the spec representation (functional, shape-indexed).
- `TensorArray.Tensor` is an array-backed representation that is convenient for IO and external interfaces (CSV/NPY/etc).

This bridge gives explicit, easy-to-audit conversions:
- `flatten`/`unflatten` between `Spec.Tensor` and a flat row-major list,
- plus shape list conversions (`Shape ↔ List Nat`) to connect the typed and runtime views.
Why “row-major”? #
Our spec tensor datatype is a nested dim tree: outer dimensions are modeled as functions
Fin n → .... Our flatten recursion iterates outer indices first and then recurses inward, so
the last axis varies fastest (the conventional row-major / C-order layout for matrices and
images when you think of the outer dimension as “rows”).
We could have picked a different order, but we need one consistent convention for interop (CSV/NPY, torch tensor serialization, etc.). Row-major matches the contiguous layout assumptions used in many toolchains, so it’s a practical default.
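The ordering above can be illustrated with a toy, standalone flatten over plain `Nat` indices (hypothetical helper names; the file's real `flatten` is shape-indexed over `Fin`-typed axes, but the iteration order is the same):

```lean
-- Minimal sketch of row-major order (hypothetical `flatten2`; not the
-- shape-indexed `flatten` this file actually defines).
def flatten2 (rows cols : Nat) (f : Nat → Nat → Nat) : List Nat :=
  (List.range rows).flatMap fun i =>
    (List.range cols).map fun j => f i j

-- The outer index varies slowest, the last axis fastest:
#eval flatten2 2 3 (fun i j => 10 * i + j)  -- [0, 1, 2, 10, 11, 12]
```

Reading the output left to right, the column index `j` cycles through 0, 1, 2 before the row index `i` advances, which is exactly the C-order convention fixed here.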
Notes for proofs #
We don’t want “conversion code” to be a place where meaning quietly changes. So we prove the basic round-trip facts:
- flattening after unflattening gives you the original list, and
- unflattening after flattening gives you the original tensor.
Those lemmas let us move between the functional spec view and the array/list view without handwaving.
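A concrete instance of the first round trip, with toy fixed-rank versions of both directions (hypothetical names and signatures; the file's real definitions are shape-indexed and the round trips are proved in general, not just checked on examples):

```lean
-- Toy 2-D flatten/unflatten (hypothetical; for illustration only).
def flatten2 (rows cols : Nat) (f : Nat → Nat → Nat) : List Nat :=
  (List.range rows).flatMap fun i =>
    (List.range cols).map fun j => f i j

-- Row-major index arithmetic: entry (i, j) lives at position i * cols + j.
def unflatten2 (cols : Nat) (xs : List Nat) : Nat → Nat → Nat :=
  fun i j => xs.getD (i * cols + j) 0

-- Flattening after unflattening recovers the original list:
#eval flatten2 2 3 (unflatten2 3 [0, 1, 2, 10, 11, 12])
-- [0, 1, 2, 10, 11, 12]
```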
Example file: NN.Examples.Advanced.Tensor.TensorBridge.
References / analogies:
- Row-major ("C-order") is the default contiguous layout in many toolchains (NumPy/PyTorch) and is a common convention for serialization. This file fixes a single convention so interop is auditably well-defined.
- PyTorch docs (intuition for the operations we mirror here, not the semantics):
  - torch.flatten: https://pytorch.org/docs/stable/generated/torch.flatten.html
  - torch.Tensor.reshape: https://pytorch.org/docs/stable/generated/torch.Tensor.reshape.html
  - torch.Tensor.view: https://pytorch.org/docs/stable/generated/torch.Tensor.view.html
  - torch.Tensor.contiguous: https://pytorch.org/docs/stable/generated/torch.Tensor.contiguous.html
- NumPy docs (C-order / reshape conventions).
Convert a Shape to a runtime list of dimensions.
Convert a runtime list of dimensions to a Shape.
Converting a Shape to a list and back yields the original shape (round trip).
Converting a list to a Shape and back yields the original list (round trip).
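A minimal sketch of why the list-direction round trip holds, assuming `Shape` is essentially a wrapped list of dimensions (hypothetical definitions, using the primed name `Shape'` to avoid clashing with the real one):

```lean
-- Hypothetical stand-in for the real Shape type.
structure Shape' where
  dims : List Nat

def shapeToList (s : Shape') : List Nat := s.dims
def listToShape (xs : List Nat) : Shape' := ⟨xs⟩

-- The round trip is definitional here, so `rfl` closes the goal:
theorem toList_toShape (xs : List Nat) :
    shapeToList (listToShape xs) = xs := rfl
```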
Flatten a Spec.Tensor of list-shape into a row-major list.
Length of flatten is exactly the product of dimensions.
Unflatten a row-major list into a Spec.Tensor of list-shape.
Unflatten after flatten returns the original tensor (for any valid length proof).
Convert from TensorArray.Tensor to Spec.Tensor.
Convert from Spec.Tensor to TensorArray.Tensor.
Converting to tensor and back preserves the original tensor array.
Converting to tensor array and back preserves the original tensor.
A definable equivalence between the two tensor representations.
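The two round-trip lemmas are exactly the proof obligations needed to package the conversions as an equivalence. A sketch of that packaging (all names here are hypothetical; the real definition, field names, and lemma names live in the file):

```lean
-- Sketch only: `≃` stands for the usual four-field equivalence record
-- (toFun / invFun / left_inv / right_inv), as in Mathlib's `Equiv`.
-- def tensorEquiv (s : Shape) : TensorArray.Tensor ≃ Spec.Tensor s where
--   toFun     := toSpec           -- array view → functional view
--   invFun    := toArray          -- functional view → array view
--   left_inv  := toArray_toSpec   -- "to tensor and back" round trip
--   right_inv := toSpec_toArray   -- "to tensor array and back" round trip
```

Packaging the pair as an equivalence, rather than keeping two loose functions, makes the "no meaning quietly changes" guarantee a single reusable object.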