TorchLean API

NN.Examples.Data.Loaders.Cifar10Images

CIFAR10-style image loader tutorial (NPY, offline)

This tutorial mirrors a classic PyTorch recipe:

  1. Load a labeled image dataset from disk (.npy exported from NumPy/PyTorch).
  2. Split into train/test.
  3. Build a small CNN by explicitly stacking layers.
  4. Train for multiple epochs over shuffled minibatches and report loss.
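Steps 1 and 2 above can be sketched in plain NumPy terms. This is a minimal illustration, not the TorchLean implementation; the file paths, array shapes, and split fraction are assumptions:

```python
import numpy as np

# Hypothetical sketch of "load from .npy, then split into train/test".
# Paths and shapes are placeholders; the real dataset comes from
# NN/Examples/Data/generate_toy_data.py.
def load_and_split(x_path, y_path, train_frac=0.8, seed=0):
    x = np.load(x_path)                      # images, e.g. shape (N, 3, 32, 32)
    y = np.load(y_path)                      # integer labels, shape (N,)
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(x))          # shuffle once before splitting
    n_train = int(train_frac * len(x))
    train_idx, test_idx = order[:n_train], order[n_train:]
    return (x[train_idx], y[train_idx]), (x[test_idx], y[test_idx])
```

Shuffling before the split keeps both partitions label-balanced on average, which matters for the small datasets this tutorial targets.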

To keep this runnable without network downloads, first generate a small deterministic "CIFAR10-shaped" dataset locally:

python3 NN/Examples/Data/generate_toy_data.py
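For intuition, a deterministic "CIFAR10-shaped" generator can be sketched as below. The shapes, seed, and output file names here are assumptions for illustration, not the actual contents of NN/Examples/Data/generate_toy_data.py:

```python
import numpy as np

# Hedged sketch: synthesize a CIFAR10-shaped dataset deterministically.
# File names and sizes are hypothetical.
def generate_toy_cifar(n_total=256, seed=42, out_x="toy_x.npy", out_y="toy_y.npy"):
    rng = np.random.default_rng(seed)        # fixed seed => reproducible data
    x = rng.integers(0, 256, size=(n_total, 3, 32, 32), dtype=np.uint8)  # RGB 32x32
    y = rng.integers(0, 10, size=(n_total,), dtype=np.int64)             # 10 classes
    np.save(out_x, x)
    np.save(out_y, y)
    return x, y
```

Because the seed is fixed, every run produces byte-identical .npy files, which is what makes the tutorial reproducible offline.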

Build:

For command-line CIFAR training, use torchlean cnn, torchlean resnet, or torchlean vit with --x, --y, and --n-total.

Optional flags (tutorial-specific):

Why this tutorial matters:


Small CNN (no BatchNorm): Conv -> ReLU -> Pool -> Conv -> ReLU -> Pool -> Linear(10).

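The stack above determines how the 32x32 spatial size shrinks before the final Linear(10). A quick sanity check of the feature-map arithmetic, assuming 3x3 convolutions with padding 1 and 2x2 max pooling (typical choices, not confirmed by this page):

```python
def conv2d_out(size, kernel=3, stride=1, pad=1):
    # Standard convolution output-size formula.
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    # Non-overlapping pooling halves the spatial size.
    return (size - kernel) // stride + 1

# 32 -> conv -> 32 -> pool -> 16 -> conv -> 16 -> pool -> 8
s = 32
for _ in range(2):
    s = pool_out(conv2d_out(s))
# With C channels out of the last conv, Linear(10) sees C * s * s flattened inputs.
```

Under these assumptions each Conv layer preserves the spatial size and each Pool halves it, so two Conv/Pool rounds take 32x32 down to 8x8.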

Load the offline CIFAR10-like .npy dataset at the runtime-selected scalar type α.

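Selecting the scalar type α at runtime corresponds, in NumPy terms, to casting the stored arrays to a requested dtype on load. A minimal sketch under that analogy (the function name and the integer-label convention are assumptions):

```python
import numpy as np

# Hypothetical sketch: load the .npy pair, casting images to a
# runtime-selected scalar type while keeping labels integral.
def load_dataset(x_path, y_path, dtype="float32"):
    x = np.load(x_path).astype(np.dtype(dtype))  # e.g. "float32" or "float64"
    y = np.load(y_path).astype(np.int64)         # class indices stay integers
    return x, y
```

Keeping the cast at the load boundary means the rest of the pipeline is written once against the chosen scalar type, which mirrors how the Lean loader is parameterized over α.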