Native TorchLean FNO1D Burgers
This file is the operator-learning tutorial we want people to read after the basic CNN/MLP
examples. The Python helpers do the two jobs Lean should not own here: download and reshape the
public `burgers_data_R10.mat` file, then plot the prediction CSV. The model, loss, optimizer, and
training loop stay in TorchLean.
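The first helper job, reshaping `burgers_data_R10.mat` into train/test arrays, might look like the
sketch below. The field names `a` and `u` follow the dataset convention cited later in this file;
the function name, grid stride, and row counts are assumptions, not the tutorial's actual script.

```python
# Sketch of the data-prep helper. burgers_data_R10.mat stores initial
# conditions in field "a" and final solutions in field "u", one row per
# sample; stride and row counts here are illustrative defaults.
import numpy as np

def subsample_and_split(a, u, grid_stride=8, n_train=800, n_test=100):
    """Stride the spatial grid, then split rows into train/test blocks."""
    a_s = np.asarray(a, dtype=np.float32)[:, ::grid_stride]
    u_s = np.asarray(u, dtype=np.float32)[:, ::grid_stride]
    train = (a_s[:n_train], u_s[:n_train])
    test = (a_s[n_train:n_train + n_test], u_s[n_train:n_train + n_test])
    return train, test

# Usage against the real file (requires scipy):
#   from scipy.io import loadmat
#   mat = loadmat("burgers_data_R10.mat")
#   (trainX, trainY), (testX, testY) = subsample_and_split(mat["a"], mat["u"])
#   np.savetxt("trainX.csv", trainX, delimiter=",")  # etc.
```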
Why we use the real-split FNO path in this executable:

- `NN.FNO1D.model` is the mathematically clean complex-domain implementation.
- The eager CUDA backend stores float32 buffers, not complex buffers.
- On CUDA this run uses the fused `spectralConv1dRfft` autograd primitive, which represents
  Fourier weights as real/imaginary float32 buffers and executes the real FFT path through cuFFT.
- On CPU it falls back to the dense DFT implementation. That is slower, but it is the useful
  reference path when someone wants to inspect the math without CUDA in the way.
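The two paths compute the same layer. A NumPy sketch of the underlying math (not TorchLean's
actual code): multiply the low Fourier modes of a real signal by complex weights, where the
forward transform is either a real FFT (the cuFFT route) or an explicit DFT matrix (the dense
CPU fallback flavour).

```python
import numpy as np

def spectral_conv1d(x, w, modes, use_fft=True):
    """One FNO-style spectral conv on a real 1D signal.

    x: (batch, n) real; w: (modes,) complex weights (the fused CUDA
    primitive keeps these as separate real/imag float32 buffers).
    """
    n = x.shape[-1]
    if use_fft:
        # real-FFT path, what cuFFT executes on CUDA
        xh = np.fft.rfft(x, axis=-1)
    else:
        # dense-DFT fallback: build the real-input DFT matrix explicitly.
        # Same math as rfft, O(n^2) instead of O(n log n).
        k = np.arange(n // 2 + 1)[:, None]
        t = np.arange(n)[None, :]
        xh = x @ np.exp(-2j * np.pi * k * t / n).T
    yh = np.zeros_like(xh)
    yh[:, :modes] = xh[:, :modes] * w  # multiply, truncating to low modes
    return np.fft.irfft(yh, n=n, axis=-1)
```

Running both branches on the same input makes the "slower but inspectable" claim concrete: the
dense matrix product reproduces the FFT result to floating-point tolerance.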
The training task follows the standard FNO Burgers setup: learn the operator
u₀(x) ↦ u(x,T) on a fixed periodic grid. We keep the default grid and row counts modest because
the first run should answer one question quickly: "is my TorchLean/CUDA path wired correctly?" Once
that works, raise --steps, export more rows, and bump the constants below.
References for the dataset/training convention:

- Li et al., “Fourier Neural Operator for Parametric Partial Differential Equations”, 2020/2021.
- MathWorks’ Burgers FNO example and the public `burgers_data_R10.mat` dataset.
- SciML FNO tutorials using fields `a` for initial conditions and `u` for final solutions.
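The second Python helper job, plotting the prediction CSV, could be as small as the sketch below.
The CSV layout (columns `x`, `u_true`, `u_pred`) is an assumption for illustration, not the
tutorial's actual export schema.

```python
# Sketch of the plot helper; column names are assumed, not the real schema.
import csv

import matplotlib
matplotlib.use("Agg")  # headless backend; works without a display
import matplotlib.pyplot as plt

def plot_prediction_csv(csv_path, png_path):
    xs, truth, pred = [], [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            xs.append(float(row["x"]))
            truth.append(float(row["u_true"]))
            pred.append(float(row["u_pred"]))
    fig, ax = plt.subplots()
    ax.plot(xs, truth, label="u(x, T) reference")
    ax.plot(xs, pred, "--", label="FNO prediction")
    ax.set_xlabel("x")
    ax.legend()
    fig.savefig(png_path)
    plt.close(fig)
```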
`DataFiles` fields:

- trainX : System.FilePath
- trainY : System.FilePath
- testX : System.FilePath
- testY : System.FilePath
- train : TrainConfig
- files : DataFiles
- plotCsv : System.FilePath
- logJson : System.FilePath
- train : Runtime.Autograd.Train.Dataset (API.sample.Supervised α σ τ)
- test : Runtime.Autograd.Train.Dataset (API.sample.Supervised α σ τ)