---
license: mit
language:
- en
tags:
- physics
- PDEs
- surrogate
- heat-equation
- diffusion
base_model: thuerey-group/pde-transformer
---

# PDE-Transformer mc-s – 2D Diffusion (Heat Equation) Fine-Tuned

This repository provides a **fine-tuned version of PDE-Transformer (mc-s)** specialized for 2D heat diffusion on regular grids. The base model is the mixed-channel small variant from `thuerey-group/pde-transformer`, further trained on synthetic solutions of the 2D diffusion / heat equation with Gaussian bump initial conditions.

The goal of this model is to act as a **surrogate solver** for short-time predictions of the heat equation, given a pair of previous states.

> **Input:** 2-channel field `[u(t0), u(t1)]`
> **Output:** next-step prediction `u(t2)` (via channel index 1 in the model output)
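
This I/O contract can be sketched in plain NumPy. The `model` below is an identity stand-in for the fine-tuned network (not the real API), used only to show the expected array shapes and the channel-1 convention:

```python
import numpy as np

H = W = 64

# Two consecutive solver states (random stand-ins for u(t0), u(t1)).
u_t0 = np.random.rand(H, W).astype(np.float32)
u_t1 = np.random.rand(H, W).astype(np.float32)

# The model consumes a batch of 2-channel fields: shape (batch, 2, H, W).
x = np.stack([u_t0, u_t1])[None]   # (1, 2, H, W)

# Placeholder for the fine-tuned network: a real forward pass returns a
# same-layout field whose channel index 1 holds the prediction u(t2).
model = lambda inp: inp            # identity stand-in, preserves shape
u_t2 = model(x)[0, 1]              # (H, W) next-step prediction
```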

---

## 🌐 Project Links

- **Fine-tuning scripts & experiments**: [https://github.com/psmteja/agentic_ai_PDE_fm](https://github.com/psmteja/agentic_ai_PDE_fm)

---

## 📝 Model Description

PDE-Transformer is a transformer-based foundation model for physics simulations on regular grids. It combines architectural ideas from diffusion transformers with design choices tailored to large-scale physical simulations.

This checkpoint starts from the **mixed-channel small (mc-s)** variant and is **fine-tuned only on 2D diffusion**:

- Equation:
  \[
  \partial_t u = \nu (u_{xx} + u_{yy})
  \]
- Domain: \([-1, 1]^2\) discretized on a regular grid (e.g. \(64 \times 64\))
- Boundary conditions: periodic
- Initial condition: random 2D Gaussian bumps (random center, width, amplitude)
- Training target: finite-difference solution `u(t2)` given `[u(t0), u(t1)]`
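
The actual data-generation code lives in the linked repository; as an illustration of the setup described above (step sizes, viscosity, and parameter ranges here are assumptions, not the training values), a minimal NumPy sketch might look like:

```python
import numpy as np

def gaussian_bump(n=64, rng=np.random.default_rng(0)):
    """Random Gaussian bump on [-1, 1]^2 (random center, width, amplitude)."""
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x, indexing="ij")
    cx, cy = rng.uniform(-0.5, 0.5, size=2)
    sigma = rng.uniform(0.1, 0.3)
    amp = rng.uniform(0.5, 1.5)
    return amp * np.exp(-((X - cx) ** 2 + (Y - cy) ** 2) / (2 * sigma ** 2))

def heat_step(u, nu=0.1, dt=1e-4, dx=2.0 / 64):
    """One explicit (FTCS) finite-difference step with periodic boundaries."""
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / dx ** 2
    return u + nu * dt * lap

u_t0 = gaussian_bump()
u_t1 = heat_step(u_t0)
u_t2 = heat_step(u_t1)   # training target for the input pair [u(t0), u(t1)]
```

The explicit step is stable here because \(\nu \, \Delta t / \Delta x^2 \approx 0.01 \le 0.25\); with periodic boundaries the scheme also conserves the mean of `u` exactly.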

### What this model is good for

- Fast surrogate for **2D heat equation rollouts** over short time horizons.
- Experiments in:
  - surrogate modeling,
  - model-based control for diffusion-like processes,
  - benchmarking PDE foundation models on simple physics.
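
A short-horizon rollout can be sketched by sliding a two-state window over the model's own predictions. `dummy` below stands in for the real network; the channel-1 output convention is an assumption carried over from the I/O note above:

```python
import numpy as np

def rollout(model, u0, u1, n_steps):
    """Autoregressive rollout: repeatedly feed the last two states back in.

    `model` maps a (1, 2, H, W) array to a same-shaped output whose
    channel 1 is the next-step prediction (placeholder contract).
    """
    states = [u0, u1]
    for _ in range(n_steps):
        x = np.stack(states[-2:])[None]   # (1, 2, H, W) sliding window
        states.append(model(x)[0, 1])     # append predicted next state
    return np.stack(states)               # (n_steps + 2, H, W) trajectory

# Stand-in model: simply carries the latest state forward unchanged.
dummy = lambda x: x
traj = rollout(dummy, np.zeros((64, 64)), np.ones((64, 64)), n_steps=3)
```

Because each prediction is fed back in, errors compound with horizon length, which is why the model is advertised only for short-time rollouts.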

### What this model is *not* guaranteed to handle

- Arbitrary PDEs outside diffusion (e.g. Navier–Stokes, Burgers, reaction–diffusion) → use the original foundation model or fine-tune separately.
- Resolutions or domain geometries very different from those used during training, unless you explicitly adapt or re-fine-tune.

---

## 📦 Installation

Install the PDE-Transformer package and its dependencies:

```bash
pip install pdetransformer torch numpy matplotlib
```