---
license: apache-2.0
---

# Earth-2 Checkpoints: FourCastNet 3

## Description:

FourCastNet 3 advances global weather modeling by implementing a scalable, geometric
machine learning (ML) approach to probabilistic ensemble forecasting. The approach is
designed to respect spherical geometry and to accurately model the spatially
correlated probabilistic nature of the problem, resulting in stable spectra and
realistic dynamics across multiple scales. FourCastNet 3 delivers forecasting accuracy
that surpasses leading conventional ensemble models and rivals the best diffusion-based
methods, while producing forecasts 8 to 60 times faster than these approaches. In
contrast to other ML approaches, FourCastNet 3 demonstrates excellent probabilistic
calibration and retains realistic spectra, even at extended lead times of up to 60 days.
All of these advances are realized using a purely convolutional neural network
architecture specifically tailored for spherical geometry. Scalable and efficient
large-scale training on 1024 GPUs and more is enabled by a novel training paradigm for
combined model- and data-parallelism, inspired by domain decomposition methods in
classical numerical models. Additionally, FourCastNet 3 enables rapid inference on a
single GPU, producing a 60-day global forecast at 0.25°, 6-hourly resolution in under
4 minutes. Its computational efficiency, medium-range probabilistic skill, spectral
fidelity, and rollout stability at subseasonal timescales make it a strong candidate
for improving meteorological forecasting and early warning systems through large
ensemble predictions.

![FCN3 15 member ensemble](https://raw.githubusercontent.com/NVIDIA/makani/main/images/fcn3_ens15.gif)

This model is ready for commercial/non-commercial use.

### License/Terms of Use:

[Apache 2.0 license](https://www.apache.org/licenses/LICENSE-2.0)

### Deployment Geography:

Global

### Use Case:

Industry, academic, and government research teams interested in medium-range and
subseasonal-to-seasonal weather forecasting, and climate modeling.

### Release Date:

NGC 07/18/2025

## Reference:

**Papers**:

- [FourCastNet 3: A geometric approach to probabilistic machine-learning weather forecasting at scale](https://arxiv.org/abs/2507.12144v2)
- [Neural Operators with Localized Integral and Differential Kernels](https://arxiv.org/abs/2402.16845)
- [Huge Ensembles Part I: Design of Ensemble Weather Forecasts using Spherical Fourier Neural Operators](https://arxiv.org/abs/2408.03100)
- [Huge Ensembles Part II: Properties of a Huge Ensemble of Hindcasts Generated with Spherical Fourier Neural Operators](https://arxiv.org/abs/2408.01581)
- [Spherical Fourier Neural Operators: Learning Stable Dynamics on the Sphere](https://arxiv.org/abs/2306.03838)

**Code**:

- [Makani](https://github.com/NVIDIA/makani)
- [PhysicsNeMo](https://github.com/NVIDIA/physicsnemo)
- [Earth2Studio](https://github.com/NVIDIA/earth2studio)
- [torch-harmonics](https://github.com/NVIDIA/torch-harmonics)

## Model Architecture:

**Architecture Type:** Spherical Neural Operator. A fully convolutional architecture
based on group convolutions defined on the sphere, leveraging both local and global
convolutions. For details regarding the architecture, refer to the
[FourCastNet 3 paper](https://arxiv.org/abs/2507.12144v1). <br>

**Network Architecture:** N/A <br>

**Number of model parameters:** 710,867,670

**Model datatype:** We recommend running the model in AMP with bf16; the inputs and
outputs are typically float32.

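As a minimal sketch of the datatype recommendation above: the `model` object below is only a runnable stand-in, since the actual checkpoint would be loaded through Makani or Earth2Studio (the loading call is not shown here).

```python
import torch

# Stand-in for a loaded FourCastNet 3 module (hypothetical); in practice the
# checkpoint is instantiated via Makani or Earth2Studio and moved to the GPU.
model = torch.nn.Identity().eval()

device = "cuda" if torch.cuda.is_available() else "cpu"

# float32 input with the 6D layout described in the Input section below:
# (batch, time, lead time, variable, latitude, longitude)
x = torch.zeros(1, 1, 1, 72, 721, 1440, dtype=torch.float32, device=device)

# Run the forward pass under automatic mixed precision with bf16, as recommended.
with torch.inference_mode(), torch.autocast(device_type=device, dtype=torch.bfloat16):
    y = model(x)

y = y.float()  # outputs are typically consumed as float32
```
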
## Input:

**Input Type:**

- Tensor (72 surface and pressure-level variables)

**Input Format:** PyTorch Tensor <br>
**Input Parameters:**

- Six Dimensional (6D) (batch, time, lead time, variable, latitude, longitude) <br>

**Other Properties Related to Input:**

- Input equirectangular latitude/longitude grid: 0.25 degree, 721 x 1440
- Input state weather variables: `u10m`, `v10m`, `u100m`, `v100m`, `t2m`, `msl`,
  `tcwv`, `u50`, `u100`, `u150`, `u200`, `u250`, `u300`, `u400`, `u500`, `u600`, `u700`,
  `u850`, `u925`, `u1000`, `v50`, `v100`, `v150`, `v200`, `v250`, `v300`, `v400`, `v500`,
  `v600`, `v700`, `v850`, `v925`, `v1000`, `z50`, `z100`, `z150`, `z200`, `z250`, `z300`,
  `z400`, `z500`, `z600`, `z700`, `z850`, `z925`, `z1000`, `t50`, `t100`, `t150`, `t200`,
  `t250`, `t300`, `t400`, `t500`, `t600`, `t700`, `t850`, `t925`, `t1000`, `q50`, `q100`,
  `q150`, `q200`, `q250`, `q300`, `q400`, `q500`, `q600`, `q700`, `q850`, `q925`, `q1000`
- Time: datetime64

For variable name information, review the Lexicon at [Earth2Studio](https://github.com/NVIDIA/earth2studio). A sketch of the channel layout follows below.

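The following is a minimal sketch (not an official API) of how the 72 input channels and the 6D input tensor above can be assembled; the channel ordering mirrors the list above, and anything beyond that list (the example date, the variable names in code) is illustrative only.

```python
import numpy as np
import torch

# 7 surface/integrated variables followed by 5 atmospheric fields on 13 pressure levels
surface_vars = ["u10m", "v10m", "u100m", "v100m", "t2m", "msl", "tcwv"]
atmos_vars = ["u", "v", "z", "t", "q"]
pressure_levels = [50, 100, 150, 200, 250, 300, 400, 500, 600, 700, 850, 925, 1000]

channels = surface_vars + [f"{v}{p}" for v in atmos_vars for p in pressure_levels]
assert len(channels) == 72  # 7 + 5 * 13 = 72 channels

# (batch, time, lead time, variable, latitude, longitude) on the 0.25 degree grid
x = torch.zeros(1, 1, 1, len(channels), 721, 1440, dtype=torch.float32)

# Valid time of the initial condition, expressed as datetime64
time = np.array(["2020-01-01T00:00"], dtype="datetime64[ns]")
```
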
## Output:

**Output Type:** Tensor (72 surface and pressure-level variables) <br>
**Output Format:** PyTorch Tensor <br>
**Output Parameters:** Six Dimensional (6D) (batch, time, lead time, variable,
latitude, longitude) <br>
**Other Properties Related to Output:**

- Output latitude/longitude grid: 0.25 degree, 721 x 1440, same as input.
- Output state weather variables: same as above.

Our AI models are designed and/or optimized to run on NVIDIA GPU-accelerated systems.
By leveraging NVIDIA’s hardware (e.g., GPU cores) and software frameworks (e.g., CUDA
libraries), the model achieves faster training and inference times compared to
CPU-only solutions. A minimal forecast-rollout sketch is shown below.

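The sketch below illustrates, under stated assumptions, how a 60-day forecast can be rolled out autoregressively: each call advances the state by 6 hours and the output is fed back as the next input, so a 60-day forecast corresponds to 240 steps. The `model` object is again a runnable placeholder, not the actual checkpoint loader.

```python
import torch

# Placeholder for a loaded FourCastNet 3 module (hypothetical); in practice the
# checkpoint would be loaded via Makani or Earth2Studio.
model = torch.nn.Identity().eval()

# Initial condition with the (batch, time, lead time, variable, lat, lon) layout above.
x = torch.zeros(1, 1, 1, 72, 721, 1440, dtype=torch.float32)

with torch.inference_mode():
    for step in range(240):  # 240 steps x 6 h = 60 days
        x = model(x)         # the 6-hourly prediction becomes the next input
        # in practice, each step would be written out or post-processed here
```
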
## Software Integration:

**Runtime Engine:** PyTorch <br>
**Supported Hardware Microarchitecture Compatibility:** <br>

- NVIDIA Ampere <br>
- NVIDIA Hopper <br>
- NVIDIA Turing <br>

**Supported Operating System:**

- Linux <br>

## Model Version:

**Model Version:** v1 <br>

## Training, Testing, and Evaluation Datasets:

**Total size (in number of data points):** 110,960 <br>
**Total number of datasets:** 1 <br>
**Dataset partition:** training 95%, testing 2.5%, validation 2.5% <br>

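For concreteness, the worked arithmetic for the split sizes implied by the partition above, assuming the percentages apply directly to the 110,960 data points:

```python
total = 110_960
train = round(0.95 * total)   # 105,412 samples
test = round(0.025 * total)   # 2,774 samples
val = round(0.025 * total)    # 2,774 samples
assert train + test + val == total
```
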
## Training Dataset:

**Link:** [ERA5](https://cds.climate.copernicus.eu/) <br>

**Data Collection Method by dataset** <br>

- Automatic/Sensors <br>

**Labeling Method by dataset** <br>

- Automatic/Sensors <br>

**Properties:**
ERA5 data for the period 1980-2015. ERA5 provides hourly estimates of various
atmospheric, land, and oceanic climate variables. The data covers the Earth on a 30 km
grid and resolves the atmosphere at 137 levels. <br>

## Testing Dataset:

**Link:** [ERA5](https://cds.climate.copernicus.eu/) <br>

**Data Collection Method by dataset** <br>

- Automatic/Sensors <br>

**Labeling Method by dataset** <br>

- Automatic/Sensors <br>

**Properties:**
ERA5 data for the period 2016-2017. ERA5 provides hourly estimates of various
atmospheric, land, and oceanic climate variables. The data covers the Earth on a 30 km
grid and resolves the atmosphere at 137 levels. <br>

## Evaluation Dataset:

**Link:** [ERA5](https://cds.climate.copernicus.eu/) <br>

**Data Collection Method by dataset** <br>

- Automatic/Sensors <br>

**Labeling Method by dataset** <br>

- Automatic/Sensors <br>

**Properties:**
ERA5 data for the period 2018-2019. ERA5 provides hourly estimates of various
atmospheric, land, and oceanic climate variables. The data covers the Earth on a 30 km
grid and resolves the atmosphere at 137 levels. <br>

## Inference:

**Acceleration Engine:** PyTorch <br>
**Test Hardware:**

- A100 <br>
- H100 <br>
- L40S <br>

## Ethical Considerations:

NVIDIA believes Trustworthy AI is a shared responsibility and we have established
policies and practices to enable development for a wide array of AI applications.
When downloaded or used in accordance with our terms of service, developers should
work with their internal model team to ensure this model meets requirements for the
relevant industry and use case and addresses unforeseen product misuse.

For more detailed information on ethical considerations for this model, please see the
Model Card++ Explainability, Bias, Safety & Security, and Privacy Subcards.

Please report model quality, risk, security vulnerabilities or NVIDIA AI Concerns [here](https://www.nvidia.com/en-us/support/submit-security-vulnerability/).