transformers
python-box
einops
omegaconf
pytorch_lightning
lightning
addict
timm
fast-simplification
bpy>=3.6.0
trimesh
open3d
pyrender
huggingface_hub
numpy==1.26.4
scipy
matplotlib
plotly
pyyaml
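# Prebuilt flash-attention wheel; per the wheel filename below, this build appears to
# target CUDA 12.6, PyTorch 2.7, and CPython 3.10 on Linux x86_64, so it will likely
# fail to install or import in other environments.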
https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl