transformers
python-box
einops
omegaconf
pytorch_lightning
lightning
addict
timm
fast-simplification
bpy>=3.6.0
trimesh
open3d
pyrender
huggingface_hub
numpy==1.26.4
scipy
matplotlib
plotly
pyyaml
# Prebuilt FlashAttention wheel (per the filename: flash_attn 2.7.4.post1, CUDA 12.6, PyTorch 2.7, CPython 3.10, Linux x86_64)
https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl