layer_norm 无法安装 #154
Unanswered · running-frog asked this question in Q&A · Replies: 1 comment
There is a recent issue in the Flash Attention repo about exactly this. It looks like the kernels were only tested on the A100, but an RTX 3080 should also work in principle, since both are the same (Ampere) architecture. In terms of Compute Capability, the RTX 3080 is 8.6 and the A100 is 8.0; that difference may affect how efficiently the kernels actually run. My suggestion: first measure the speed without Flash Attention installed, and if that meets your needs, it is not required. The README also includes measured throughput with and without Flash Attention (on a different GPU model, admittedly), which you can use as a reference.
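One way to see why an A100-tested build should still run on an RTX 3080 is CUDA's binary-compatibility rule: a cubin compiled for one Compute Capability runs on any device with the same major version and an equal or higher minor version. A minimal sketch of that rule (my own illustration, not code from the thread):

```python
# CUDA binary compatibility within one major architecture: a kernel binary
# built for sm_80 (A100, CC 8.0) also runs on sm_86 (RTX 3080, CC 8.6),
# because the device's minor version is >= the build target's. The reverse
# does not hold, and neither does crossing major versions.

def cubin_compatible(built_for: tuple, device: tuple) -> bool:
    """Return True if a cubin built for `built_for` can run on `device`.

    Both arguments are (major, minor) Compute Capability tuples.
    """
    return built_for[0] == device[0] and built_for[1] <= device[1]

A100 = (8, 0)      # Compute Capability 8.0
RTX_3080 = (8, 6)  # Compute Capability 8.6

print(cubin_compatible(A100, RTX_3080))  # True: built for 8.0, runs on 8.6
print(cubin_compatible(RTX_3080, A100))  # False: 8.6 binary cannot run on 8.0
```

Note this only says the kernel will *run*; as mentioned above, it may still be less efficient than on the card it was tuned for.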
-
Warning: import flash_attn rms_norm fail, please install FlashAttention layer_norm to get higher efficiency https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm
I read the docs, and they say it is only adapted for the A100. What should I do on an RTX 3080?
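The warning above is produced by an optional-import check: the model code tries to import the fused `rms_norm` kernel from `flash_attn` and falls back to a plain implementation if the layer_norm extension is missing. A hedged sketch of that pattern (my own illustration; the exact fallback in the model code may differ):

```python
# Optional import of the fused CUDA rms_norm kernel from flash-attention.
# If the csrc/layer_norm extension is not installed, this ImportError is
# what triggers the "import flash_attn rms_norm fail" warning, and the
# model simply runs with a slower, pure-PyTorch norm instead.
try:
    from flash_attn.ops.rms_norm import rms_norm  # fused kernel
    HAS_FUSED_RMS_NORM = True
except ImportError:
    rms_norm = None  # caller falls back to an unfused implementation
    HAS_FUSED_RMS_NORM = False

print(HAS_FUSED_RMS_NORM)
```

So the warning is advisory: without the extension everything still works, just without the fused-kernel speedup.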