
How can I disable Flash Attention on V100 GPUs? #3967

@ppsychoo

Description

Hello, I am currently trying to run GRPO reinforcement learning training of Qwen2.5-VL on 8 V100 GPUs, but Flash Attention 2 does not support the V100. How can I configure the parameters or modify the code to disable Flash Attention 2?
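
For reference, a minimal sketch of one way to do this, assuming the training stack loads the model through Hugging Face transformers' `from_pretrained`: the `attn_implementation` argument selects the attention backend, so passing `"sdpa"` (or `"eager"`) avoids `"flash_attention_2"` entirely. The checkpoint name below is only illustrative; substitute whatever model path your training config uses.

```python
import torch
from transformers import Qwen2_5_VLForConditionalGeneration

# Load Qwen2.5-VL without Flash Attention 2.
# "sdpa" uses PyTorch's scaled_dot_product_attention, which runs on V100 (SM 7.0);
# "eager" is the plain PyTorch attention fallback if SDPA also causes problems.
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "Qwen/Qwen2.5-VL-7B-Instruct",   # illustrative checkpoint; use your own path
    attn_implementation="sdpa",       # instead of "flash_attention_2"
    torch_dtype=torch.float16,        # V100 has no bfloat16 support, so use fp16
)
```

If the training framework exposes an attention-implementation or dtype option in its launch arguments or config file, setting the equivalent of `attn_implementation=sdpa` there should have the same effect without code changes.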
