
How can I enable FlashAttention-2's efficient attention when fine-tuning on top of this project? #26

Closed
3 tasks done
RethinkFun opened this issue Jul 31, 2023 · 3 comments

@RethinkFun

Checklist before submitting

  • Make sure you are using the latest code from the repository (git pull); some issues have already been resolved and fixed.
  • I have read the FAQ section of the project documentation and searched the existing issues, and found no similar problem or solution.
  • For third-party plugin issues (e.g. llama.cpp, text-generation-webui), it is also recommended to look for solutions in the corresponding projects.

Issue type

Other issue

Base model

Alpaca-2-7B

Operating system

Linux

Describe the issue in detail

# Paste the code you ran here (delete this code block if not applicable)

Dependencies (required for code-related issues)

# Paste your dependency list here

Execution logs or screenshots

# Paste your run logs here
@airaria
Contributor

airaria commented Jul 31, 2023

The flash-attention-related training code is still being organized. In the meantime, you can refer to the flash-attention code used in FastChat:
https://github.com/lm-sys/FastChat/blob/main/fastchat/train/llama_flash_attn_monkey_patch.py
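As a rough illustration of how that monkey patch is typically applied, here is a minimal sketch (not this project's official training code). It assumes FastChat is installed or its `llama_flash_attn_monkey_patch.py` is on the Python path and exposes a `replace_llama_attn_with_flash_attn()` function, as in the linked file; the model path below is a placeholder. The key point is that the patch must run before the model is instantiated.

```python
# Minimal sketch; assumes FastChat's llama_flash_attn_monkey_patch module is
# importable and provides replace_llama_attn_with_flash_attn() as in the
# linked file. Verify against the file before use.
import torch
from transformers import LlamaForCausalLM
from fastchat.train.llama_flash_attn_monkey_patch import (
    replace_llama_attn_with_flash_attn,
)

# Apply the patch BEFORE creating the model: it swaps LlamaAttention.forward
# at the class level, so models built afterwards use the flash-attention path.
replace_llama_attn_with_flash_attn()

model = LlamaForCausalLM.from_pretrained(
    "path/to/chinese-alpaca-2-7b",  # placeholder path to the base model
    torch_dtype=torch.bfloat16,     # flash-attention kernels require fp16/bf16
)
# ... continue with the usual fine-tuning setup (LoRA/PEFT, Trainer, etc.)
```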

@RethinkFun
Author

Thanks for the reply.

@github-actions

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

@github-actions github-actions bot added the stale label Aug 10, 2023
@ymcui ymcui closed this as completed Aug 10, 2023