Gemma2 and flash-attention #32188

Merged 5 commits on Jul 31, 2024
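
For context, this PR concerns running Gemma2 with the flash-attention backend in transformers. Below is a minimal sketch of how that backend is typically requested; the checkpoint name, dtype, and generation settings are illustrative assumptions, and the flash-attn package must be installed for this path to be used.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2-9b-it"  # assumed checkpoint; any Gemma2 checkpoint should work

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,               # flash-attention requires fp16 or bf16
    attn_implementation="flash_attention_2",  # request the flash-attention backend
    device_map="auto",
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```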

Commits on Jul 24, 2024

  1. fc97eab
  2. cad10d1 "this works, not the prev" (zucchini-nlp)
  3. b4ca3ca

Commits on Jul 29, 2024

  1. 9c0f447

Commits on Jul 31, 2024

  1. dc9266f "not needed anymore" (zucchini-nlp)