
[common] merge ChatGLM2Attention::forward into Attention::forward #86

Merged

2 commits merged on Dec 1, 2023

Commits on Nov 27, 2023

  1. merge ChatGLM2Attention::forward into Attention::forward

    Add an epsilon param to LayerNorm to align its interface with RmsNorm (LayerNorm does not use this param).

    Extend qk_shape from 4 to 5 elements in attention.h so that key_head_num can be passed into rotary_embedding_chatglm2 for multi-query attention (see the sketch after this commit entry).
    a3213105 committed Nov 27, 2023
    Commit aee86ac
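For illustration only, here is a minimal C++ sketch of what the two interface changes described in the commit might look like. The signatures and struct layouts are hypothetical; only the names LayerNorm, RmsNorm, qk_shape, key_head_num, and rotary_embedding_chatglm2 come from the commit message, and the actual repository code will differ.

```cpp
// Sketch (not the repository code) of the two interface changes:
//  1. LayerNorm::forward gains an epsilon parameter so it matches
//     RmsNorm::forward's signature, but does not use it.
//  2. qk_shape grows from 4 to 5 entries; the extra slot carries
//     key_head_num for multi-query attention (fewer KV heads than Q heads).
#include <cstdio>
#include <vector>

struct RmsNorm {
    // RmsNorm genuinely uses epsilon in its normalization.
    void forward(float *x, int n, float epsilon) { (void)x; (void)n; (void)epsilon; }
};

struct LayerNorm {
    // epsilon accepted only to align the call signature with RmsNorm; unused.
    void forward(float *x, int n, float /*epsilon*/) { (void)x; (void)n; }
};

// Hypothetical signature: the 5-element shape passes key_head_num through
// so the rotary embedding can handle multi-query attention.
void rotary_embedding_chatglm2(float *query, float *key, const int qk_shape[5]) {
    int batch        = qk_shape[0];
    int seq_len      = qk_shape[1];
    int head_num     = qk_shape[2];
    int head_dim     = qk_shape[3];
    int key_head_num = qk_shape[4]; // new fifth element: KV head count
    std::printf("rope: batch=%d seq=%d q_heads=%d dim=%d kv_heads=%d\n",
                batch, seq_len, head_num, head_dim, key_head_num);
    (void)query; (void)key;
}

int main() {
    // Example: 32 query heads but only 2 key/value heads (multi-query attention).
    std::vector<float> q(2 * 8 * 32 * 64), k(2 * 8 * 2 * 64);
    const int qk_shape[5] = {2, 8, 32, 64, /*key_head_num=*/2};
    rotary_embedding_chatglm2(q.data(), k.data(), qk_shape);
}
```

With both norms exposing the same forward signature, a single Attention::forward can call either one without ChatGLM2-specific branching, which is the point of folding ChatGLM2Attention::forward into Attention::forward.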

Commits on Nov 29, 2023

  1. Commit 0e508f6