Attention body support for blas / eigen / mkl #1852

Merged: 49 commits into LeelaChessZero:master on Mar 29, 2023

Conversation

almaudoh (Contributor)

Implementation of attention body for the blas / eigen / mkl backends.
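
A minimal orientation sketch, since the thread itself never inlines the math: hypothetical single-head scaled dot-product attention over the 64 board squares, written with Eigen because that is one of the target backends. The function name, shapes, and single-head simplification are assumptions for illustration only; the PR's actual attention body is multi-head (the commits below mention mha) and fused into a larger encoder.

```cpp
#include <Eigen/Dense>
#include <cmath>

// Hypothetical sketch, not the PR's code: single-head scaled dot-product
// attention over the 64 board squares.
Eigen::MatrixXf Attention(const Eigen::MatrixXf& q,   // 64 x d_k
                          const Eigen::MatrixXf& k,   // 64 x d_k
                          const Eigen::MatrixXf& v) { // 64 x d_v
  const float scale = 1.0f / std::sqrt(static_cast<float>(q.cols()));
  Eigen::MatrixXf logits = (q * k.transpose()) * scale;  // 64 x 64
  // Numerically stable row-wise softmax: subtract each row's max so
  // exp() cannot overflow.
  for (int i = 0; i < logits.rows(); ++i) {
    auto row = logits.row(i).array();
    logits.row(i) = (row - row.maxCoeff()).exp().matrix();
    logits.row(i) /= logits.row(i).sum();
  }
  return logits * v;  // 64 x d_v: attention-weighted mix of the values
}
```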

borg323 and others added 30 commits February 20, 2023 17:44
* add persistent L2 cache opt

  - goal is to fit a residual block's activations in the L2 cache
  - around 6.7% improvement in T80 networks
  (a hedged sketch of this mechanism follows the commit list)

* add checks for cuda version

  - prevents compile errors when building with an old CUDA toolkit
  (the version guard appears in the sketch after the commit list)

* fix typo with cudart_version
* Fix softmax function in cuda.

* Add fix for negative zero.
  (a stable-softmax sketch follows the commit list)

* Code style fixes.

Co-authored-by: Alma <[email protected]>
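
As a combined illustration of the first two commits above: CUDA 11.0 added an access-policy window that lets a stream mark one buffer's accesses as persisting in the L2 cache, and a CUDART_VERSION guard keeps the file compiling on older toolkits that lack the API. This is a hedged sketch under those assumptions; the function name and the choice to pin the activation buffer are inferred from the commit message, not taken from lc0's code.

```cpp
#include <cuda_runtime.h>

#if CUDART_VERSION >= 11000  // the access-policy API appeared in CUDA 11.0
// Hypothetical helper (the name is illustrative): ask the driver to keep the
// activation buffer's cache lines resident for kernels launched on `stream`.
void PinActivationsInL2(cudaStream_t stream, void* activations, size_t bytes) {
  // Carve out part of L2 for persisting accesses (device-wide setting).
  cudaDeviceSetLimit(cudaLimitPersistingL2CacheSize, bytes);

  cudaStreamAttrValue attr = {};
  attr.accessPolicyWindow.base_ptr = activations;
  attr.accessPolicyWindow.num_bytes = bytes;
  attr.accessPolicyWindow.hitRatio = 1.0f;  // try to keep all of it resident
  attr.accessPolicyWindow.hitProp = cudaAccessPropertyPersisting;
  attr.accessPolicyWindow.missProp = cudaAccessPropertyStreaming;
  cudaStreamSetAttribute(stream, cudaStreamAttributeAccessPolicyWindow, &attr);
}
#endif
```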
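
The softmax and negative-zero fixes live inside a CUDA kernel that is not shown in this thread. As a stand-in only, here is the standard max-subtraction pattern for a numerically stable softmax in plain C++; it does not reproduce the kernel's actual negative-zero fix.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Illustrative stable softmax (not the PR's kernel). Subtracting the max
// keeps every exp() argument <= 0, so no term can overflow; for finite
// inputs x - x is exactly +0.0f, so the largest term is exp(0.0f) == 1.0f.
void Softmax(std::vector<float>& x) {
  if (x.empty()) return;
  const float max = *std::max_element(x.begin(), x.end());
  float sum = 0.0f;
  for (float& v : x) {
    v = std::exp(v - max);
    sum += v;
  }
  for (float& v : x) v /= sum;
}
```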
@borg323 borg323 requested a review from Tilps February 21, 2023 12:56
Review thread on src/neural/loader.cc (outdated, resolved).

@mooskagh (Member) left a comment:


Looks good to me (although I didn't check for issues from NN perspective)

@borg323 merged commit 791cce4 into LeelaChessZero:master on Mar 29, 2023
PikaCat-OuO pushed a commit to official-pikafish/px0 that referenced this pull request on Apr 14, 2023:
* remove mha transpose

* refactor blas encoder

* Attention body support for blas.

* Replace 64 with kSquares

* vectorize activation functions
  (see the sketch after this commit message)

---
Co-authored-by: borg323 <[email protected]>
Co-authored-by: Alma <[email protected]>

(cherry picked from commit 791cce4)
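
On "vectorize activation functions": the idea is presumably to express the activation as a single array expression so Eigen emits SIMD code rather than a scalar loop. A hypothetical sketch on the eigen backend; mish is chosen only because it is one of the activations lc0 networks use, and the function name is made up.

```cpp
#include <Eigen/Dense>

// Hypothetical sketch, not lc0's code: mish(x) = x * tanh(ln(1 + e^x)),
// written as one fused Eigen array expression. (For very large x, exp()
// overflows to inf, but tanh(inf) == 1, so the result degrades to x.)
void MishInPlace(Eigen::Ref<Eigen::ArrayXf> x) {
  x *= ((1.0f + x.exp()).log()).tanh();
}
```

Because Eigen uses expression templates, the whole right-hand side is fused into a single vectorized pass over the buffer, with no per-operation temporaries.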
7 participants