
[API] Add LLaMA attention API. #378

Merged
merged 9 commits into intel:main on May 11, 2024

Conversation

changqi1 (Contributor) commented May 9, 2024

$ ./ut/layers_attention_test
[==========] Running 2 tests from 1 test case.
[----------] Global test environment set-up.
[----------] 2 tests from AttentionLLaMA
[ RUN      ] AttentionLLaMA.bfloat16_t
>> create context: 4096 128
>> create llama_attention_key: 0x7f8fe6804040_0x7f8fe6808040_0x7f8fe6808040_0x7f8fe2803040_1_128_32_32
[ RUNTIME  ] XFT::invokeAttentionLLaMA 0.153258 sec
[ RUNTIME  ] XFT::invokeAttentionLLaMA 0.004665 sec
[ RUNTIME  ] XFT::invokeAttentionLLaMA 0.001864 sec
[       OK ] AttentionLLaMA.bfloat16_t (755 ms)
[ RUN      ] AttentionLLaMA.float16_t
>> create context: 4096 128
>> create llama_attention_key: 0x7f8fe6804040_0x7f8fe6808040_0x7f8fe6808040_0x7f8fe2803040_2_128_32_32
[ RUNTIME  ] XFT::invokeAttentionLLaMA 0.119574 sec
[ RUNTIME  ] XFT::invokeAttentionLLaMA 0.046373 sec
[ RUNTIME  ] XFT::invokeAttentionLLaMA 0.039210 sec
[       OK ] AttentionLLaMA.float16_t (1601 ms)
[----------] 2 tests from AttentionLLaMA (2356 ms total)

[----------] Global test environment tear-down
[==========] 2 tests from 1 test case ran. (2357 ms total)
[  PASSED  ] 2 tests.

@Duyi-Wang Duyi-Wang added the interface related to interface label May 10, 2024
@changqi1 changqi1 changed the title [API] Add LLaMA attention and layer API. [API] Add LLaMA attention API. May 10, 2024
Resolved review threads:
include/layers_attention.h
src/layers/attention.cpp
src/models/model_factory.h
changqi1 (Contributor, Author)

@pujiang2018 @Duyi-Wang Done.
The attention APIs do not include the LayerNorm function; LayerNorm will be added in the Decoder Layer API.

@Duyi-Wang Duyi-Wang merged commit ac5f69f into intel:main May 11, 2024
1 check passed
3 participants