
Revisit attention layer for fp16 test and apply template in activation function #28

Merged

Commits on Aug 10, 2023

  1. [gtest] Add dataset file for attention layer

        * The nnlayergolden binary file for the attention layer gtest is now generated automatically at build time (see the sketch below)
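
    A minimal sketch of how such a golden file might be consumed in a test; the reader helper and the file name here are hypothetical, not NNTrainer's actual API:

    ```cpp
    #include <fstream>
    #include <string>
    #include <vector>

    // Hypothetical helper: read a flat binary file of float32 values,
    // matching how a build-time-generated golden output could be laid out.
    static std::vector<float> loadGolden(const std::string &path) {
      std::ifstream f(path, std::ios::binary | std::ios::ate);
      std::streamsize bytes = f.tellg();
      f.seekg(0, std::ios::beg);
      std::vector<float> data(bytes / sizeof(float));
      f.read(reinterpret_cast<char *>(data.data()), bytes);
      return data;
    }

    // Usage (hypothetical file name):
    //   auto golden = loadGolden("nnlayergolden_attention.bin");
    //   ... compare the layer output against `golden` ...
    ```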
    
    Signed-off-by: skykongkong8 <[email protected]>
    skykongkong8 committed Aug 10, 2023
    SHA: c6076c3
  2. [layers/activation_func] Apply template on activation functions

    **Changes proposed in this PR:**
    
    - For mixed precision, the activation functions are rewritten as function templates to avoid duplicating bulky code per data type (see the sketch after this list)
    - To use a function template for setActivation, another function template is needed to handle the multiple activation function types
    - Minor fixes for template instantiation; these will be revised properly for fp16 use in the next PR
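
    A minimal sketch of the template approach, using illustrative names (sigmoid, applyActivation) rather than the actual NNTrainer functions:

    ```cpp
    #include <cmath>
    #include <cstddef>

    // One definition serves every data type (float for fp32, a half type
    // for fp16) instead of one hand-written copy per type.
    template <typename T> T sigmoid(T x) {
      return static_cast<T>(1.0 / (1.0 + std::exp(-static_cast<double>(x))));
    }

    // A second template that accepts any activation of matching type,
    // mirroring the extra function template needed so setActivation can
    // take templated activation functions.
    template <typename T, typename Fn>
    void applyActivation(Fn &&act, const T *in, T *out, std::size_t len) {
      for (std::size_t i = 0; i < len; ++i)
        out[i] = act(in[i]);
    }

    // Usage: applyActivation<float>(sigmoid<float>, in, out, len);
    ```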
    
    Resolves:
    
    **Self evaluation:**
    1. Build test:     [X]Passed [ ]Failed [ ]Skipped
    2. Run test:     [X]Passed [ ]Failed [ ]Skipped
    
    Signed-off-by: sungsik.kong <[email protected]>
    skykongkong8 committed Aug 10, 2023
    SHA: dde0727
  3. [gtest] Verify attention layer with fp16

    - Add an fp16 test case
    - Modify the epsilon value in the cosine similarity check so it uses an appropriate decimal precision and number of significant digits (see the sketch after this list)
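
    A minimal sketch of a per-type tolerance for a cosine similarity check; the names and epsilon values are illustrative, not the PR's actual numbers:

    ```cpp
    #include <cmath>
    #include <cstddef>

    // Cosine similarity accumulated in double so the comparison itself
    // does not lose precision, regardless of the input type.
    template <typename T>
    double cosineSimilarity(const T *a, const T *b, std::size_t len) {
      double dot = 0.0, na = 0.0, nb = 0.0;
      for (std::size_t i = 0; i < len; ++i) {
        dot += static_cast<double>(a[i]) * static_cast<double>(b[i]);
        na += static_cast<double>(a[i]) * static_cast<double>(a[i]);
        nb += static_cast<double>(b[i]) * static_cast<double>(b[i]);
      }
      return dot / (std::sqrt(na) * std::sqrt(nb));
    }

    // Illustrative tolerances: fp16 carries only ~3 significant decimal
    // digits, so its epsilon must be far looser than fp32's.
    template <typename T> constexpr double cosEpsilon() { return 1e-6; }
    // e.g. for a 16-bit float type such as _FP16:
    // template <> constexpr double cosEpsilon<_FP16>() { return 1e-3; }

    // Usage in a gtest:
    //   EXPECT_NEAR(cosineSimilarity(out, golden, len), 1.0,
    //               cosEpsilon<float>());
    ```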
    
    Resolves:
    
    **Self evaluation:**
    1. Build test:     [X]Passed [ ]Failed [ ]Skipped
    2. Run test:     [X]Passed [ ]Failed [ ]Skipped
    
    Signed-off-by: sungsik.kong <[email protected]>
    skykongkong8 committed Aug 10, 2023
    SHA: 6710963