
[Wait for #2562 ] [ Weight ] Add Var32 Tensor in Weight @open sesame 05/03 15:14 #2563

Closed

wants to merge 5 commits

Commits on May 2, 2024

  1. [Property] Add loss scale property

    It adds the loss scale property as a model common property.
    
    Signed-off-by: Jiho Chu <[email protected]>
    jihochu authored and jijoongmoon committed May 2, 2024
    Commit: 06e2313
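The loss scale property above governs static loss scaling for mixed-precision training. A minimal Python sketch of the mechanism it configures (class and method names here are illustrative, not nntrainer's actual API, which is C++):

```python
class LossScaler:
    """Scale the loss before backward; unscale gradients before the update."""

    def __init__(self, loss_scale: float = 1.0):
        # A scale of 1.0 (the assumed default) leaves training unchanged.
        self.loss_scale = loss_scale

    def scale_loss(self, loss: float) -> float:
        # Multiplying the loss keeps small FP16 gradients above underflow.
        return loss * self.loss_scale

    def unscale_grad(self, grad: float) -> float:
        # Gradients must be divided back before the optimizer applies them.
        return grad / self.loss_scale


scaler = LossScaler(loss_scale=1024.0)
scaled = scaler.scale_loss(0.5)
print(scaled)                       # 512.0
print(scaler.unscale_grad(scaled))  # 0.5
```

Making this a model-level property, as the commit does, means one scale factor is shared by every layer rather than configured per layer.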
  2. [ Weight ] Add Loss Scale factor in Weight

    This PR enables the loss scale factor in Weight.
    . Change the WeightSpec to include the loss scale factor.
    . Add the LossScaleForMixed property as a layer common property, so that
      it can set the scale factor in initContext.
    . Add the loss scale in initContext.
    . Set the LossScaleForMixed property when the LossScale model property
      is present.
    
    Resolves:
    
    **Self evaluation:**
    1. Build test:	 [X]Passed [ ]Failed [ ]Skipped
    2. Run test:	 [X]Passed [ ]Failed [ ]Skipped
    
    Signed-off-by: jijoong.moon <[email protected]>
    jijoongmoon committed May 2, 2024
    Commit: 64f57f7
  3. [ Weight ] split variable dim and grad dim to set separately

    This PR splits the Variable and Gradient dimensions in Var_Grad and
    Weight, so that different Variable and Gradient types can be set in
    Weight.
    . Add dim_g for the gradient in WeightSpec.
    . Update the manager to support the new WeightSpec.
    . Create tensors according to dim_v and dim_g.
    . Change Weight creation in Weight.h.
    
    Resolves:
    
    **Self evaluation:**
    1. Build test:	 [X]Passed [ ]Failed [ ]Skipped
    2. Run test:	 [X]Passed [ ]Failed [ ]Skipped
    
    Signed-off-by: jijoong.moon <[email protected]>
    jijoongmoon committed May 2, 2024
    Commit: 55a0085
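Splitting the variable and gradient dimensions lets the two tensors carry different data types, which is the point of mixed precision. A hypothetical sketch of such a spec (these are illustrative names, not nntrainer's actual WeightSpec, which is a C++ type):

```python
from dataclasses import dataclass


@dataclass
class WeightSpec:
    """A weight spec carrying separate dims/dtypes for variable and gradient."""
    name: str
    dim_v: tuple   # variable dimensions
    dtype_v: str   # variable data type, e.g. "fp16"
    dim_g: tuple   # gradient dimensions
    dtype_g: str   # gradient data type, e.g. "fp32"


def create_tensors(spec: WeightSpec):
    # A manager would allocate two tensors from one spec; here we just
    # return (dim, dtype) descriptors standing in for real tensors.
    var = (spec.dim_v, spec.dtype_v)
    grad = (spec.dim_g, spec.dtype_g)
    return var, grad


spec = WeightSpec("fc/weight", (64, 128), "fp16", (64, 128), "fp32")
var, grad = create_tensors(spec)
```

With one shared dim in the old spec, the variable and gradient were forced to the same type; carrying `dim_g` separately is what allows an FP16 variable to pair with an FP32 gradient.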
  4. [ NEURALNET ] change the loss scale property to Rigid Property

    Loss scale is more of a rigid model property than a flexible one.
    
    Resolves:
    
    **Self evaluation:**
    1. Build test:	 [X]Passed [ ]Failed [ ]Skipped
    2. Run test:	 [X]Passed [ ]Failed [ ]Skipped
    
    Signed-off-by: jijoong.moon <[email protected]>
    jijoongmoon committed May 2, 2024
    Commit: d481ce6

Commits on May 3, 2024

  1. [ Weight ] Add Var32 Tensor in Weight.

    We add a Var32 tensor when the variable weight is not full
    precision (FP32). This enables the weight update with full precision;
    only the apply-gradient process uses this tensor, so its lifespan
    should be "ApplyGradient".
    
    . Modify the TensorPool to generate weights considering mixed precision.
    
    **Self evaluation:**
    1. Build test:	 [X]Passed [ ]Failed [ ]Skipped
    2. Run test:	 [X]Passed [ ]Failed [ ]Skipped
    
    Signed-off-by: jijoong.moon <[email protected]>
    jijoongmoon committed May 3, 2024
    Commit: 3ce1962
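The reason a full-precision Var32 copy is needed can be shown numerically: an update step smaller than FP16 resolution rounds away entirely if applied to the FP16 variable, but accumulates correctly in an FP32 master copy. A stdlib-only sketch (function names are illustrative; `struct`'s `"e"` format emulates IEEE half-precision rounding):

```python
import struct


def to_fp16(x: float) -> float:
    """Round a Python float to the nearest IEEE-754 half-precision value."""
    return struct.unpack("e", struct.pack("e", x))[0]


def step_fp16_only(w16: float, grad: float, lr: float) -> float:
    # Update performed entirely at half precision.
    return to_fp16(w16 - lr * grad)


def step_with_var32(w32: float, grad: float, lr: float):
    # Update the FP32 master copy (the Var32 tensor), then cast the
    # result back for the half-precision variable.
    w32 = w32 - lr * grad
    return w32, to_fp16(w32)


w16, w32 = to_fp16(1.0), 1.0
for _ in range(10):
    w16 = step_fp16_only(w16, grad=1e-4, lr=1.0)
    w32, w16_from_master = step_with_var32(w32, grad=1e-4, lr=1.0)

# FP16 spacing just below 1.0 is ~4.9e-4, so each 1e-4 step rounds back
# to 1.0 in the FP16-only path, while the master-copy path accumulates.
print(w16)              # still 1.0
print(w16_from_master)  # below 1.0
```

This also illustrates why the commit ties the tensor's lifespan to "ApplyGradient": the FP32 copy is only read and written while applying gradients, so it need not stay alive through forward and backward passes.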