[NNVM] Activations support added in Keras Frontend #1210
Conversation
test_forward_selu()
test_forward_thresholdedrelu()
test_forward_softsign()
test_forward_hardsigmoid()
Can you merge these activation layer tests into one test_forward_activation()?
Better to merge these activation layer tests like this to reduce code duplication:
def test_forward_activation():
    data = keras.layers.Input(shape=(32, 32, 3))
    act_funcs = [keras.layers.Activation('softmax'),
                 keras.layers.Activation('softplus'),
                 keras.layers.LeakyReLU(alpha=0.3),
                 keras.layers.Activation(keras.applications.mobilenet.relu6)]
    for act_func in act_funcs:
        x = act_func(data)
        x = keras.layers.GlobalMaxPooling2D()(x)
        keras_model = keras.models.Model(data, x)
        verify_keras_frontend(keras_model)
nnvm/python/nnvm/frontend/keras.py (outdated diff)
_check_data_format(keras_layer)
transposeShape = []
size = len(keras_layer.alpha.shape)
transposeShape.append(size - 1)
`transposeShape = np.roll(range(size), 1)` will do the same thing.
Since the name transposeShape is confusing, you can use the expression inline without giving it a name.
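For context, a small standalone sketch of what that suggestion computes (assuming numpy is imported as np, and assuming the diff's remaining lines fill axes 0..size-2 after appending size-1; both routes produce the axis order that moves the channel axis to the front):

import numpy as np

size = 3  # e.g. len(keras_layer.alpha.shape) for an HWC alpha tensor
# Manual construction: [size - 1, 0, 1, ..., size - 2]
manual = [size - 1] + list(range(size - 1))
# Equivalent one-liner suggested above
rolled = list(np.roll(range(size), 1))
assert manual == rolled  # both give [2, 0, 1]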
nnvm/python/nnvm/frontend/keras.py (outdated diff)
    return -alpha * _sym.relu(1 - _sym.exp(insym)) + _sym.relu(insym)
elif act_type == 'selu':
    alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1.6732
    gamma = keras_layer.gamma if hasattr(keras_layer, "gamma") else 1.0507
I think we should refer to the original paper for these magic numbers: https://arxiv.org/abs/1706.02515.
@kazum: these values are obtained from the same paper you shared.
Did you find any discrepancies?
I mean we should add a comment to explain that those values come from the paper.
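For example, a hedged sketch of how that comment could look in the 'selu' branch (variable names follow the diff above; the exact wording and the way gamma is applied are assumptions, not the merged code):

elif act_type == 'selu':
    # Constants from "Self-Normalizing Neural Networks"
    # (https://arxiv.org/abs/1706.02515): alpha ~= 1.6732, scale (gamma) ~= 1.0507.
    alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1.6732
    gamma = keras_layer.gamma if hasattr(keras_layer, "gamma") else 1.0507
    return gamma * (-alpha * _sym.relu(1 - _sym.exp(insym)) + _sym.relu(insym))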
nnvm/python/nnvm/frontend/keras.py (outdated diff)
act_type = type(keras_layer).__name__
if act_type == 'LeakyReLU':
    return _sym.leaky_relu(insym, alpha=keras_layer.alpha)
elif act_type == 'ELU':
    raise NotImplementedError('ELU not implemented')
    alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1
    return -alpha * _sym.relu(1 - _sym.exp(insym)) + _sym.relu(insym)
We can define a single helper function to calculate ELU and reuse it for keras.layers.ELU, keras.layers.Activation('elu'), and keras.layers.Activation('selu').
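A minimal sketch of such a helper (the name _convert_elu and the default alpha handling are assumptions; the ELU expression itself is the one from the diff above):

def _convert_elu(insym, alpha):
    """ELU(x) = x for x > 0, alpha * (exp(x) - 1) otherwise,
    written with relu/exp so it maps onto existing NNVM symbols."""
    return -alpha * _sym.relu(1 - _sym.exp(insym)) + _sym.relu(insym)

# Possible call sites (sketch):
#   ELU layer / Activation('elu'):  _convert_elu(insym, alpha)
#   Activation('selu'):             gamma * _convert_elu(insym, alpha)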
Force-pushed from 2690b38 to 95139c2
Force-pushed from 95139c2 to 20c50b0
@Huyuwei: your comment is handled now.
* [NNVM] Activations support added in Keras Frontend
* Helper for ELU added
* All activations test cases clubbed to one
Support for PReLU, ELU, selu, ThresholdedReLU, softsign, and hard_sigmoid added.
~/A.T.
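For reference, a hedged numpy sketch of the Keras-side definitions of these activations (useful for sanity-checking converted outputs; the function names and defaults here are illustrative, not part of the frontend):

import numpy as np

def elu(x, alpha=1.0):
    # x for x > 0, alpha * (exp(x) - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1))

def selu(x, alpha=1.6732, scale=1.0507):
    return scale * elu(x, alpha)

def prelu(x, alpha):
    # alpha is a learned per-channel slope in Keras; broadcasting applies here
    return np.where(x > 0, x, alpha * x)

def thresholded_relu(x, theta=1.0):
    return x * (x > theta)

def softsign(x):
    return x / (1 + np.abs(x))

def hard_sigmoid(x):
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)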