
[NNVM]Activations support added in Keras Frontend #1210

Merged: 3 commits merged into apache:master on Jun 7, 2018

Conversation

PariksheetPinjari909
Contributor

Support for PReLU, ELU, selu, ThresholdedReLU, softsign, hard_sigmoid added

~/A.T.

test_forward_selu()
test_forward_thresholdedrelu()
test_forward_softsign()
test_forward_hardsigmoid()
Contributor

Can you merge all of these activation layer tests into one test_forward_activation()?

Contributor

Better to merge these activation layer tests this way to reduce code duplication:

def test_forward_activation():
    data = keras.layers.Input(shape=(32,32,3))
    act_funcs = [keras.layers.Activation('softmax'),
                 keras.layers.Activation('softplus'),
                 keras.layers.LeakyReLU(alpha=0.3),
                 keras.layers.Activation(keras.applications.mobilenet.relu6)]
    for act_func in act_funcs:
        x = act_func(data)
        x = keras.layers.GlobalMaxPooling2D()(x)
        keras_model = keras.models.Model(data, x)
        verify_keras_frontend(keras_model)

_check_data_format(keras_layer)
transposeShape = []
size = len(keras_layer.alpha.shape)
transposeShape.append(size - 1)
Contributor

transposeShape = np.roll(range(size), 1) will do the same thing

Since the name transposeShape is confusing, you could use the expression inline without giving it a name.
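
For reference, a minimal standalone sketch (not part of the PR) checking that np.roll gives the same axis order as the manual construction; the "manual" variant assumes the snippet above goes on to append the remaining axes 0..size-2:

import numpy as np

size = 4  # e.g. rank of keras_layer.alpha

# manual construction, assuming the snippet continues by appending 0..size-2
manual = [size - 1] + list(range(size - 1))   # [3, 0, 1, 2]
rolled = list(np.roll(range(size), 1))        # [3, 0, 1, 2]
assert manual == rolled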

return -alpha * _sym.relu(1 - _sym.exp(insym)) + _sym.relu(insym)
elif act_type == 'selu':
alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1.6732
gamma = keras_layer.gamma if hasattr(keras_layer, "gamma") else 1.0507
Contributor

I think we should refer to the original paper for these magic numbers: https://arxiv.org/abs/1706.02515.

Contributor Author

@kazum: these values are obtained from the same paper you shared.
Did you find any discrepancies?

Contributor

I mean we should add a comment to explain that those values come from the paper.
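
For illustration only, one possible shape of such a comment on the lines quoted above; the final wording in the PR may differ:

# alpha/gamma defaults are the SELU constants from
# "Self-Normalizing Neural Networks", https://arxiv.org/abs/1706.02515
alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1.6732
gamma = keras_layer.gamma if hasattr(keras_layer, "gamma") else 1.0507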

act_type = type(keras_layer).__name__
if act_type == 'LeakyReLU':
return _sym.leaky_relu(insym, alpha=keras_layer.alpha)
elif act_type == 'ELU':
raise NotImplementedError('ELU not implemented')
alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1
return -alpha * _sym.relu(1 - _sym.exp(insym)) + _sym.relu(insym)
Contributor

We can define one helper function to calculate ELU and use it for keras.layers.ELU, keras.layers.Activation('elu'), and keras.layers.Activation('selu').
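
A rough sketch of such a helper, with an illustrative name not taken from the final PR; _sym stands for nnvm.symbol as used in the frontend, and the ELU expression is the one already quoted above:

def _convert_elu(insym, alpha):
    # ELU(x) = x for x > 0, alpha * (exp(x) - 1) for x <= 0,
    # written with the relu/exp symbols available in nnvm.
    return -alpha * _sym.relu(1 - _sym.exp(insym)) + _sym.relu(insym)

# possible call sites:
#   keras.layers.ELU                -> _convert_elu(insym, keras_layer.alpha)
#   keras.layers.Activation('elu')  -> _convert_elu(insym, 1.0)
#   keras.layers.Activation('selu') -> gamma * _convert_elu(insym, alpha)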

@PariksheetPinjari909
Contributor Author

@kazum, @Huyuwei, sorry for the late response, I have addressed all your comments now, thanks!

@PariksheetPinjari909
Contributor Author

@Huyuwei: your comment is handled now.

@tqchen tqchen merged commit 5455c87 into apache:master Jun 7, 2018
nishi-t pushed a commit to nishi-t/tvm that referenced this pull request Jun 7, 2018
tqchen pushed a commit to tqchen/tvm that referenced this pull request Jul 6, 2018
mnuyens pushed a commit to mnuyens/tvm that referenced this pull request Jul 10, 2018
sergei-mironov pushed a commit to sergei-mironov/tvm that referenced this pull request Aug 8, 2018

Each carried the same commit message:

* [NNVM]Activations support added in Keras Frontend
* Helper for ELU added
* All activations test cases clubbed to one