ThresholdedReLU crashes when the input is a list #16273
Comments
@maybeLee,
Output:
@gadagashwini, Sending a list of tensors to
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
Hi, I am temporarily closing this issue since a PR thread has already been created.
System information.
Describe the problem.
`keras.layers.ThresholdedReLU` fails to accept a list input, reporting the following error: In contrast, `keras.layers.ReLU` and `keras.layers.LeakyReLU` can accept a list input.
Describe the current behavior.
`keras.layers.ThresholdedReLU` crashes when the input is a list.
Describe the expected behavior.
`ThresholdedReLU` can accept a list input.
Contributing.
After comparing the code between `ThresholdedReLU` and `ReLU`, I think the reason is that `ReLU` directly uses the backend implementation (keras/layers/activation/relu.py#L96) while `ThresholdedReLU` implements the activation itself (keras/layers/activation/thresholded_relu.py#L63). I am not sure why such an implementation inconsistency exists, but I think we can do something similar in thresholded_relu.py#L61-63 to what `backend.relu` does. Of course, we could also directly use `backend.relu` for the implementation of `ThresholdedReLU`, as `ReLU` and `LeakyReLU` do.
Standalone code to reproduce the issue.
You can access this link or run the following code:
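The original snippet and link did not survive the copy of this thread, so the following is a hedged reconstruction of a standalone reproduction based on the issue text. The layer choices follow the report; the exact exception depends on the installed Keras version, and newer Keras releases may not expose `ThresholdedReLU` at all:

```python
import tensorflow as tf
from tensorflow import keras

# A plain Python list, not a tensor -- the input the issue is about.
x = [[-1.0, 0.5], [2.0, 3.0]]

layers = [keras.layers.ReLU(), keras.layers.LeakyReLU()]
if hasattr(keras.layers, "ThresholdedReLU"):  # removed in newer Keras versions
    layers.append(keras.layers.ThresholdedReLU())

for layer in layers:
    try:
        # ReLU and LeakyReLU accept the list; ThresholdedReLU crashed
        # here on the affected versions, per the report.
        print(type(layer).__name__, "->", layer(x))
    except Exception as e:
        print(type(layer).__name__, "-> failed:", type(e).__name__, e)
```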
The text was updated successfully, but these errors were encountered:
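To make the Contributing suggestion concrete, here is a minimal sketch of the two options discussed above. This is not the actual Keras patch: the function names are mine, and it assumes TF 2.x semantics where `backend.relu` still takes a `threshold` argument:

```python
import tensorflow as tf

def thresholded_relu(inputs, theta=1.0):
    # Option 1: convert the input up front, as backend.relu effectively
    # does, so a plain Python list no longer breaks the inputs.dtype access.
    inputs = tf.convert_to_tensor(inputs)
    theta = tf.cast(theta, inputs.dtype)
    # f(x) = x if x > theta else 0 -- the ThresholdedReLU definition.
    return inputs * tf.cast(tf.greater(inputs, theta), inputs.dtype)

def thresholded_relu_via_backend(inputs, theta=1.0):
    # Option 2: delegate to the backend the way ReLU and LeakyReLU do.
    # (Assumes a Keras version that still exposes backend.relu with a
    # threshold argument.)
    return tf.keras.backend.relu(inputs, threshold=theta)
```

Either way, the key change is that the list is converted to a tensor before its `dtype` is read, which is exactly what the `ReLU` code path already gets for free.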