
ThresholdedReLU crashes when the input is a list #16273

Closed
maybeLee opened this issue Mar 19, 2022 · 4 comments · Fixed by #16277

Comments

@maybeLee

System information.

  • Have I written custom code (as opposed to using a stock example script provided in Keras): Yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (use command below): 2.8.0
  • Python version: 3.7.12
  • Bazel version (if compiling from source): N/A
  • GPU model and memory: N/A
  • Exact command to reproduce: https://colab.research.google.com/drive/144FOk8RO-Ew_eBtGZlCmsUL6k0cEAlUO?usp=sharing

Describe the problem.
keras.layers.ThresholdedReLU fails to accept a list input by reporting the following error:

/usr/local/lib/python3.7/dist-packages/keras/layers/advanced_activations.py in call(self, inputs)
    262 
    263   def call(self, inputs):
--> 264     theta = tf.cast(self.theta, inputs.dtype)
    265     return inputs * tf.cast(tf.greater(inputs, theta), inputs.dtype)
    266 

AttributeError: Exception encountered when calling layer "thresholded_re_lu_1" (type ThresholdedReLU).

'list' object has no attribute 'dtype'

Call arguments received:
  • inputs=['tf.Tensor(shape=(None, 1, 10), dtype=float32)', 'tf.Tensor(shape=(None, 1, 10), dtype=float32)', 'tf.Tensor(shape=(None, 1, 10), dtype=float32)']

In contrast, keras.layers.ReLU and keras.layers.LeakyReLU can accept a list input.

Describe the current behavior.
keras.layers.ThresholdedReLU crashes when the input is a list

Describe the expected behavior.
ThresholdedReLU accepts a list input, as ReLU and LeakyReLU do.

Contributing.

  • Do you want to contribute a PR? (yes/no):
  • If yes, please read this page for instructions
  • Briefly describe your candidate solution(if contributing):

After comparing the code of ThresholdedReLU and ReLU, I think the reason is that ReLU directly uses the backend implementation (keras/layers/activation/relu.py#L96), while ThresholdedReLU implements the activation itself (keras/layers/activation/thresholded_relu.py#L63). I am not sure why such an implementation inconsistency exists, but I think we can do something similar in thresholded_relu.py#L61-63 to what backend.relu does:

def call(self, inputs):
    # Fall back to the default float type (keras.backend.floatx()) when
    # `inputs` has no dtype attribute, e.g. when it is a Python list of
    # tensors, mirroring what backend.relu does.
    dtype = getattr(inputs, 'dtype', floatx())
    theta = tf.cast(self.theta, dtype)
    return inputs * tf.cast(tf.greater(inputs, theta), dtype)

Of course, we could also directly use backend.relu for the implementation of ThresholdedReLU, as ReLU and LeakyReLU do.
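As a rough illustration of that second option, here is a minimal standalone sketch (an assumption of what such a change could look like, not the actual patch in the linked PR; thresholded_relu is a hypothetical helper name). It delegates to tf.keras.backend.relu, which takes a threshold argument and does not read inputs.dtype directly:

import tensorflow as tf

def thresholded_relu(inputs, theta=1.0):
    # Hypothetical sketch: tf.keras.backend.relu falls back to the default
    # float type when `inputs` has no dtype attribute (e.g. a Python list
    # of tensors), instead of accessing inputs.dtype and crashing.
    return tf.keras.backend.relu(inputs, threshold=theta)

x = tf.constant([[0.5, 1.5, -2.0]])
print(thresholded_relu(x))          # [[0.  1.5 0. ]]
print(thresholded_relu([x, x, x]))  # the list is stacked to shape (3, 1, 3)

With a nonzero threshold, backend.relu keeps values strictly greater than theta and zeroes out the rest, which matches what ThresholdedReLU's own call currently computes.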

Standalone code to reproduce the issue.
You can access the Colab link above or run the following code:

import keras
x = keras.layers.Input(shape=(1,10))
y = keras.layers.ThresholdedReLU()([x,x,x])
model = keras.models.Model(x,y)
model.summary()
maybeLee changed the title from "Inconsistent behavior between ThresholdedReLU and ReLU when input is a list" to "ThresholdedReLU crashes when the input is a list" on Mar 19, 2022
@gadagashwini
Contributor

@maybeLee,
keras.layers.ThresholdedReLU accepts input_shape as a tuple of integers.

import keras
x = keras.layers.Input(shape=(1,10))
y = keras.layers.ThresholdedReLU()((x))
model = keras.models.Model(x,y)
model.summary()

Output

Model: "model_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_10 (InputLayer)       [(None, 1, 10)]           0         
                                                                 
 thresholded_re_lu_11 (Thres  (None, 1, 10)            0         
 holdedReLU)                                                     
                                                                 
=================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0

@maybeLee
Author

@gadagashwini,
Thanks for your reply! I think the problem in this issue is that ThresholdedReLU fails to accept a list of tensors as input, while similar layers such as ReLU and LeakyReLU can:

Sending a list of tensors to ReLU:

import keras
x = keras.layers.Input(shape=(1,10))
y = keras.layers.ReLU()([x,x,x])
model = keras.models.Model(x,y)
model.summary()
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_17 (InputLayer)          [(None, 1, 10)]      0           []                               
                                                                                                  
 re_lu_7 (ReLU)                 (3, None, 1, 10)     0           ['input_17[0][0]',               
                                                                  'input_17[0][0]',               
                                                                  'input_17[0][0]']               
                                                                                                  
==================================================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0
__________________________________________________________________________________________________

Sending a list of tensors to ThresholdedReLU:

import keras
x = keras.layers.Input(shape=(1,10))
y = keras.layers.ThresholdedReLU()([x,x,x])
model = keras.models.Model(x,y)
model.summary()
/usr/local/lib/python3.7/dist-packages/keras/layers/advanced_activations.py in call(self, inputs)
    262 
    263   def call(self, inputs):
--> 264     theta = tf.cast(self.theta, inputs.dtype)
    265     return inputs * tf.cast(tf.greater(inputs, theta), inputs.dtype)
    266 

AttributeError: Exception encountered when calling layer "thresholded_re_lu_7" (type ThresholdedReLU).

'list' object has no attribute 'dtype'

Call arguments received:
  • inputs=['tf.Tensor(shape=(None, 1, 10), dtype=float32)', 'tf.Tensor(shape=(None, 1, 10), dtype=float32)', 'tf.Tensor(shape=(None, 1, 10), dtype=float32)']

@google-ml-butler

This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.

@maybeLee
Author

Hi, I am temporarily closing this issue since a PR thread has already been created.
