ReLU op #401
Answered by ricardoV94
finncatling asked this question in Q&A
Hi, thanks for your work on this library. I would like to use a ReLU op within my pymc model. I can't find one within pytensor since it was removed. Are there any plans to add this functionality back to pytensor, please? Thanks again.
Answered by ricardoV94 on Jul 27, 2023
You can create a `relu` with `pytensor.tensor.switch`:

```python
import pytensor.tensor as pt

def relu(x):
    # 0 where x is negative, x itself otherwise
    return pt.switch(pt.lt(x, 0), 0, x)
```

No need for a custom Op.
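For context, here is a minimal sketch of the suggested function in use. The evaluation call and the PyMC wrapping are illustrative additions, not part of the original answer, and the variable names (`x`, `z`, `activated`) are hypothetical.

```python
import pymc as pm
import pytensor.tensor as pt

def relu(x):
    # Elementwise: keep x where it is non-negative, otherwise 0
    return pt.switch(pt.lt(x, 0), 0, x)

# Quick symbolic sanity check
x = pt.vector("x")
print(relu(x).eval({x: [-2.0, 0.0, 3.0]}))  # -> [0. 0. 3.]

# Hypothetical use inside a PyMC model
with pm.Model():
    z = pm.Normal("z", 0.0, 1.0)
    activated = pm.Deterministic("activated", relu(z))
```

An equivalent formulation is `pt.maximum(x, 0)`, which expresses the same elementwise maximum.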
Answer selected by finncatling