
ReLU op #401

Answered by ricardoV94
finncatling asked this question in Q&A
Jul 27, 2023 · 1 comment · 1 reply

You can create a ReLU with pytensor.tensor.switch:

import pytensor.tensor as pt

def relu(x):
    # Element-wise: return 0 where x is negative, otherwise x
    return pt.switch(pt.lt(x, 0), 0, x)

No need for a custom Op.
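
For reference, a minimal sketch of compiling and evaluating this relu with pytensor.function (the vector variable x and the sample inputs are just illustrative). The gradient check also shows why no custom Op is needed: switch differentiates through its branches automatically.

import numpy as np

import pytensor
import pytensor.tensor as pt

def relu(x):
    return pt.switch(pt.lt(x, 0), 0, x)

x = pt.vector("x")

# Compile and evaluate the forward pass
f = pytensor.function([x], relu(x))
print(f(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]

# Autodiff works out of the box: the gradient is 0 on the negative branch, 1 elsewhere
g = pytensor.function([x], pytensor.grad(relu(x).sum(), x))
print(g(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 1.]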

Answer selected by finncatling