【prim】add forward output for Silu grad signature #53632
Conversation
Your PR has been submitted successfully. Thank you for contributing to this open-source project!
❌ The PR was not created using the PR template. You can refer to this Demo.
#define DECLARE_ACTIVATION_GRAD_KERNEL_DEPXOUT(name) \
  template <typename T, typename Context>            \
  void name##GradKernel(const Context& dev_ctx,      \
                        const DenseTensor& x,        \
                        const DenseTensor& out,      \
                        const DenseTensor& dout,     \
                        DenseTensor* dx);
Write this directly as a normal function; there is no need to use a macro definition.
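For reference, the expanded, non-macro form the reviewer is asking for would look roughly like this (a sketch, assuming Silu is the intended user of the macro; SiluGradKernel is just the name##GradKernel expansion for that case):

// Sketch: the DEPXOUT declaration written out as a plain function,
// taking both the forward input x and the forward output out.
template <typename T, typename Context>
void SiluGradKernel(const Context& dev_ctx,
                    const DenseTensor& x,
                    const DenseTensor& out,
                    const DenseTensor& dout,
                    DenseTensor* dx);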
done
@@ -32,6 +32,18 @@ namespace phi {
        dev_ctx, &x, nullptr, &dout, dx, functor); \
  }

#define DEFINE_CPU_ACTIVATION_GRAD_KERNEL_DEPXOUT(name, functor_class) \
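For context, a DEPXOUT define macro would presumably mirror the existing DepX macro visible in the hunk above, but forward &out to the implementation instead of nullptr. A hedged sketch, not the exact PR code (the ActivationGradImpl call shape is inferred from the surrounding hunk context):

// Hypothetical expansion: pass the forward output through to the
// activation-grad implementation rather than nullptr.
#define DEFINE_CPU_ACTIVATION_GRAD_KERNEL_DEPXOUT(name, functor_class) \
  template <typename T, typename Context>                              \
  void name##GradKernel(const Context& dev_ctx,                        \
                        const DenseTensor& x,                          \
                        const DenseTensor& out,                        \
                        const DenseTensor& dout,                       \
                        DenseTensor* dx) {                             \
    funcs::functor_class<T> functor;                                   \
    ActivationGradImpl<T, Context, funcs::functor_class<T>>(           \
        dev_ctx, &x, &out, &dout, dx, functor);                        \
  }

The GPU and XPU hunks below follow the same pattern, with ActivationGradGPUImpl and ActivationGradXPUImpl as the respective implementation helpers.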
Same as above.
done
@@ -86,6 +86,18 @@ void ActivationGradGPUImpl(const Context& dev_ctx,
        dev_ctx, &x, nullptr, &dout, dx, functor); \
  }

#define DEFINE_GPU_ACTIVATION_GRAD_KERNEL_DEPXOUT(name, functor_class) \
Same as above.
done
@@ -49,6 +49,18 @@ void ActivationGradXPUImpl(const Context& dev_ctx,
        dev_ctx, &x, nullptr, &dout, dx, functor); \
  }

#define DEFINE_XPU_ACTIVATION_GRAD_KERNEL_DEPXOUT(name, functor_class) \
Same as above.
done
LGTM
PR types
Bug fixes
PR changes
Others
Description
pcard-66975
Change the YAML configuration of silu_grad to silu_grad(x, out, out_grad, x_grad), adding the forward output as an input, so that the first-order composite operator does not recompute sigmoid and then run automatic differentiation through that recomputation, which made the result numerically unstable.
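For the record, the numerical motivation follows from the Silu derivative (a quick derivation in my own notation, not taken from the PR). With $\sigma(x) = 1/(1+e^{-x})$ and the forward output $\text{out} = \mathrm{silu}(x) = x\,\sigma(x)$:

$$
\frac{d}{dx}\,\mathrm{silu}(x)
= \sigma(x) + x\,\sigma(x)\bigl(1-\sigma(x)\bigr)
= \sigma(x)\,\bigl(1 + x - \text{out}\bigr),
$$

so x_grad = out_grad * sigma(x) * (1 + x - out) can reuse the saved forward output directly, rather than rebuilding it from a freshly computed sigmoid and then differentiating through that recomputation.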