
[Typing][B-42][BUAA] Add type annotations for python/paddle/autograd/py_layer.py #66328

Merged
merged 8 commits into PaddlePaddle:develop
Jul 26, 2024

Conversation

Fripping
Contributor

PR Category

User Experience

PR Types

Improvements

Description

Type annotations:

python/paddle/autograd/py_layer.py

Related links

#65008
@SigureMo @megemini


paddle-bot bot commented Jul 22, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@paddle-bot paddle-bot bot added the contributor (External developers) label Jul 22, 2024
@luotao1 luotao1 added the HappyOpenSource Pro (advanced Happy Open Source program, with more challenging tasks) label Jul 22, 2024
@Fripping
Contributor Author

@megemini @SigureMo Requesting review.

@@ -52,7 +59,7 @@ class PyLayerContext:
... return grad
"""

- def save_for_backward(self, *tensors):
+ def save_for_backward(self, *tensors: list[Tensor]) -> None:
Contributor


Suggested change
- def save_for_backward(self, *tensors: list[Tensor]) -> None:
+ def save_for_backward(self, *tensors: Tensor) -> None:
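
For reference, a minimal sketch (not from the PR; the function and variable names are illustrative, and it assumes a standard Paddle install) of why the element type, rather than list[Tensor], is the right annotation for a variadic parameter:

from __future__ import annotations

import paddle

def demo_save(*tensors: paddle.Tensor) -> None:
    # With *args, the annotation describes each positional argument;
    # inside the function, `tensors` is a tuple[paddle.Tensor, ...].
    assert isinstance(tensors, tuple)

x = paddle.ones([2])
y = paddle.zeros([2])
demo_save(x, y)        # OK: each argument is a single Tensor
# demo_save([x, y])    # a type checker flags this; a list is not a Tensor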

@@ -90,7 +97,7 @@ def save_for_backward(self, *tensors):
"""
self.container = tensors

- def saved_tensor(self):
+ def saved_tensor(self) -> list[Tensor] | None:
Contributor


Suggested change
- def saved_tensor(self) -> list[Tensor] | None:
+ def saved_tensor(self) -> tuple[Tensor, ...]:
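
A hedged sketch of the reasoning (an illustrative stand-in class, not the real PyLayerContext): save_for_backward packs its variadic arguments into a tuple and saved_tensor hands that same tuple back, so tuple[Tensor, ...] describes the return value:

from __future__ import annotations

import paddle

class _CtxSketch:
    # Illustrative only; mirrors the container handling shown in the diff.
    def save_for_backward(self, *tensors: paddle.Tensor) -> None:
        self.container = tensors               # *tensors arrives as a tuple

    def saved_tensor(self) -> tuple[paddle.Tensor, ...]:
        return self.container                  # the same tuple comes back

ctx = _CtxSketch()
ctx.save_for_backward(paddle.ones([2]), paddle.zeros([2]))
print(type(ctx.saved_tensor()))                # <class 'tuple'>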

@@ -122,7 +129,7 @@ def saved_tensor(self):
"""
return self.container

- def mark_not_inplace(self, *args):
+ def mark_not_inplace(self, *args: tuple) -> None:
Contributor


Suggested change
- def mark_not_inplace(self, *args: tuple) -> None:
+ def mark_not_inplace(self, *args: Tensor) -> None:

@@ -163,7 +170,7 @@ def mark_not_inplace(self, *args):
"""
self.not_inplace_tensors = args

- def mark_non_differentiable(self, *args):
+ def mark_non_differentiable(self, *args: tuple) -> None:
Contributor


Suggested change
- def mark_non_differentiable(self, *args: tuple) -> None:
+ def mark_non_differentiable(self, *args: Tensor) -> None:

@@ -203,7 +210,7 @@ def mark_non_differentiable(self, *args):
"""
self.non_differentiable = args

- def set_materialize_grads(self, value: bool):
+ def set_materialize_grads(self, value: bool = True) -> None:
Contributor


Suggested change
- def set_materialize_grads(self, value: bool = True) -> None:
+ def set_materialize_grads(self, value: bool) -> None:

Please don't add or change default values casually ~
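
In other words, a typing-only PR should leave the runtime signature untouched; a small illustrative sketch (a hypothetical class, not the real context object):

from __future__ import annotations

class _GradsSketch:
    # Annotation-only change: same runtime behaviour as the unannotated method.
    def set_materialize_grads(self, value: bool) -> None:
        self.materialize_grads = value

ctx = _GradsSketch()
ctx.set_materialize_grads(True)   # the caller must still pass the argument
# ctx.set_materialize_grads()     # writing "value: bool = True" would make this
                                  # call legal, an API change beyond adding types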

@@ -322,7 +329,7 @@ class PyLayer(with_metaclass(PyLayerMeta, core.eager.PyLayer, PyLayerContext)):
"""

@staticmethod
- def forward(ctx, *args, **kwargs):
+ def forward(ctx, *args: tuple, **kwargs: dict) -> Tensor | list[Tensor]:
Contributor


Suggested change
- def forward(ctx, *args: tuple, **kwargs: dict) -> Tensor | list[Tensor]:
+ def forward(ctx: PyLayerContext, *args: Any, **kwargs: Any) -> Tensor | Sequence[Tensor]:

@@ -361,7 +368,7 @@ def forward(ctx, *args, **kwargs):
)

@staticmethod
- def backward(ctx, *args):
+ def backward(ctx, *args: tuple) -> Tensor | list[Tensor]:
Contributor


Suggested change
- def backward(ctx, *args: tuple) -> Tensor | list[Tensor]:
+ def backward(ctx: PyLayerContext, *args: Any) -> Tensor | Sequence[Tensor]:
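
To see how the suggested signatures read in practice, here is a hedged usage sketch modeled on the standard PyLayer tanh example from the Paddle docs (it assumes PyLayerContext is importable from paddle.autograd; the class and variable names are illustrative):

from __future__ import annotations

import paddle
from paddle.autograd import PyLayer, PyLayerContext

class CusTanh(PyLayer):
    @staticmethod
    def forward(ctx: PyLayerContext, x: paddle.Tensor) -> paddle.Tensor:
        y = paddle.tanh(x)
        ctx.save_for_backward(y)        # each saved object is a single Tensor
        return y

    @staticmethod
    def backward(ctx: PyLayerContext, dy: paddle.Tensor) -> paddle.Tensor:
        (y,) = ctx.saved_tensor()       # saved tensors come back as a tuple
        return dy * (1 - paddle.square(y))

x = paddle.randn([2, 3], dtype="float32")
x.stop_gradient = False
out = CusTanh.apply(x)
out.mean().backward()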

Contributor Author


Got it, thanks.

@SigureMo SigureMo changed the title [Typing][B-42][BUAA] Add type annotations for python/paddle/autograd/py_layer.py Jul 23, 2024
@SigureMo
Member

If there was a problem you could have said something. Why did you overwrite my changes?

@Fripping
Contributor Author

If there was a problem you could have said something. Why did you overwrite my changes?

Sorry, sorry. I'm still not very familiar with git operations and may have overwritten it unintentionally. I'll be more careful next time.

Member

@SigureMo SigureMo left a comment


LGTMeow 🐾

@Fripping
Contributor Author

LGTMeow 🐾

Got it, thanks.

@luotao1 luotao1 merged commit e18ce27 into PaddlePaddle:develop Jul 26, 2024
31 checks passed
Dale1314 pushed a commit to Dale1314/Paddle that referenced this pull request Jul 28, 2024
inaomIIsfarell pushed a commit to inaomIIsfarell/Paddle that referenced this pull request Jul 31, 2024
lixcli pushed a commit to lixcli/Paddle that referenced this pull request Aug 5, 2024