
Add spmm and is_torch_sparse_tensor #5906

Merged: 5 commits from spmm into master on Nov 9, 2022
Conversation

@EdisonLeeeee (Contributor) commented on Nov 5, 2022

This PR implements:

  • spmm function for both PyTorch sparse tensors and torch_sparse SparseTensor
  • is_torch_sparse_tensor function to check whether a tensor is a native PyTorch sparse tensor

TODO:

  • Add test and docstring for spmm

Any comments would be appreciated :)
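
For readers skimming the thread, a minimal sketch of what the two helpers described above could look like is shown below. This is an assumption-laden illustration, not the merged implementation: the dispatch is simplified and only the `sum` reduction is handled for the native PyTorch path.

```python
# Hypothetical sketch of the two helpers; the actual PR code may differ.
import torch
from torch import Tensor
from torch_sparse import SparseTensor, matmul


def is_torch_sparse_tensor(src) -> bool:
    # True for a native PyTorch sparse (COO) tensor, as opposed to a
    # torch_sparse.SparseTensor instance.
    return isinstance(src, Tensor) and src.is_sparse


def spmm(src, other: Tensor, reduce: str = "sum") -> Tensor:
    # Sparse-dense matrix multiplication for both sparse back-ends.
    if isinstance(src, SparseTensor):
        return matmul(src, other, reduce)   # torch_sparse back-end
    if is_torch_sparse_tensor(src):
        if reduce != "sum":
            raise NotImplementedError("only 'sum' is sketched here")
        return torch.sparse.mm(src, other)  # native PyTorch back-end
    raise ValueError("'src' must be a sparse tensor")
```

With either back-end, a call such as `spmm(adj, x)` then returns a dense `Tensor`.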

@codecov bot commented on Nov 5, 2022

Codecov Report

Merging #5906 (11727ce) into master (fc5c255) will decrease coverage by 1.99%.
The diff coverage is 58.82%.

❗ Current head 11727ce differs from the pull request's most recent head f76caa5. Consider uploading reports for the commit f76caa5 to get more accurate results.

@@            Coverage Diff             @@
##           master    #5906      +/-   ##
==========================================
- Coverage   86.41%   84.41%   -2.00%     
==========================================
  Files         349      351       +2     
  Lines       19507    19518      +11     
==========================================
- Hits        16857    16477     -380     
- Misses       2650     3041     +391     
| Impacted Files | Coverage Δ |
|---|---|
| torch_geometric/nn/functional/spmm.py | 41.66% <41.66%> (ø) |
| torch_geometric/nn/functional/__init__.py | 100.00% <100.00%> (ø) |
| torch_geometric/utils/torch_sparse_tensor.py | 100.00% <100.00%> (ø) |
| torch_geometric/nn/models/dimenet_utils.py | 0.00% <0.00%> (-75.52%) ⬇️ |
| torch_geometric/nn/models/dimenet.py | 14.90% <0.00%> (-52.76%) ⬇️ |
| torch_geometric/profile/profile.py | 36.73% <0.00%> (-27.56%) ⬇️ |
| torch_geometric/nn/conv/utils/typing.py | 81.25% <0.00%> (-17.50%) ⬇️ |
| torch_geometric/nn/pool/asap.py | 92.10% <0.00%> (-7.90%) ⬇️ |
| torch_geometric/nn/inits.py | 67.85% <0.00%> (-7.15%) ⬇️ |
| torch_geometric/nn/dense/linear.py | 87.40% <0.00%> (-5.93%) ⬇️ |

... and 16 more


torch_geometric/nn/functional/__init__.py (review thread: outdated, resolved)
torch_geometric/nn/functional/spmm.py (review thread: outdated, resolved)

Review comment on the spmm diff:
from torch_geometric.utils import is_torch_sparse_tensor


def spmm(src: SparseTensor, other: Tensor, reduce: str = "sum") -> Tensor:

Member:

Can we make this TorchScript compatible via the overload decorator?

EdisonLeeeee (Contributor, Author):

Sure :)
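
For context on the overload suggestion: scripting a single Python function that accepts either a `Tensor` or a `torch_sparse.SparseTensor` in the same argument position is awkward in TorchScript, so PyG commonly declares one signature per type via `torch.jit._overload` on stub definitions and keeps a single runtime implementation. A rough sketch of that pattern (illustrative only, not necessarily the code that landed):

```python
# Sketch of the torch.jit._overload pattern; bodies are illustrative.
import torch
from torch import Tensor
from torch_sparse import SparseTensor, matmul


@torch.jit._overload
def spmm(src, other, reduce):  # noqa: F811
    # type: (Tensor, Tensor, str) -> Tensor
    pass


@torch.jit._overload
def spmm(src, other, reduce):  # noqa: F811
    # type: (SparseTensor, Tensor, str) -> Tensor
    pass


def spmm(src, other, reduce="sum"):  # noqa: F811
    # Single runtime implementation behind both declared signatures.
    if isinstance(src, SparseTensor):
        return matmul(src, other, reduce)
    return torch.sparse.mm(src, other)  # 'sum' reduction only in this sketch
```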

torch_geometric/utils/torch_sparse_tensor.py (review thread: outdated, resolved)
github-actions bot removed the nn label on Nov 8, 2022
rusty1s enabled auto-merge (squash) on November 9, 2022 13:43
rusty1s merged commit 55bcfc4 into master on Nov 9, 2022
rusty1s deleted the spmm branch on November 9, 2022 13:46
@EdisonLeeeee (Contributor, Author) commented:

Thanks @rusty1s for the updates and for getting it merged :)

JakubPietrakIntel pushed a commit to JakubPietrakIntel/pytorch_geometric that referenced this pull request on Nov 25, 2022:
This PR implements:
+ `spmm` function for both PyTorch and torch_sparse SparseTensor
+ `is_torch_sparse_tensor` function to check if a tensor is a PyTorch SparseTensor

TODO:
+ Add test and docstring for `spmm`

Any comments would be appreciated :)

Co-authored-by: Matthias Fey <[email protected]>