
Fix: len(batch) == batch.num_graphs #4931

Merged: 6 commits into master from fix_len_batch on Jul 7, 2022

Conversation

@rusty1s (Member) commented Jul 7, 2022

No description provided.
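The PR carries no description, but its title and the linked issue ("batch length seems incorrect") state the intended contract: for a `Batch` collated from a list of `Data` objects, `len(batch)` should equal `batch.num_graphs`. The following is a minimal conceptual sketch of that invariant in plain Python; the `Data`/`Batch` classes here are simplified stand-ins that only mirror the `torch_geometric.data` API shape, not PyG's actual implementation:

```python
# Conceptual sketch of the invariant this PR enforces:
#     len(batch) == batch.num_graphs
# Stand-in classes only; NOT torch_geometric's real code.

class Data:
    """Stand-in for torch_geometric.data.Data: a single graph."""
    def __init__(self, num_nodes):
        self.num_nodes = num_nodes


class Batch:
    """Stand-in for torch_geometric.data.Batch: a collation of graphs."""
    def __init__(self, graphs):
        self._graphs = list(graphs)

    @classmethod
    def from_data_list(cls, data_list):
        # PyG builds one big disjoint graph; here we just keep the list.
        return cls(data_list)

    @property
    def num_graphs(self):
        return len(self._graphs)

    def __len__(self):
        # The fix: __len__ reports the number of graphs in the batch,
        # so len(batch) == batch.num_graphs always holds.
        return self.num_graphs


batch = Batch.from_data_list([Data(3), Data(5), Data(2)])
assert len(batch) == batch.num_graphs == 3
```

Before the fix, `len(batch)` on a real PyG `Batch` could disagree with `num_graphs` (reporting, e.g., the number of stored attributes instead), which is what the linked issue observed.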

@rusty1s rusty1s linked an issue Jul 7, 2022 that may be closed by this pull request
@rusty1s rusty1s self-assigned this Jul 7, 2022
@codecov (bot) commented Jul 7, 2022

Codecov Report

Merging #4931 (32db0d0) into master (5f77394) will increase coverage by 0.01%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master    #4931      +/-   ##
==========================================
+ Coverage   82.68%   82.70%   +0.01%     
==========================================
  Files         330      330              
  Lines       17844    17850       +6     
==========================================
+ Hits        14755    14763       +8     
+ Misses       3089     3087       -2     
Impacted Files                     Coverage Δ
torch_geometric/data/batch.py      96.29% <100.00%> (+2.62%) ⬆️
torch_geometric/data/collate.py    96.50% <100.00%> (+0.10%) ⬆️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 5f77394...32db0d0.

@rusty1s rusty1s merged commit 843ba6c into master Jul 7, 2022
@rusty1s rusty1s deleted the fix_len_batch branch July 7, 2022 09:11
Development

Successfully merging this pull request may close these issues.

batch length seems incorrect
1 participant