
ROI Pooling Layer #585

Closed
wants to merge 5 commits

Conversation

varunagrawal
Contributor

Code for the ROI Pooling layer (#477).
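
For context, ROI pooling takes a feature map and a set of region proposals and max-pools each region into a fixed-size grid, so downstream detection heads always see the same output shape regardless of proposal size. Below is a minimal single-image, single-channel CPU sketch in C++; the function name, the (x1, y1, x2, y2) box convention, and the assumption that box coordinates are already at feature-map resolution are illustrative, not this PR's actual API.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Minimal single-image, single-channel ROI max pooling sketch.
// feat is a height x width feature map in row-major order; the ROI
// (x1, y1, x2, y2) is divided into a pooled_h x pooled_w grid and
// each bin is max-pooled.
std::vector<float> roi_pool_single(const std::vector<float>& feat,
                                   int height, int width,
                                   float x1, float y1, float x2, float y2,
                                   int pooled_h, int pooled_w) {
    const float roi_w = std::max(x2 - x1 + 1.0f, 1.0f);
    const float roi_h = std::max(y2 - y1 + 1.0f, 1.0f);
    const float bin_w = roi_w / pooled_w;
    const float bin_h = roi_h / pooled_h;

    std::vector<float> out(pooled_h * pooled_w, 0.0f);
    for (int ph = 0; ph < pooled_h; ++ph) {
        for (int pw = 0; pw < pooled_w; ++pw) {
            // Bin boundaries, clipped to the feature map.
            int hstart = std::min(std::max(int(std::floor(y1 + ph * bin_h)), 0), height);
            int hend   = std::min(std::max(int(std::ceil(y1 + (ph + 1) * bin_h)), 0), height);
            int wstart = std::min(std::max(int(std::floor(x1 + pw * bin_w)), 0), width);
            int wend   = std::min(std::max(int(std::ceil(x1 + (pw + 1) * bin_w)), 0), width);

            // Empty bins pool to 0; otherwise take the max over the bin.
            float best = (hstart >= hend || wstart >= wend)
                             ? 0.0f
                             : feat[hstart * width + wstart];
            for (int h = hstart; h < hend; ++h)
                for (int w = wstart; w < wend; ++w)
                    best = std::max(best, feat[h * width + w]);
            out[ph * pooled_w + pw] = best;
        }
    }
    return out;
}
```

A full implementation additionally handles batches and channels, applies a spatial_scale factor to map image-space boxes onto the feature map, and records argmax indices for the backward pass.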

@fmassa
Member

fmassa commented Aug 28, 2018

Hi @varunagrawal ,

Thanks a lot for the PR!
Could you first send the PR to the layers branch? I'd rather not merge this into master yet, as it will add a few dependencies that I'd rather double-check thoroughly.

@varunagrawal
Contributor Author

@fmassa I took out the NMS and ROI Align layers to minimize this PR's dependencies and tackle each piece separately. I don't know whether this branch would merge cleanly with your layers branch.
Any recommendations on what I should do?

@fmassa
Member

fmassa commented Aug 30, 2018

When you create a PR, you can select which branch to send it to; see https://help.github.com/articles/creating-a-pull-request/ for more information.

@fmassa
Member

fmassa commented Aug 30, 2018

Just saw your other PR. I think that's the way to go; I'd look into updating the layers branch.
Also, one more thing: what version of PyTorch did you use to compile?
With PyTorch pulled from master yesterday, I'm facing issues such as:

```
/private/home/fmassa/.conda/envs/detectron/lib/python3.6/site-packages/torch/lib/include/THC/THCNumerics.cuh(163): error: more than one operator "<" matches these operands:
            built-in operator "arithmetic < arithmetic"
            function "operator<(const __half &, const __half &)"
            operand types are: at::Half < at::Half

/private/home/fmassa/.conda/envs/detectron/lib/python3.6/site-packages/torch/lib/include/THC/THCNumerics.cuh(167): error: more than one operator "<=" matches these operands:
            built-in operator "arithmetic <= arithmetic"
            function "operator<=(const __half &, const __half &)"
            operand types are: at::Half <= at::Half

/private/home/fmassa/.conda/envs/detectron/lib/python3.6/site-packages/torch/lib/include/THC/THCNumerics.cuh(171): error: more than one operator ">" matches these operands:
            built-in operator "arithmetic > arithmetic"
            function "operator>(const __half &, const __half &)"
            operand types are: at::Half > at::Half

/private/home/fmassa/.conda/envs/detectron/lib/python3.6/site-packages/torch/lib/include/THC/THCNumerics.cuh(175): error: more than one operator ">=" matches these operands:
            built-in operator "arithmetic >= arithmetic"
            function "operator>=(const __half &, const __half &)"
            operand types are: at::Half >= at::Half
```
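
For anyone hitting the same thing, the pattern behind these errors is a plain C++ overload ambiguity: at::Half converts implicitly both to a built-in arithmetic type and to CUDA's __half, which has its own comparison operators, so `a < b` has two equally ranked candidates. A self-contained sketch with stand-in types (not the real at::Half / __half definitions):

```cpp
#include <iostream>

struct DeviceHalf {};  // plays the role of CUDA's __half
bool operator<(const DeviceHalf&, const DeviceHalf&) { return false; }

// Plays the role of at::Half: implicitly convertible BOTH to a built-in
// arithmetic type and to DeviceHalf.
struct HostHalf {
    float value;
    operator float() const { return value; }    // enables built-in "arithmetic < arithmetic"
    operator DeviceHalf() const { return {}; }  // enables operator<(const DeviceHalf&, const DeviceHalf&)
};

int main() {
    HostHalf a{1.0f}, b{2.0f};
    // Uncommenting the next line reproduces the error class above:
    // "more than one operator '<' matches these operands" -- both
    // candidates need exactly one user-defined conversion, so neither wins.
    // bool r = a < b;
    std::cout << static_cast<float>(a) << " " << static_cast<float>(b) << "\n";
    return 0;
}
```

Removing one of the implicit conversions, or defining the comparison operators directly on the type, resolves this class of ambiguity.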

@varunagrawal
Contributor Author

varunagrawal commented Aug 30, 2018

@fmassa So, do you want me to just base this off the original layers branch (without the updates from master) and update ROI pooling accordingly? I just want to make sure I understand your last comment correctly.

Also, regarding that compilation error: it's a regression introduced by pytorch/pytorch#10301. Syed should have a fix in by next week.

@fmassa
Member

fmassa commented Aug 30, 2018

Oh yes, I had seen the diff but hadn't checked the discussion in the PR.

I think it's easier to base the diff off the layers branch, and then I'll separately rebase layers on top of master in a follow-up commit. What do you think?

@varunagrawal
Contributor Author

That should work. I'll force-update the PR to align with the layers branch.

@varunagrawal
Contributor Author

varunagrawal commented Aug 30, 2018

@fmassa It is done; please refer to #592.
