This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

NMS CPU extension does not compile with pytorch 1.0.1 / nightly #601

Closed
miguelvr opened this issue Mar 25, 2019 · 5 comments

Comments

@miguelvr
Contributor

🐛 Bug

A compilation error is thrown when building with the latest release of PyTorch (1.0.1) or pytorch-nightly.

To Reproduce

nvidia-docker build -t maskrcnn-benchmark docker/

Expected behavior

maskrcnn-benchmark should build correctly

Environment

  • Docker

Additional context

building 'maskrcnn_benchmark._C' extension
creating build/temp.linux-x86_64-3.6
creating build/temp.linux-x86_64-3.6/maskrcnn-benchmark
creating build/temp.linux-x86_64-3.6/maskrcnn-benchmark/maskrcnn_benchmark
creating build/temp.linux-x86_64-3.6/maskrcnn-benchmark/maskrcnn_benchmark/csrc
creating build/temp.linux-x86_64-3.6/maskrcnn-benchmark/maskrcnn_benchmark/csrc/cpu
creating build/temp.linux-x86_64-3.6/maskrcnn-benchmark/maskrcnn_benchmark/csrc/cuda
gcc -pthread -B /miniconda/envs/py36/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -DWITH_CUDA -I/maskrcnn-benchmark/maskrcnn_benchmark/csrc -I/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include -I/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/torch/csrc/api/include -I/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/TH -I/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/THC -I/usr/local/cuda/include -I/miniconda/envs/py36/include/python3.6m -c /maskrcnn-benchmark/maskrcnn_benchmark/csrc/vision.cpp -o build/temp.linux-x86_64-3.6/maskrcnn-benchmark/maskrcnn_benchmark/csrc/vision.o -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=_C -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++11
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for C/ObjC but not for C++
gcc -pthread -B /miniconda/envs/py36/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -DWITH_CUDA -I/maskrcnn-benchmark/maskrcnn_benchmark/csrc -I/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include -I/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/torch/csrc/api/include -I/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/TH -I/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/THC -I/usr/local/cuda/include -I/miniconda/envs/py36/include/python3.6m -c /maskrcnn-benchmark/maskrcnn_benchmark/csrc/cpu/nms_cpu.cpp -o build/temp.linux-x86_64-3.6/maskrcnn-benchmark/maskrcnn_benchmark/csrc/cpu/nms_cpu.o -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=_C -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++11
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for C/ObjC but not for C++
In file included from /miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/ATen/ATen.h:9:0,
                 from /miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/torch/csrc/api/include/torch/types.h:3,
                 from /miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/torch/csrc/api/include/torch/data.h:3,
                 from /miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/torch/csrc/api/include/torch/all.h:4,
                 from /miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/torch/extension.h:4,
                 from /maskrcnn-benchmark/maskrcnn_benchmark/csrc/cpu/vision.h:3,
                 from /maskrcnn-benchmark/maskrcnn_benchmark/csrc/cpu/nms_cpu.cpp:2:
/maskrcnn-benchmark/maskrcnn_benchmark/csrc/cpu/nms_cpu.cpp: In lambda function:
/maskrcnn-benchmark/maskrcnn_benchmark/csrc/cpu/nms_cpu.cpp:71:46: error: invalid initialization of reference of type 'const at::Type&' from expression of type 'c10::ScalarType'
   AT_DISPATCH_FLOATING_TYPES(dets.scalar_type(), "nms", [&] {
                                              ^
/miniconda/envs/py36/lib/python3.6/site-packages/torch-1.0.1.post2-py3.6-linux-x86_64.egg/torch/lib/include/ATen/Dispatch.h:15:32: note: in definition of macro 'AT_DISPATCH_FLOATING_TYPES'
     const at::Type& the_type = TYPE;                                         \
                                ^
error: command 'gcc' failed with exit status 1

@LeviViana
Contributor

LeviViana commented Mar 25, 2019

Sorry, I read too quickly. The torch-1.0.1.post2 version is too old. The nightly version should be newer than the PyTorch PR that introduced this change.

@miguelvr
Contributor Author

Isn't it a better solution to fix the NMS extension with the correct type?

@miguelvr
Contributor Author

@LeviViana the PyTorch nightly from conda isn't working either

@fmassa
Contributor

fmassa commented Mar 26, 2019

@miguelvr there was a breaking change in PyTorch that meant the extension could compile against either pre-1.0.1 releases or the nightly, but not both. There was no way of making it work for both at once.

This has now been fixed in PyTorch, so I believe #555 could be reverted.

Can you try locally reverting this patch and see if it works with PyTorch nightly?

@miguelvr
Contributor Author

@fmassa will do! thanks
