
For those who are new to the installation errors #144

Closed

rohit7044 opened this issue Mar 15, 2023 · 4 comments

Comments


rohit7044 commented Mar 15, 2023

If you're using an RTX 20-series or newer card with CUDA 11+:

  • Please clone this repository.

  • Make a small change in rasterize_kernel_cu.cpp at line 8, changing

#if __CUDA_ARCH__ < 600 and defined(__CUDA_ARCH__)

//something

#endif

to

#if !defined(__CUDA_ARCH__) || __CUDA_ARCH__ >= 600
#else
static __inline__ __device__ double atomicAdd(double* address, double val) {
    unsigned long long int* address_as_ull = (unsigned long long int*)address;
    unsigned long long int old = *address_as_ull, assumed;
    do {
        assumed = old;
        old = atomicCAS(address_as_ull, assumed,
                __double_as_longlong(val + __longlong_as_double(assumed)));
    // Note: uses integer comparison to avoid hang in case of NaN (since NaN != NaN)
    } while (assumed != old);
    return __longlong_as_double(old);
}
#endif  

This will solve the nvcc compilation errors. Taken from #62
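Why this works: GPUs with compute capability 6.0 and newer already ship a native atomicAdd for double, so the CAS-based fallback above is only compiled for older architectures, and the !defined(__CUDA_ARCH__) clause keeps the host-side compilation pass from seeing the redefinition. If you want to check the guard in isolation before rebuilding, here is a minimal standalone sketch (the file name, kernel name, and launch sizes are my own for illustration, not from the repository):

// atomic_check.cu -- hypothetical standalone test of the atomicAdd guard above.
// Compile with e.g. `nvcc -arch=sm_86 atomic_check.cu -o atomic_check`
// (or an older arch such as sm_52 to exercise the fallback path).
#include <cstdio>

#if !defined(__CUDA_ARCH__) || __CUDA_ARCH__ >= 600
// Native double-precision atomicAdd is available; nothing to define.
#else
static __inline__ __device__ double atomicAdd(double* address, double val) {
    unsigned long long int* address_as_ull = (unsigned long long int*)address;
    unsigned long long int old = *address_as_ull, assumed;
    do {
        assumed = old;
        old = atomicCAS(address_as_ull, assumed,
                __double_as_longlong(val + __longlong_as_double(assumed)));
    // Integer comparison avoids a hang when the value is NaN (NaN != NaN).
    } while (assumed != old);
    return __longlong_as_double(old);
}
#endif

// Every thread adds 1.0 to a single accumulator through atomicAdd.
__global__ void accumulate(double* sum) {
    atomicAdd(sum, 1.0);
}

int main() {
    double* d_sum = nullptr;
    cudaMalloc(&d_sum, sizeof(double));
    cudaMemset(d_sum, 0, sizeof(double));   // all-zero bytes == 0.0 for an IEEE double

    accumulate<<<4, 256>>>(d_sum);          // 4 * 256 = 1024 additions

    double h_sum = 0.0;
    cudaMemcpy(&h_sum, d_sum, sizeof(double), cudaMemcpyDeviceToHost);
    printf("sum = %f (expected 1024)\n", h_sum);

    cudaFree(d_sum);
    return 0;
}

If that prints 1024 on your card, the same #if/#else block should compile cleanly inside the repository as well.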

@MilesTheProwler

I already edited that part, but I still get an error.

running install
running bdist_egg
running egg_info
writing neural_renderer_pytorch.egg-info/PKG-INFO
writing dependency_links to neural_renderer_pytorch.egg-info/dependency_links.txt
writing top-level names to neural_renderer_pytorch.egg-info/top_level.txt
reading manifest file 'neural_renderer_pytorch.egg-info/SOURCES.txt'
adding license file 'LICENSE'
writing manifest file 'neural_renderer_pytorch.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
running build_ext
building 'neural_renderer.cuda.load_textures' extension
Emitting ninja build file /home/ninja/HF-Avatar/thirdparty/neural_renderer_pytorch/build/temp.linux-x86_64-3.6/build.ninja...
Compiling objects...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
[1/1] /usr/bin/nvcc -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/TH -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/THC -I/home/ninja/miniconda3/envs/Avatar/include/python3.6m -c -c /home/ninja/HF-Avatar/thirdparty/neural_renderer_pytorch/neural_renderer/cuda/load_textures_cuda_kernel.cu -o /home/ninja/HF-Avatar/thirdparty/neural_renderer_pytorch/build/temp.linux-x86_64-3.6/neural_renderer/cuda/load_textures_cuda_kernel.o -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr --compiler-options ''"'"'-fPIC'"'"'' -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=load_textures -D_GLIBCXX_USE_CXX11_ABI=0 -gencode=arch=compute_86,code=sm_86 -std=c++14
FAILED: /home/ninja/HF-Avatar/thirdparty/neural_renderer_pytorch/build/temp.linux-x86_64-3.6/neural_renderer/cuda/load_textures_cuda_kernel.o
/usr/bin/nvcc -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/TH -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/THC -I/home/ninja/miniconda3/envs/Avatar/include/python3.6m -c -c /home/ninja/HF-Avatar/thirdparty/neural_renderer_pytorch/neural_renderer/cuda/load_textures_cuda_kernel.cu -o /home/ninja/HF-Avatar/thirdparty/neural_renderer_pytorch/build/temp.linux-x86_64-3.6/neural_renderer/cuda/load_textures_cuda_kernel.o -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr --compiler-options ''"'"'-fPIC'"'"'' -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=load_textures -D_GLIBCXX_USE_CXX11_ABI=0 -gencode=arch=compute_86,code=sm_86 -std=c++14
/usr/include/c++/11/bits/std_function.h:435:145: error: parameter packs not expanded with ‘...’:
  435 |         function(_Functor&& __f)
      |                                                                                                                                                 ^
/usr/include/c++/11/bits/std_function.h:435:145: note:         ‘_ArgTypes’
/usr/include/c++/11/bits/std_function.h:530:146: error: parameter packs not expanded with ‘...’:
  530 |         operator=(_Functor&& __f)
      |                                                                                                                                                  ^
/usr/include/c++/11/bits/std_function.h:530:146: note:         ‘_ArgTypes’
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 1522, in _run_ninja_build
    env=env)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['ninja', '-v']' returned non-zero exit status 1.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "setup.py", line 40, in <module>
    cmdclass = {'build_ext': BuildExtension}
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/__init__.py", line 153, in setup
    return distutils.core.setup(**attrs)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/install.py", line 67, in run
    self.do_egg_install()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/install.py", line 109, in do_egg_install
    self.run_command('bdist_egg')
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/bdist_egg.py", line 164, in run
    cmd = self.call_command('install_lib', warn_dir=0)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/bdist_egg.py", line 150, in call_command
    self.run_command(cmdname)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/install_lib.py", line 11, in run
    self.build()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/install_lib.py", line 107, in build
    self.run_command('build_ext')
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/build_ext.py", line 79, in run    _build_ext.run(self)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/build_ext.py", line 339, in run
    self.build_extensions()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 653, in build_extensions
    build_ext.build_extensions(self)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/build_ext.py", line 448, in build_extensions
    self._build_extensions_serial()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/build_ext.py", line 473, in _build_extensions_serial
    self.build_extension(ext)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/build_ext.py", line 202, in build_extension
    _build_ext.build_extension(self, ext)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/build_ext.py", line 533, in build_extension
    depends=ext.depends)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 482, in unix_wrap_ninja_compile
    with_cuda=with_cuda)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 1238, in _write_ninja_file_and_compile_objects
    error_prefix='Error compiling objects for extension')
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 1538, in _run_ninja_build
    raise RuntimeError(message) from e
RuntimeError: Error compiling objects for extension

@rohit7044

In the first comment I mentioned cloning this repository and making the changes accordingly. Did you try that?
The steps I mentioned will only work on that repository. Have a look.

@MilesTheProwler

Yes, I followed every step you mentioned, but it still fails. My OS is Linux 20, with Python 3.6, PyTorch 1.7.0, and cudatoolkit 11.0.

running install
running bdist_egg
running egg_info
writing neural_renderer_pytorch.egg-info/PKG-INFO
writing dependency_links to neural_renderer_pytorch.egg-info/dependency_links.txt
writing top-level names to neural_renderer_pytorch.egg-info/top_level.txt
reading manifest file 'neural_renderer_pytorch.egg-info/SOURCES.txt'
adding license file 'LICENSE'
writing manifest file 'neural_renderer_pytorch.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build/lib.linux-x86_64-3.6
creating build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/__init__.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/perspective.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/vertices_to_faces.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/rasterize.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/projection.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/lighting.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/get_points_from_angles.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/look.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/look_at.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/save_obj.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/load_obj.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/mesh.py -> build/lib.linux-x86_64-3.6/neural_renderer
copying neural_renderer/renderer.py -> build/lib.linux-x86_64-3.6/neural_renderer
creating build/lib.linux-x86_64-3.6/neural_renderer/cuda
copying neural_renderer/cuda/__init__.py -> build/lib.linux-x86_64-3.6/neural_renderer/cuda
running build_ext
building 'neural_renderer.cuda.load_textures' extension
creating /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6
creating /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6/neural_renderer
creating /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6/neural_renderer/cuda
Emitting ninja build file /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6/build.ninja...
Compiling objects...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
[1/2] /usr/bin/nvcc -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/TH -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/THC -I/home/ninja/miniconda3/envs/Avatar/include/python3.6m -c -c /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda_kernel.cu -o /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6/neural_renderer/cuda/load_textures_cuda_kernel.o -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr --compiler-options ''"'"'-fPIC'"'"'' -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=load_textures -D_GLIBCXX_USE_CXX11_ABI=0 -gencode=arch=compute_86,code=sm_86 -std=c++14
FAILED: /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6/neural_renderer/cuda/load_textures_cuda_kernel.o
/usr/bin/nvcc -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/TH -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/THC -I/home/ninja/miniconda3/envs/Avatar/include/python3.6m -c -c /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda_kernel.cu -o /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6/neural_renderer/cuda/load_textures_cuda_kernel.o -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr --compiler-options ''"'"'-fPIC'"'"'' -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=load_textures -D_GLIBCXX_USE_CXX11_ABI=0 -gencode=arch=compute_86,code=sm_86 -std=c++14
/usr/include/c++/11/bits/std_function.h:435:145: error: parameter packs not expanded with ‘...’:
  435 |         function(_Functor&& __f)
      |                                                                                                                                                 ^
/usr/include/c++/11/bits/std_function.h:435:145: note:         ‘_ArgTypes’
/usr/include/c++/11/bits/std_function.h:530:146: error: parameter packs not expanded with ‘...’:
  530 |         operator=(_Functor&& __f)
      |                                                                                                                                                  ^
/usr/include/c++/11/bits/std_function.h:530:146: note:         ‘_ArgTypes’
[2/2] c++ -MMD -MF /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6/neural_renderer/cuda/load_textures_cuda.o.d -pthread -B /home/ninja/miniconda3/envs/Avatar/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/TH -I/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/THC -I/home/ninja/miniconda3/envs/Avatar/include/python3.6m -c -c /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp -o /home/ninja/neural_renderer/build/temp.linux-x86_64-3.6/neural_renderer/cuda/load_textures_cuda.o -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=load_textures -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++14
cc1plus: warning: command-line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Parallel.h:149,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/utils.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/nn/cloneable.h:5,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/nn.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:12,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ParallelOpenMP.h:84: warning: ignoring ‘#pragma omp parallel’ [-Wunknown-pragmas]
   84 | #pragma omp parallel for if ((end - begin) >= grain_size)
      |
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/DeviceType.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/Device.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/Allocator.h:6,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ATen.h:7,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/types.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp: In function ‘at::Tensor load_textures(at::Tensor, at::Tensor, at::Tensor, at::Tensor, int, int)’:
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:15:41: warning: ‘at::DeprecatedTypeProperties& at::Tensor::type() const’ is deprecated: Tensor.type() is deprecated. Instead use Tensor.options(), which in many cases (e.g. in a constructor) is a drop-in replacement. If you were using data from type(), that is now available from Tensor itself, so instead of tensor.type().scalar_type(), use tensor.scalar_type() instead and instead of tensor.type().backend() use tensor.device(). [-Wdeprecated-declarations]
   15 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
      |                                         ^
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/macros/Macros.h:171:65: note: in definition of macro ‘C10_UNLIKELY’
  171 | #define C10_UNLIKELY(expr)  (__builtin_expect(static_cast<bool>(expr), 0))
      |                                                                 ^~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:330:7: note: in expansion of macro ‘C10_UNLIKELY_OR_CONST’
  330 |   if (C10_UNLIKELY_OR_CONST(!(cond))) {                               \
      |       ^~~~~~~~~~~~~~~~~~~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:318:3: note: in expansion of macro ‘TORCH_CHECK_WITH_MSG’
  318 |   TORCH_CHECK_WITH_MSG(error_t, cond, "", __VA_ARGS__)
      |   ^~~~~~~~~~~~~~~~~~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:341:32: note: in expansion of macro ‘TORCH_CHECK_WITH’
  341 | #define TORCH_CHECK(cond, ...) TORCH_CHECK_WITH(Error, cond, __VA_ARGS__)
      |                                ^~~~~~~~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:15:23: note: in expansion of macro ‘TORCH_CHECK’
   15 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
      |                       ^~~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:17:24: note: in expansion of macro ‘CHECK_CUDA’
   17 | #define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)
      |                        ^~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:28:5: note: in expansion of macro ‘CHECK_INPUT’
   28 |     CHECK_INPUT(image);
      |     ^~~~~~~~~~~
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Tensor.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Context.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ATen.h:9,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/types.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/core/TensorBody.h:277:30: note: declared here
  277 |   DeprecatedTypeProperties & type() const {
      |                              ^~~~
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/DeviceType.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/Device.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/Allocator.h:6,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ATen.h:7,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/types.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:15:41: warning: ‘at::DeprecatedTypeProperties& at::Tensor::type() const’ is deprecated: Tensor.type() is deprecated. Instead use Tensor.options(), which in many cases (e.g. in a constructor) is a drop-in replacement. If you were using data from type(), that is now available from Tensor itself, so instead of tensor.type().scalar_type(), use tensor.scalar_type() instead and instead of tensor.type().backend() use tensor.device(). [-Wdeprecated-declarations]
   15 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
      |                                         ^
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/macros/Macros.h:171:65: note: in definition of macro ‘C10_UNLIKELY’
  171 | #define C10_UNLIKELY(expr)  (__builtin_expect(static_cast<bool>(expr), 0))
      |                                                                 ^~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:330:7: note: in expansion of macro ‘C10_UNLIKELY_OR_CONST’
  330 |   if (C10_UNLIKELY_OR_CONST(!(cond))) {                               \
      |       ^~~~~~~~~~~~~~~~~~~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:318:3: note: in expansion of macro ‘TORCH_CHECK_WITH_MSG’
  318 |   TORCH_CHECK_WITH_MSG(error_t, cond, "", __VA_ARGS__)
      |   ^~~~~~~~~~~~~~~~~~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:341:32: note: in expansion of macro ‘TORCH_CHECK_WITH’
  341 | #define TORCH_CHECK(cond, ...) TORCH_CHECK_WITH(Error, cond, __VA_ARGS__)
      |                                ^~~~~~~~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:15:23: note: in expansion of macro ‘TORCH_CHECK’
   15 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
      |                       ^~~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:17:24: note: in expansion of macro ‘CHECK_CUDA’
   17 | #define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)
      |                        ^~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:29:5: note: in expansion of macro ‘CHECK_INPUT’
   29 |     CHECK_INPUT(faces);
      |     ^~~~~~~~~~~
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Tensor.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Context.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ATen.h:9,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/types.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/core/TensorBody.h:277:30: note: declared here
  277 |   DeprecatedTypeProperties & type() const {
      |                              ^~~~
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/DeviceType.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/Device.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/Allocator.h:6,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ATen.h:7,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/types.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:15:41: warning: ‘at::DeprecatedTypeProperties& at::Tensor::type() const’ is deprecated: Tensor.type() is deprecated. Instead use Tensor.options(), which in many cases (e.g. in a constructor) is a drop-in replacement. If you were using data from type(), that is now available from Tensor itself, so instead of tensor.type().scalar_type(), use tensor.scalar_type() instead and instead of tensor.type().backend() use tensor.device(). [-Wdeprecated-declarations]
   15 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
      |                                         ^
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/macros/Macros.h:171:65: note: in definition of macro ‘C10_UNLIKELY’
  171 | #define C10_UNLIKELY(expr)  (__builtin_expect(static_cast<bool>(expr), 0))
      |                                                                 ^~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:330:7: note: in expansion of macro ‘C10_UNLIKELY_OR_CONST’
  330 |   if (C10_UNLIKELY_OR_CONST(!(cond))) {                               \
      |       ^~~~~~~~~~~~~~~~~~~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:318:3: note: in expansion of macro ‘TORCH_CHECK_WITH_MSG’
  318 |   TORCH_CHECK_WITH_MSG(error_t, cond, "", __VA_ARGS__)
      |   ^~~~~~~~~~~~~~~~~~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:341:32: note: in expansion of macro ‘TORCH_CHECK_WITH’
  341 | #define TORCH_CHECK(cond, ...) TORCH_CHECK_WITH(Error, cond, __VA_ARGS__)
      |                                ^~~~~~~~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:15:23: note: in expansion of macro ‘TORCH_CHECK’
   15 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
      |                       ^~~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:17:24: note: in expansion of macro ‘CHECK_CUDA’
   17 | #define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)
      |                        ^~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:30:5: note: in expansion of macro ‘CHECK_INPUT’
   30 |     CHECK_INPUT(is_update);
      |     ^~~~~~~~~~~
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Tensor.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Context.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ATen.h:9,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/types.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/core/TensorBody.h:277:30: note: declared here
  277 |   DeprecatedTypeProperties & type() const {
      |                              ^~~~
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/DeviceType.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/Device.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/core/Allocator.h:6,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ATen.h:7,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/types.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:15:41: warning: ‘at::DeprecatedTypeProperties& at::Tensor::type() const’ is deprecated: Tensor.type() is deprecated. Instead use Tensor.options(), which in many cases (e.g. in a constructor) is a drop-in replacement. If you were using data from type(), that is now available from Tensor itself, so instead of tensor.type().scalar_type(), use tensor.scalar_type() instead and instead of tensor.type().backend() use tensor.device(). [-Wdeprecated-declarations]
   15 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
      |                                         ^
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/macros/Macros.h:171:65: note: in definition of macro ‘C10_UNLIKELY’
  171 | #define C10_UNLIKELY(expr)  (__builtin_expect(static_cast<bool>(expr), 0))
      |                                                                 ^~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:330:7: note: in expansion of macro ‘C10_UNLIKELY_OR_CONST’
  330 |   if (C10_UNLIKELY_OR_CONST(!(cond))) {                               \
      |       ^~~~~~~~~~~~~~~~~~~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:318:3: note: in expansion of macro ‘TORCH_CHECK_WITH_MSG’
  318 |   TORCH_CHECK_WITH_MSG(error_t, cond, "", __VA_ARGS__)
      |   ^~~~~~~~~~~~~~~~~~~~
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/c10/util/Exception.h:341:32: note: in expansion of macro ‘TORCH_CHECK_WITH’
  341 | #define TORCH_CHECK(cond, ...) TORCH_CHECK_WITH(Error, cond, __VA_ARGS__)
      |                                ^~~~~~~~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:15:23: note: in expansion of macro ‘TORCH_CHECK’
   15 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
      |                       ^~~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:17:24: note: in expansion of macro ‘CHECK_CUDA’
   17 | #define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)
      |                        ^~~~~~~~~~
/home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:31:5: note: in expansion of macro ‘CHECK_INPUT’
   31 |     CHECK_INPUT(textures);
      |     ^~~~~~~~~~~
In file included from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Tensor.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/Context.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/ATen.h:9,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/types.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
                 from /home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/torch/csrc/api/include/torch/torch.h:3,
                 from /home/ninja/neural_renderer/neural_renderer/cuda/load_textures_cuda.cpp:1:
/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/include/ATen/core/TensorBody.h:277:30: note: declared here
  277 |   DeprecatedTypeProperties & type() const {
      |                              ^~~~
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 1522, in _run_ninja_build
    env=env)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['ninja', '-v']' returned non-zero exit status 1.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "setup.py", line 40, in <module>
    cmdclass = {'build_ext': BuildExtension}
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/__init__.py", line 153, in setup
    return distutils.core.setup(**attrs)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/install.py", line 67, in run
    self.do_egg_install()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/install.py", line 109, in do_egg_install
    self.run_command('bdist_egg')
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/bdist_egg.py", line 164, in run
    cmd = self.call_command('install_lib', warn_dir=0)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/bdist_egg.py", line 150, in call_command
    self.run_command(cmdname)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/install_lib.py", line 11, in run
    self.build()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/install_lib.py", line 107, in build
    self.run_command('build_ext')
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/build_ext.py", line 79, in run    _build_ext.run(self)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/build_ext.py", line 339, in run
    self.build_extensions()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 653, in build_extensions
    build_ext.build_extensions(self)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/build_ext.py", line 448, in build_extensions
    self._build_extensions_serial()
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/build_ext.py", line 473, in _build_extensions_serial
    self.build_extension(ext)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/setuptools/command/build_ext.py", line 202, in build_extension
    _build_ext.build_extension(self, ext)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/distutils/command/build_ext.py", line 533, in build_extension
    depends=ext.depends)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 482, in unix_wrap_ninja_compile
    with_cuda=with_cuda)
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 1238, in _write_ninja_file_and_compile_objects
    error_prefix='Error compiling objects for extension')
  File "/home/ninja/miniconda3/envs/Avatar/lib/python3.6/site-packages/torch/utils/cpp_extension.py", line 1538, in _run_ninja_build
    raise RuntimeError(message) from e
RuntimeError: Error compiling objects for extension

@rohit7044

Hi, as mentioned in the repo introduction, the code is not stable for Python 3+. Try downgrading your versions to match the requirements and give it another try.
