This repository has been archived by the owner on Nov 1, 2023. It is now read-only.

Do you have an example under windows? #3

Open

yeyuxmf opened this issue Oct 11, 2019 · 17 comments

Comments


yeyuxmf commented Oct 11, 2019

The custom layer uses the JIT mechanism to be invoked on the C++ side, which seems to require registration in the C++ source, such as:

#include <torch/script.h>

torch::Tensor warp_perspective(torch::Tensor image, torch::Tensor warp) {
  torch::Tensor output = torch::add(image, warp);
  return output.clone();
}

static auto registry =
    torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective);

This is how it is done under Ubuntu. Do you have an example under Windows? I don't know how to run the example provided in extension-script under Windows.


soumith commented Oct 11, 2019

cc: @peterjc123


peterjc123 commented Oct 12, 2019

Have you experienced any error when using this piece of code? The suggested way is to write a CMake script like this and then run mkdir build && cd build, cmake .., and cmake --build .
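
For reference, one common way to use a library built that way is to load it explicitly from Python; a minimal sketch, assuming the CMake build produces a shared library at build/Release/warp_perspective.dll (the path is hypothetical):

import torch

# Loading the shared library runs its static initializers, which is what
# registers my_ops::warp_perspective with the JIT.
torch.ops.load_library("build/Release/warp_perspective.dll")  # hypothetical path

# Once loaded, the registered op is callable through the torch.ops namespace.
print(torch.ops.my_ops.warp_perspective(torch.randn(4, 4), torch.randn(4, 4)))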


yeyuxmf commented Oct 12, 2019

@peterjc123 Thank you very much for your answer. First, I tried this under Windows. It compiles and builds fine. But the problem is that I want to use the torch.jit.script mechanism to export the model so that it can be easily deployed on the C++ side, and I couldn't successfully export the model.
The code is as follows:
import torch
import warp_perspective

@torch.jit.script
def compute(x, y):
    x = torch.ops.my_ops.warp_perspective.forward(y, y)
    return x.matmul(y)

print(compute.graph)
print(compute(torch.randn(8, 8), torch.randn(8, 8)))
compute.save("example.pt")

The implementation of the custom warp_perspective function is as follows (warp_perspective.cpp):
#include <torch/script.h>
#include <torch/extension.h>
torch::Tensor warp_perspective_forward(torch::Tensor image, torch::Tensor warp) {
  torch::Tensor output = torch::add(image, warp);
  return output.clone();
}

static auto registry =
    torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective_forward);

PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
  m.def("forward", &warp_perspective_forward, "WARP_PERSPECTIVE forward (CUDA)");
}

@peterjc123

@huang229 So the issue is on the python side, right? What error does it throw?


yeyuxmf commented Oct 12, 2019

@peterjc123 I think the problem is that the warp_perspective function has not been successfully registered with the JIT mechanism. Therefore, the JIT mechanism cannot be used to export the model on the Python side. I don't know what to do.

@peterjc123

If you remove x = torch.ops.my_ops.warp_perspective.forward(y, y), can you manage to export the model?


yeyuxmf commented Oct 12, 2019

With x = torch.add(y, y) instead, the model can be exported successfully.

@peterjc123

Okay, another question, will x = torch.ops.my_ops.warp_perspective.forward(y, y) work when not wrapped up with @torch.jit.script?


yeyuxmf commented Oct 12, 2019

Yes. It reports an error directly.
The error is as follows:
  File "D:\python-3.7.3\lib\site-packages\torch\jit\__init__.py", line 1077, in _compile_function
    script_fn = torch._C._jit_script_compile(qualified_name, ast, _rcb, get_default_args(fn))


yeyuxmf commented Oct 12, 2019

Without @torch.jit.script, the function has no save attribute.


peterjc123 commented Oct 12, 2019

I think there should be something wrong with the jit compiler code. It will be easier to deal with if you could post a full C++ stack trace.


yeyuxmf commented Oct 12, 2019

@peterjc123 The script (warp_perspective.cpp) compiles successfully and no errors are displayed. I suspect that the following code doesn't take effect under Windows:
static auto registry = torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective_forward);

The errors reported when exporting the model are as follows:
Connected to pydev debugger (build 191.6605.12)
Traceback (most recent call last):
File "D:\PyCharm Community Edition 2019.1.1\helpers\pydev\pydevd.py", line 1741, in
main()
File "D:\PyCharm Community Edition 2019.1.1\helpers\pydev\pydevd.py", line 1735, in main
globals = debugger.run(setup['file'], None, None, is_module)
File "D:\PyCharm Community Edition 2019.1.1\helpers\pydev\pydevd.py", line 1135, in run
pydev_imports.execfile(file, globals, locals) # execute the script
File "D:\PyCharm Community Edition 2019.1.1\helpers\pydev_pydev_imps_pydev_execfile.py", line 18, in execfile
exec(compile(contents+"\n", file, 'exec'), glob, loc)
File "D:/vs+opencv/tmp/script.py", line 4, in
@torch.jit.script
File "D:\python-3.7.3\lib\site-packages\torch\jit_init_.py", line 1181, in script
return _compile_function(fn=obj, qualified_name=qualified_name, _frames_up=_frames_up + 1, _rcb=rcb)
File "D:\python-3.7.3\lib\site-packages\torch\jit_init
.py", line 1077, in _compile_function
script_fn = torch._C._jit_script_compile(qualified_name, ast, _rcb, get_default_args(fn))
RuntimeError:
attribute lookup is not defined on builtin:
at D:/vs+opencv/tmp/script.py:6:9
@torch.jit.script
def compute(x, y):
    x = torch.ops.my_ops.warp_perspective.forward(y, y)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
    return x.matmul(y)
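
For comparison, a minimal sketch of how the call is written in the custom-op tutorials (assuming the op was registered as my_ops::warp_perspective): the registered op is exposed directly as a callable under torch.ops, with no .forward attribute, and that extra attribute lookup on a builtin may be what this error is pointing at.

import torch
import warp_perspective  # importing the extension triggers the static registration

@torch.jit.script
def compute(x, y):
    # Call the registered op directly; it is a JIT builtin, not a module with .forward.
    x = torch.ops.my_ops.warp_perspective(y, y)
    return x.matmul(y)

compute.save("example.pt")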

@peterjc123

cc @ezyang Do you know what might be the reason? I'm not so familiar with the JIT things here.


yeyuxmf commented Oct 12, 2019

Here's all my code. It contains three files: the C++ source (.cpp), setup.py, and script.py.
cpp:
#include <torch/script.h>
#include <torch/extension.h>
torch::Tensor warp_perspective_forward(torch::Tensor image, torch::Tensor warp) {
  torch::Tensor output = torch::add(image, warp);
  return output.clone();
}

static auto registry =
    torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective_forward);

PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
  m.def("forward", &warp_perspective_forward, "WARP_PERSPECTIVE forward (CUDA)");
}

setup.py:
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CppExtension

setup(
    name="warp_perspective",
    ext_modules=[
        CppExtension(
            "warp_perspective",
            ["example_app/warp_perspective/op.cpp"],
        )
    ],
    cmdclass={"build_ext": BuildExtension},
)

script.py:
import torch
import warp_perspective

@torch.jit.script
def compute(x, y):
    x = torch.ops.my_ops.warp_perspective.forward(y, y)
    return x.matmul(y)

print(compute.graph)
print(compute(torch.randn(8, 8), torch.randn(8, 8)))
compute.save("example.pt")


yeyuxmf commented Oct 12, 2019

@peterjc123 Thank you very much for taking the time to help me.
@ezyang Hello, I want to get this working under Windows.


ezyang commented Oct 14, 2019

There might be a problem with how static initialization works on Windows, which wouldn't surprise me, as we don't test as heavily on Windows. Does it work if you move the static initialization to, e.g., a main function?
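
A quick Python-side check along those lines (a sketch, assuming the extension built by setup.py imports as warp_perspective): if the static registration ran, the op is reachable under torch.ops; if it did not, the attribute lookup itself raises a RuntimeError about a missing operator.

import torch
import warp_perspective  # importing the extension should run its static initializers

# Resolves only if my_ops::warp_perspective was actually registered;
# otherwise this lookup raises a RuntimeError (no such operator).
op = torch.ops.my_ops.warp_perspective
print(op(torch.randn(2, 2), torch.randn(2, 2)))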


yeyuxmf commented Oct 15, 2019

@ezyang Thank you very much for taking the time to help me. As I understand from your official documents, when using a Python extension compiled with python setup.py install under Windows, the JIT mechanism of torch does not support exporting models. Only after registration can the model be exported on the Python side and deployed on the C++ side.
Therefore, is there any way to register the extended C++ functions with the JIT mechanism under Windows? According to the tutorial you provided, the extended C++ function can only export the trained model on the Python side if it is registered with the JIT mechanism.
The registration code is as follows:
static auto registry = torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective_forward);
I don't know if it's convenient for you to provide an example of an application under Windows. I would be very grateful for that.
