
🐛 [Bug] Windows native library support (works!) #2371

Closed
Tracked by #2495
markivfb opened this issue Oct 8, 2023 · 8 comments · Fixed by #2806
Labels
bug Something isn't working

Comments

markivfb commented Oct 8, 2023

Bug Description

Hi, based on some past posts (like this, this, and this), I was able to get a Bazel build of the C++ library working on Windows, with both model compilation and inference, by building torch_tensorrt.dll, which I believe includes both the runtime and the compilation plugins.

I am wondering what it would take to get official Windows support. I've attached a full patch; I think the declspec may not be needed.

Once this builds, all one has to do to use torch_tensorrt in their project is link against torch_tensorrt.dll.if.lib and ensure torch_tensorrt.dll is loaded with LoadLibrary, as mentioned in other issue reports:

    #include <windows.h>
    #include <cstdlib>
    #include <iostream>

    // Load the Torch-TensorRT DLL so its compiler/runtime extensions get registered.
    HMODULE hLib = LoadLibrary(TEXT("torch_tensorrt"));
    if (hLib == NULL) {
        std::cerr << "Library torch_tensorrt.dll not found" << std::endl;
        exit(1);
    }
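
For context, a minimal sketch of what compilation and inference then look like from C++, assuming the Torch-TensorRT TorchScript C++ API (torch_tensorrt::ts::compile); the model path and input shape below are just placeholders:

    #include <torch/script.h>
    #include "torch_tensorrt/torch_tensorrt.h"

    // Placeholder model and input shape; adjust to your own module.
    auto mod = torch::jit::load("model.ts");
    mod.to(torch::kCUDA);
    mod.eval();

    // Describe the expected input and build a TensorRT-backed module.
    auto spec = torch_tensorrt::ts::CompileSpec(
        {torch_tensorrt::Input(std::vector<int64_t>{1, 3, 224, 224})});
    auto trt_mod = torch_tensorrt::ts::compile(mod, spec);

    // Inference goes through the usual TorchScript forward() call.
    auto out = trt_mod.forward({torch::randn({1, 3, 224, 224}, torch::kCUDA)});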

More info about exact versions of libraries used:
cuda_11.8.0_522.06_windows.exe
TensorRT-8.6.1.6.Windows10.x86_64.cuda-11.8
libtorch-win-shared-with-deps-2.0.1+cu118
cudnn-windows-x86_64-8.8.0.121_cuda11-archive

Build command:
bazel-6.3.2-windows-x86_64.exe build //:libtorchtrt --compilation_mode opt

Environment

Build information about Torch-TensorRT can be found by turning on debug messages

  • Torch-TensorRT Version (e.g. 1.0.0): 8.6.1.6
  • PyTorch Version (e.g. 1.0): 2.0.1
  • CPU Architecture:
  • OS (e.g., Linux): Windows
  • How you installed PyTorch (conda, pip, libtorch, source): libtorch, source
  • Build command you used (if compiling from source): bazel
  • Are you using local sources or building from archives: latest github?
  • Python version:
  • CUDA version: 11.8
  • GPU models and configuration: 3080Ti
  • Any other relevant information:

Additional context

0001-WINDOWS-SUPPORT.patch

markivfb added the bug label on Oct 8, 2023
@jensdraht1999

@markivfb Can you use this with PyTorch, or just with native C/C++?

If yes, can you tell me what you did with the DLLs and where you put them?

@Justin62628

This script works on my CUDA 11.7 setup too, with torch 2.0.1; torch_tensorrt.dll is built successfully. Thanks for your marvelous work.

@jensdraht1999

@Justin62628 I have built torch_tensorrt.dll successfully too, though in a somewhat more complicated way. However: what do I actually do with these DLL files for PyTorch?

Justin62628 commented Oct 21, 2023

> @Justin62628 I have built torch_tensorrt.dll successfully too, though in a somewhat more complicated way. However: what do I actually do with these DLL files for PyTorch?

So far I can only use these DLL files with libtorch.
Normally you would compile a torch_tensorrt wheel for Python first, which means running python setup.py install from this repo to get the _C.pyd extension for torch_tensorrt (the setup script does all of that for you and installs the torch_tensorrt Python API into your site-packages); after that you can import torch_tensorrt in your Python project.
I have managed to compile the torch_tensorrt _C.pyd with a modified build command, bazel build //:libtorchtrt --compilation_mode opt --features=windows_export_all_symbols, but there seem to be some linking and symbol-export mistakes: I get an "external DLLs not found" error when trying to import torch_tensorrt into a Python project like yours.
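
For reference, a minimal sketch of that sequence (only the commands already mentioned above, plus a trivial import check; exact paths and environment setup will differ per machine):

    # Build the native library with all symbols exported
    bazel build //:libtorchtrt --compilation_mode opt --features=windows_export_all_symbols

    # Build the _C.pyd extension and install the torch_tensorrt Python API into site-packages
    python setup.py install

    # Sanity check; this import is where the "external DLLs not found" error currently shows up
    python -c "import torch, torch_tensorrt"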

@jensdraht1999

@Justin62628 Could you share your wheel file for Python, so I could install it? You can upload it as an image file here, and then I can rename the extension back to .whl and install it.

@phineas-pta

For me, I also had to edit the .bazelrc file: change -std=c++17 to /std:c++17.

@jensdraht1999

> For me, I also had to edit the .bazelrc file: change -std=c++17 to /std:c++17.

Did you manage to get it working with PyTorch from Python somehow?

@phineas-pta

@jensdraht1999 The wheel built successfully but is unusable (DLL linking error like #856).
