
Windows NNVM Module export_library error #2450

Closed

chinakook opened this issue Jan 17, 2019 · 18 comments

@chinakook
Contributor

Traceback (most recent call last):
  File "deploy_ssd_tvm.py", line 40, in <module>
    lib.export_library("./deployed_model/deploy_lib.dll", fcompile=False)
  File "C:\Users\*****\AppData\Roaming\Python\Python37\site-packages\tvm-0.5.dev0-py3.7-win-amd64.egg\tvm\module.py", line 128, in export_library
    fcompile(file_name, files, **kwargs)
  File "C:\Users\*****\AppData\Roaming\Python\Python37\site-packages\tvm-0.5.dev0-py3.7-win-amd64.egg\tvm\contrib\cc.py", line 35, in create_shared
    _windows_shared(output, objects, options)
  File "C:\Users\*****\AppData\Roaming\Python\Python37\site-packages\tvm-0.5.dev0-py3.7-win-amd64.egg\tvm\contrib\cc.py", line 120, in _windows_shared
    raise RuntimeError(msg)
RuntimeError: Compilation error:
Microsoft (R) Incremental Linker Version 14.16.27026.1
Copyright (C) Microsoft Corporation.  All rights reserved.

lib.o : fatal error LNK1143: invalid or corrupt file: no symbol for COMDAT section 0x5
@tqchen
Member

tqchen commented Jan 17, 2019

Please open a new thread in https://discuss.tvm.ai/

@tqchen tqchen closed this as completed Jan 17, 2019
@mnboos
Contributor

mnboos commented Jan 29, 2019

I'm having the same problem. Is there any solution?

@mnboos
Contributor

mnboos commented Jan 29, 2019

I was able to export the library.

I'm not 100% sure I fully understand what happened. As far as I know, it's a bad idea to mix different compilers, and since I used the llvm target, the export failed because it invoked the MSVC compiler instead of LLVM. The same goes for the linking.

So the solution was:

  1. Update tvm/contrib/cc.py as follows:
--- a/cc.py
+++ b/cc_updated.py
@@ -59,7 +59,7 @@ def _linux_shared(output, objects, options, cc="g++"):
 
 
 def _windows_shared(output, objects, options):
-    cl_cmd = ["cl"]
+    cl_cmd = [r"C:\Program Files\LLVM\bin\clang-cl"]
     cl_cmd += ["-c"]
     if isinstance(objects, str):
         objects = [objects]
@@ -79,7 +79,7 @@ BOOL APIENTRY DllMain( HMODULE hModule,\
     cl_cmd += [dllmain_path]
 
     temp_path = dllmain_path.replace("dllmain.cc", "")
-    cl_cmd += ["-Fo:" + temp_path]
+    cl_cmd += ["-Fo" + temp_path]
     try:
         proc = subprocess.Popen(
             cl_cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
@@ -91,7 +91,7 @@ BOOL APIENTRY DllMain( HMODULE hModule,\
         msg = "Compilation error:\n"
         msg += py_str(out)
         raise RuntimeError(msg)
-    link_cmd = ["link"]
+    link_cmd = [r"C:\Program Files\LLVM\bin\lld-link"]
     link_cmd += ["-dll", "-FORCE:MULTIPLE"]
 
     for obj in objects:
  2. Run the script in the Visual Studio Command Prompt so that all the required libraries are found.

edit: I remembered I built tvm like this:

cmake -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ -Thost=x64 -G "Visual Studio 15 2017 Win64" -DCMAKE_BUILD_TYPE=Release -DCMAKE_CONFIGURATION_TYPES="Release" -DUSE_LLVM=C:/src/llvm-7.0.1.src/build/Release/bin/llvm-config.exe ..

It might be that the solution above was only necessary because I explicitly specified clang and clang++ when building TVM.
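
For reference, the same idea can be tried without editing the installed egg: export_library accepts a custom fcompile callable and, as the tracebacks above show, calls it with the output file name and the object files, which is the same signature as tvm.contrib.cc.create_shared. The sketch below only demonstrates the hook; its body just delegates to the stock create_shared, and a real workaround would issue the clang-cl / lld-link commands from the patch above in its place.

from tvm.contrib import cc

def my_fcompile(output, objects, options=None):
    # Same signature export_library expects for fcompile
    # (see the traceback: fcompile(file_name, files, **kwargs)).
    # A full workaround would invoke clang-cl / lld-link here, mirroring
    # the patched _windows_shared above; this placeholder just delegates
    # to the default implementation so the hook itself is visible.
    print("building shared library", output, "from", objects)
    return cc.create_shared(output, objects, options)

# lib.export_library("deploy_lib.dll", fcompile=my_fcompile)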

@chinakook
Contributor Author

@mnboos
Contributor

mnboos commented Jan 30, 2019

I would consider that a workaround for an unsolved problem, not a solution. With the patch above it's possible to make export_library work on Windows. Maybe an expert could comment on my approach, @tqchen?

@notebookdata

Traceback (most recent call last):
  File "D:\WorkSapmles\tvm_samples\tvm_samples\tvm_samples.py", line 108, in <module>
    lib.export_library("D:\tvm\tvm\nnvm\python\deploy_lib.dll", fcompile=False)
  File "C:\anaconda3\lib\site-packages\tvm-0.5.dev0-py3.5-win-amd64.egg\tvm\module.py", line 128, in export_library
    fcompile(file_name, files, **kwargs)
  File "C:\anaconda3\lib\site-packages\tvm-0.5.dev0-py3.5-win-amd64.egg\tvm\contrib\cc.py", line 35, in create_shared
    _windows_shared(output, objects, options)
  File "C:\anaconda3\lib\site-packages\tvm-0.5.dev0-py3.5-win-amd64.egg\tvm\contrib\cc.py", line 93, in _windows_shared
    raise RuntimeError(msg)
RuntimeError: Compilation error:
Microsoft (R) C/C++ Optimizing Compiler Version 19.00.24215.1 for x86
Copyright (C) Microsoft Corporation. All rights reserved.

dllmain.cc
C:\Users\700001~1\AppData\Local\Temp\1\tmp7kgj77e3\dllmain.cc(1): fatal error C1034: windows.h: no include path set

I'm still not able to export the TVM model using the lib.export_library function on Windows; the same works on Linux.
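
For what it's worth, C1034 ("no include path set") usually just means the interpreter was started from a shell where the MSVC environment variables are missing; cl.exe locates windows.h through the INCLUDE variable that the Visual Studio Developer Command Prompt (or vcvars64.bat) sets up. A small sanity check one could add before calling export_library (only a sketch, not part of TVM):

import os

# cl.exe reads its header search path from the INCLUDE environment variable,
# which is only populated inside a Visual Studio (Developer) Command Prompt
# or after running vcvars64.bat. Without it, the dllmain.cc compile step
# inside _windows_shared fails exactly as above.
if not os.environ.get("INCLUDE"):
    raise RuntimeError(
        "MSVC environment not initialized; start the script from a "
        "Visual Studio Command Prompt or run vcvars64.bat first")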

@mnboos
Contributor

mnboos commented Jan 30, 2019

@myproject24 Did you run the script in the Visual Studio Command Prompt?

@chinakook
Contributor Author

export_library doesn't work on Windows at all, because the *.o file cannot be compiled by cl.exe at all.

@mnboos
Contributor

mnboos commented Jan 30, 2019

Did you read my comment above that describes how to solve this problem?

@notebookdata

@mnboos I have built LLVM and TVM using the Visual Studio Command Prompt; the build succeeded and I got all the libraries, following this TVM build document: https://docs.tvm.ai/install/from_source.html

Building on Windows

TVM supports building via MSVC using CMake. The minimum required VS version is Visual Studio Community 2015 Update 3. In order to generate the VS solution file using CMake, make sure you have a recent version of CMake added to your path, and then from the tvm directory:

mkdir build
cd build
cmake -G "Visual Studio 14 2015 Win64" -DCMAKE_BUILD_TYPE=Release -DCMAKE_CONFIGURATION_TYPES="Release" ..

This will generate the VS project using the MSVC 14 64-bit generator. Open the .sln file in the build directory and build with Visual Studio. In order to build with LLVM on Windows, you will need to build LLVM from source. You also need to build nnvm by running the same steps under the nnvm folder.

Install tvm python bindings by setup.py:

# install tvm package for the current user
# NOTE: if you installed python via homebrew, --user is not needed during installation
#       it will be automatically installed to your user directory.
#       providing --user flag may trigger error during installation in such case.
export MACOSX_DEPLOYMENT_TARGET=10.9  # This is required for mac to avoid symbol conflicts with libstdc++
cd python; python setup.py install --user; cd ..
cd topi/python; python setup.py install --user; cd ../..
cd nnvm/python; python setup.py install --user; cd ../..

Python dependencies

    Necessary dependencies:

pip install --user numpy decorator

    If you want to use RPC Tracker

pip install --user tornado

    If you want to use auto-tuning module

pip install --user tornado psutil xgboost

After doing all the steps successfully I tried to run the TVM sample program from https://docs.tvm.ai/tutorials/frontend/from_onnx.html#sphx-glr-tutorials-frontend-from-onnx-py and this works fine.
But model conversion is a problem: in the TVM tutorial https://docs.tvm.ai/tutorials/nnvm_quick_start.html#sphx-glr-tutorials-nnvm-quick-start-py, the "Save and Load Compiled Module" step is failing.
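
For context, the part of that tutorial that fails here boils down to the same export_library call as in the tracebacks above; saving the graph JSON and the parameters is plain file I/O and is not Windows-specific. A rough sketch, assuming graph, lib and params come from nnvm.compiler.build with target="llvm":

import nnvm.compiler

# `graph`, `lib`, `params` are assumed to come from nnvm.compiler.build(...).
lib.export_library("deploy_lib.dll")   # this is the call that fails on Windows:
                                       # it dispatches to tvm.contrib.cc.create_shared,
                                       # which shells out to cl.exe / link.exe

with open("deploy_graph.json", "w") as f:
    f.write(graph.json())              # graph definition, plain JSON

with open("deploy_param.params", "wb") as f:
    f.write(nnvm.compiler.save_param_dict(params))   # parameter blob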

@zaidao2023

can you fix this problem?

@mnboos
Contributor

mnboos commented Feb 22, 2019

It worked for me after patching TVM with the patch I showed above. After some testing I reduced it to the following:

diff --git a/python/tvm/contrib/cc.py b/python/tvm/contrib/cc.py
index 0ffa6c42..906bf4cc 100644
--- a/python/tvm/contrib/cc.py
+++ b/python/tvm/contrib/cc.py
@@ -91,7 +91,7 @@ BOOL APIENTRY DllMain( HMODULE hModule,\
         msg = "Compilation error:\n"
         msg += py_str(out)
         raise RuntimeError(msg)
-    link_cmd = ["link"]
+    link_cmd = ["lld-link"]
     link_cmd += ["-dll", "-FORCE:MULTIPLE"]

     for obj in objects:

Then apply it:

set PATH=%path%;C:\Program Files\Git\bin
bash.exe -c 'patch -p1 ~/AppData/Roaming/Python/Python37/site-packages/tvm-0.5.dev0-py3.7-win-amd64.egg/tvm/contrib/cc.py cc.py.patch'

Notes:

  1. You must build LLD with LLVM
  2. The directory of the built binaries (like lld-link.exe) must be added to the PATH variable (or change the patch to include the full path to lld-link).
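
Regarding point 2: the patched _windows_shared launches the linker through subprocess with the bare name lld-link, so it only has to be resolvable via the PATH of the Python process. A small sketch of checking (and, if needed, fixing) that from the script itself, assuming a default LLVM install location:

import os
import shutil

llvm_bin = r"C:\Program Files\LLVM\bin"   # example location; adjust to your LLVM/LLD build

if shutil.which("lld-link") is None:
    # subprocess inherits os.environ, so prepending here is enough for the
    # patched cc.py to find lld-link without touching the system PATH.
    os.environ["PATH"] = llvm_bin + os.pathsep + os.environ["PATH"]

assert shutil.which("lld-link") is not None, "lld-link not found; check the LLVM bin directory"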

@zaidao2023

thanks

@tqchen
Member

tqchen commented Feb 22, 2019

@mnboos Do you mind sending a patch?

@mnboos
Contributor

mnboos commented Feb 25, 2019

@tqchen Do you mean as a PR?

@tqchen
Member

tqchen commented Feb 25, 2019

Yes please

@mnboos mnboos mentioned this issue Mar 1, 2019
@mnboos
Contributor

mnboos commented Mar 1, 2019

Okay done.

See: #2713

@chinakook
Contributor Author

LGTM.
