Windows NNVM Module export_library error #2450
Comments
Please open a new thread in https://discuss.tvm.ai/
I have the same problem. Is there any solution?
I was able to export the library, though I'm not 100% sure I fully understand what happened. As far as I know, it's a bad idea to mix different compilers, and since I used the llvm target, the export fails because it uses the MSVC compiler instead of LLVM. The same goes for the linking. So the solution was:
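As a rough sketch of the idea rather than the actual patch (the patch later became the pull request linked at the end of this thread), `export_library` can be handed a custom `fcompile` that links the generated object files with clang instead of cl.exe. The clang path and the toy `add_one` module below are placeholder assumptions, written against the NNVM-era (TVM 0.x) API:

```python
import subprocess
import tvm

# Toy module, just so export_library has something to write out
# (NNVM-era TVM 0.x API).
n = tvm.var("n")
A = tvm.placeholder((n,), name="A")
B = tvm.compute(A.shape, lambda i: A[i] + 1.0, name="B")
s = tvm.create_schedule(B.op)
lib = tvm.build(s, [A, B], target="llvm", name="add_one")

# Hypothetical clang location; adjust to the local LLVM install.
CLANG = r"C:\Program Files\LLVM\bin\clang.exe"

def clang_fcompile(output, objects, options=None):
    # Link the object files emitted by TVM into a DLL with clang
    # instead of cl.exe (a sketch of the workaround, not the TVM patch).
    cmd = [CLANG, "-shared", "-o", output] + list(objects)
    if options:
        cmd += options
    subprocess.check_call(cmd)

lib.export_library("add_one.dll", fcompile=clang_fcompile)
```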
Edit: I remembered that I built TVM like this:
It might be possible; the solution above was only necessary because I explicitly specified
I would consider that a workaround for an unsolved problem, not a solution. With the patch above it's possible to make export_library work.
Traceback (most recent call last): dllmain.cc. Still not able to export a TVM model using the lib.export_library function on Windows; the same works on Linux.
@myproject24 Did you run the script in the Visual Studio Command Prompt?
export_library does not work on Windows at all, because cl.exe cannot handle the generated *.o object files.
Did you read my comment above that describes how to solve this problem? |
@mnboos I have built LLVM and TVM using the Visual Studio command prompt; the build succeeded and I got all the libraries. I followed the TVM build document at https://docs.tvm.ai/install/from_source.html, which says roughly the following about building on Windows:

TVM supports building via MSVC using cmake. The minimum required VS version is Visual Studio Community 2015 Update 3. In order to generate the VS solution file using cmake, make sure you have a recent version of cmake added to your path, and then from the tvm directory run mkdir build followed by the cmake invocation from the docs. This generates the VS project using the MSVC 14 64 bit generator. Open the .sln file in the build directory and build with Visual Studio. In order to build with LLVM on Windows, you will need to build LLVM from source.

I then built nnvm by running the same script under the nnvm folder and installed the Python dependencies.
After doing all the steps successfully, I tried to run the TVM sample program from https://docs.tvm.ai/tutorials/frontend/from_onnx.html#sphx-glr-tutorials-frontend-from-onnx-py and it works fine.
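Roughly, the flow from that tutorial, with the export step that fails added at the end, looks like the sketch below (NNVM-era API; the model file, input name, and shape are placeholders, not the actual values from this report):

```python
import onnx
import tvm
import nnvm.compiler
import nnvm.frontend

# Requires a TVM build with LLVM enabled.
assert tvm.module.enabled("llvm"), "TVM was not built with LLVM support"

# "model.onnx" is a placeholder path.
onnx_model = onnx.load("model.onnx")
sym, params = nnvm.frontend.from_onnx(onnx_model)

target = "llvm"
shape_dict = {"input_0": (1, 3, 224, 224)}  # depends on the actual model
graph, lib, params = nnvm.compiler.build(sym, target, shape_dict, params=params)

# This is the step that fails on Windows when cl.exe is picked up for
# the *.o files; running from the Visual Studio command prompt or using
# a clang-based fcompile (see the earlier sketch) works around it.
lib.export_library("compiled_model.dll")
```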
Can you fix this problem?
It worked for me after patching TVM with the patch I showed above. After some testing I reduced it to the following:
Then apply it:
Notes:
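Independent of the exact patch, a quick sanity check that an export on Windows actually succeeded is to load the resulting DLL back into the TVM runtime. A minimal sketch, reusing the placeholder add_one.dll name from the earlier example:

```python
import numpy as np
import tvm

# Load the DLL produced by export_library back into the runtime and
# call the packed function it contains (file and function names match
# the earlier toy example; both are placeholders).
loaded = tvm.module.load("add_one.dll")
fadd = loaded["add_one"]

n = 16
a = tvm.nd.array(np.arange(n, dtype="float32"))
b = tvm.nd.array(np.zeros(n, dtype="float32"))
fadd(a, b)
print(b.asnumpy())  # expect the input values plus one
```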
Thanks.
@mnboos Do you mind sending a patch?
@tqchen Do you mean as a PR?
Yes please |
Okay done. See: #2713 |
LGTM. |