
[Bug] rocm.py fails to correctly find LLVM without PATH being set #316

Closed
TNT3530 opened this issue Apr 15, 2024 · 1 comment

Comments


TNT3530 commented Apr 15, 2024

Expected behavior

TVM should find the LLVM ld.lld file

Actual behavior

When running mlc_llm chat with JIT compiling on, TVM fails to find the LLVM installation, throwing
RuntimeError: cannot find ld.lld, canditates are: ['ld.lld-17.0', 'ld.lld-17', 'ld.lld', '/opt/rocm/llvm/bin']

Environment

Testing in an MLC Docker container with fresh installs of the nightly packages

Steps to reproduce

Run mlc_llm chat HF://<model>. It downloads the model and compiles it, then crashes when saving the .so file.

Triage

Line 55 of https://github.com/mlc-ai/relax/blob/mlc/python/tvm/contrib/rocm.py forgets to join ld.lld (or whichever candidate name was found in the lines above) onto the /opt/rocm/llvm/bin path, so the bare directory is added as a candidate. The search then returns None, because os.path.isfile in https://github.com/mlc-ai/relax/blob/mlc/python/tvm/contrib/utils.py#L253 returns False when pointed at a directory.
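The faulty lookup can be sketched as follows. This is a minimal reconstruction, not the exact TVM code: the `candidate_paths` / `find_lld` names and the candidate list are illustrative assumptions.

```python
import os
import shutil

# Assumed ROCm LLVM location, as in the error message above.
ROCM_LLVM_BIN = "/opt/rocm/llvm/bin"

def candidate_paths(names, rocm_bin=ROCM_LLVM_BIN):
    # Buggy behavior: the bare directory was appended as a candidate,
    # roughly `names + [rocm_bin]`, and os.path.isfile() on a directory
    # like "/opt/rocm/llvm/bin" is always False, so that entry never matches.
    # Fix: join each candidate executable name onto the directory instead.
    return list(names) + [os.path.join(rocm_bin, n) for n in names]

def find_lld(required=True):
    names = ["ld.lld-17.0", "ld.lld-17", "ld.lld"]
    candidates = candidate_paths(names)
    # Accept a candidate if it is on PATH or exists as a file on disk.
    valid = [c for c in candidates if shutil.which(c) or os.path.isfile(c)]
    if not valid and required:
        raise RuntimeError(f"cannot find ld.lld, candidates are: {candidates}")
    return valid
```

With the join in place, the lookup succeeds even when PATH does not include the ROCm LLVM bin directory, since the absolute path /opt/rocm/llvm/bin/ld.lld is checked directly.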

Contributor

tqchen commented Apr 24, 2024

should be fixed upstream

@tqchen tqchen closed this as completed Apr 24, 2024