
since update i have error #5010

Closed
lhucklen opened this issue Dec 20, 2023 · 3 comments
Labels
bug Something isn't working stale

Comments

@lhucklen

Describe the bug

No longer able to load GPTQ models; only the Transformers loader works.

Also unable to connect to the SillyTavern API.

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

Update the webui.

Screenshot

No response

Logs

Traceback (most recent call last):
  File "C:\text-generation-webui\modules\exllama_hf.py", line 14, in <module>
    from exllama.model import ExLlama, ExLlamaCache, ExLlamaConfig
  File "C:\text-generation-webui\installer_files\env\Lib\site-packages\exllama\__init__.py", line 1, in <module>
    from . import cuda_ext, generator, model, tokenizer
  File "C:\text-generation-webui\installer_files\env\Lib\site-packages\exllama\cuda_ext.py", line 9, in <module>
    import exllama_ext
ImportError: DLL load failed while importing exllama_ext: The specified module could not be found.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\text-generation-webui\modules\ui_model_menu.py", line 214, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\text-generation-webui\modules\models.py", line 90, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\text-generation-webui\modules\models.py", line 397, in ExLlama_HF_loader
    from modules.exllama_hf import ExllamaHF
  File "C:\text-generation-webui\modules\exllama_hf.py", line 21, in <module>
    from model import ExLlama, ExLlamaCache, ExLlamaConfig
  File "C:\text-generation-webui\repositories\exllama\model.py", line 12, in <module>
    import cuda_ext
  File "C:\text-generation-webui\repositories\exllama\cuda_ext.py", line 43, in <module>
    exllama_ext = load(
                  ^^^^^
  File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 1308, in load
    return _jit_compile(
           ^^^^^^^^^^^^^
  File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 1710, in _jit_compile
    _write_ninja_file_and_build_library(
  File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 1810, in _write_ninja_file_and_build_library
    _write_ninja_file_to_build_library(
  File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 2199, in _write_ninja_file_to_build_library
    cuda_flags = common_cflags + COMMON_NVCC_FLAGS + _get_cuda_arch_flags()
                                                     ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 1980, in _get_cuda_arch_flags
    arch_list[-1] += '+PTX'
    ~~~~~~~~~^^^^
IndexError: list index out of range
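For context on the final IndexError: inside torch's `_get_cuda_arch_flags`, the computed list of CUDA architectures can end up empty, typically when no CUDA-capable GPU is detected and the `TORCH_CUDA_ARCH_LIST` environment variable is unset (a plausible reading of this log, not something the traceback itself proves). Indexing the last element of that empty list is what raises. A minimal sketch of the failure mode:

```python
# Minimal illustration of the crash at the bottom of the traceback:
# torch's _get_cuda_arch_flags builds a list of target architectures and
# appends '+PTX' to the last entry. If the list is empty (e.g. no GPU was
# detected and TORCH_CUDA_ARCH_LIST is unset), arch_list[-1] raises.
arch_list = []  # empty architecture list: the failing case

try:
    arch_list[-1] += '+PTX'  # the exact statement from cpp_extension.py
except IndexError as err:
    print(f"IndexError: {err}")  # IndexError: list index out of range
```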

System Info

NVIDIA 3060
Intel CPU
@lhucklen lhucklen added the bug Something isn't working label Dec 20, 2023
@defecat0r

I've got the same problem.

@defecat0r

Fixed it by following the instructions in issue #4993: deleted the installer_files folder and ran the start script again. Thanks ooba!
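A related workaround sometimes suggested for the empty-arch-list IndexError (this assumes the GPU went undetected, which is not confirmed in this thread) is to pin `TORCH_CUDA_ARCH_LIST` before launching, so torch's extension builder parses that value instead of querying the GPU. A hedged sketch, where "8.6" is the compute capability of an NVIDIA RTX 3060 (Ampere):

```python
import os

# Hypothetical workaround sketch: set TORCH_CUDA_ARCH_LIST before any
# torch cpp_extension code runs, so _get_cuda_arch_flags uses this value
# instead of autodetecting the (possibly undetected) GPU. "8.6" matches
# an RTX 3060; adjust for other cards.
os.environ["TORCH_CUDA_ARCH_LIST"] = "8.6"

print(os.environ["TORCH_CUDA_ARCH_LIST"])  # 8.6
```

Reinstalling via the start script, as described above, remains the fix that actually resolved this issue.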

@github-actions github-actions bot added the stale label Feb 1, 2024

github-actions bot commented Feb 1, 2024

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.

@github-actions github-actions bot closed this as completed Feb 1, 2024