Describe the bug
No longer able to load GPTQ models; only able to use Transformers.
Also unable to connect to the SillyTavern API.
Is there an existing issue for this?
I have searched the existing issues
Reproduction
update webui
Screenshot
No response
Logs
Traceback (most recent call last):
File "C:\text-generation-webui\modules\exllama_hf.py", line 14, in <module>
from exllama.model import ExLlama, ExLlamaCache, ExLlamaConfig
File "C:\text-generation-webui\installer_files\env\Lib\site-packages\exllama\__init__.py", line 1, in <module>
from . import cuda_ext, generator, model, tokenizer
File "C:\text-generation-webui\installer_files\env\Lib\site-packages\exllama\cuda_ext.py", line 9, in <module>
import exllama_ext
ImportError: DLL load failed while importing exllama_ext: The specified module could not be found.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\text-generation-webui\modules\ui_model_menu.py", line 214, in load_model_wrapper
shared.model, shared.tokenizer = load_model(selected_model, loader)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\text-generation-webui\modules\models.py", line 90, in load_model
output = load_func_map[loader](model_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\text-generation-webui\modules\models.py", line 397, in ExLlama_HF_loader
from modules.exllama_hf import ExllamaHF
File "C:\text-generation-webui\modules\exllama_hf.py", line 21, in <module>
from model import ExLlama, ExLlamaCache, ExLlamaConfig
File "C:\text-generation-webui\repositories\exllama\model.py", line 12, in <module>
import cuda_ext
File "C:\text-generation-webui\repositories\exllama\cuda_ext.py", line 43, in <module>
exllama_ext = load(
^^^^^
File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 1308, in load
return _jit_compile(
^^^^^^^^^^^^^
File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 1710, in _jit_compile
_write_ninja_file_and_build_library(
File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 1810, in _write_ninja_file_and_build_library
_write_ninja_file_to_build_library(
File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 2199, in _write_ninja_file_to_build_library
cuda_flags = common_cflags + COMMON_NVCC_FLAGS + _get_cuda_arch_flags()
^^^^^^^^^^^^^^^^^^^^^^
File "C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py", line 1980, in _get_cuda_arch_flags
arch_list[-1] += '+PTX'
~~~~~~~~~^^^^
IndexError: list index out of range
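The final IndexError comes from torch appending '+PTX' to the last entry of its detected CUDA arch list; an empty list (which can happen when torch cannot see the GPU, for example after an update pulled in a CPU-only torch build) raises exactly this error. A minimal sketch of that failure mode, assuming this is the cause — `add_ptx_to_newest_arch` is a hypothetical stand-in for the failing line inside torch's `_get_cuda_arch_flags`:

```python
# Sketch (assumption, not the confirmed root cause): mimic the line
# `arch_list[-1] += '+PTX'` from torch/utils/cpp_extension.py.
def add_ptx_to_newest_arch(arch_list):
    """Append '+PTX' to the newest detected compute capability."""
    arch_list = list(arch_list)
    arch_list[-1] += '+PTX'  # IndexError when no arch was detected
    return arch_list

# With a detected GPU the call succeeds (a 3060 reports compute 8.6):
print(add_ptx_to_newest_arch(['8.6']))  # ['8.6+PTX']

# With an empty arch list it reproduces the reported error:
try:
    add_ptx_to_newest_arch([])
except IndexError as e:
    print(type(e).__name__)  # IndexError
```

If that is the cause, checking `torch.cuda.is_available()` inside the webui's environment would confirm whether torch can see the GPU at all.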
System Info
NVIDIA 3060
Intel CPU
This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.