InstantID problem with webUI : v1.8.0 #2687

Closed
dhonta opened this issue Mar 7, 2024 · 4 comments

Labels
bug Something isn't working

Comments

dhonta commented Mar 7, 2024

Hi,
Could you please help me with this problem?
When I try to use InstantID, I get the following error:

2024-03-07 19:20:31,962 - ControlNet - INFO - preprocessor resolution = 512
2024-03-07 19:20:32.0351307 [E:onnxruntime:Default, provider_bridge_ort.cc:1548 onnxruntime::TryGetProviderInfo_CUDA] D:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1209 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

*************** EP Error ***************
EP Error D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
when using ['CUDAExecutionProvider', 'CPUExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.


2024-03-07 19:20:32.1120865 [E:onnxruntime:Default, provider_bridge_ort.cc:1548 onnxruntime::TryGetProviderInfo_CUDA] D:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1209 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

*** Error running process: E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py
Traceback (most recent call last):
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\modules\scripts.py", line 784, in process
    script.process(p, *script_args)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1279, in process
    self.controlnet_hack(p)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1264, in controlnet_hack
    self.controlnet_main_entry(p)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1029, in controlnet_main_entry
    controls, hr_controls = list(zip(*[preprocess_input_image(img) for img in optional_tqdm(input_images)]))
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1029, in <listcomp>
    controls, hr_controls = list(zip(*[preprocess_input_image(img) for img in optional_tqdm(input_images)]))
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 986, in preprocess_input_image
    detected_map, is_image = self.preprocessor[unit.module](
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\utils.py", line 81, in decorated_func
    return cached_func(*args, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\utils.py", line 65, in cached_func
    return func(*args, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\global_state.py", line 37, in unified_preprocessor
    return preprocessor_modules[preprocessor_name](*args, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\processor.py", line 801, in run_model_instant_id
    self.load_model()
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\processor.py", line 741, in load_model
    self.model = FaceAnalysis(
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\insightface\app\face_analysis.py", line 31, in __init__
    model = model_zoo.get_model(onnx_file, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
    model = router.get_model(providers=providers, provider_options=provider_options)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
    session = PickableInferenceSession(self.onnx_file, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in __init__
    super().__init__(model_path, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 432, in __init__
    raise fallback_error from e
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 427, in __init__
    self._create_inference_session(self._fallback_providers, None)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page  (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements),  make sure they're in the PATH, and that your GPU is supported.
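
For context, the failing call can be isolated outside the webui with a short sketch (assumptions: the webui venv's Python, insightface installed, and the antelopev2 model pack used by the InstantID preprocessor; forcing the CPU provider skips the CUDA DLL that fails to load):

```python
# Minimal reproduction sketch -- not the extension's code, just an isolation test.
from insightface.app import FaceAnalysis

# Forcing CPUExecutionProvider means onnxruntime never tries to load
# onnxruntime_providers_cuda.dll. If this succeeds while the default run fails
# with LoadLibrary error 126, the problem is the CUDA/cuDNN DLLs on PATH,
# not the face models themselves.
# Note: root= may need to point at the directory where the ControlNet extension
# stored the antelopev2 pack (path assumption; the default is ~/.insightface).
app = FaceAnalysis(name="antelopev2", providers=["CPUExecutionProvider"])
app.prepare(ctx_id=-1, det_size=(640, 640))
print("antelopev2 models loaded with CPUExecutionProvider")
```
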
huchenlei added the bug label on Mar 12, 2024

huchenlei (Collaborator) commented:
This seems like a problem with your onnx-runtime setup. Does this issue start to happen after A1111 v1.8.0?
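
For reference, a quick way to inspect that setup from inside the webui venv (a sketch, not the extension's code; it only reports versions and which execution providers the installed onnxruntime build offers):

```python
# Run with the webui venv's python.exe to compare the onnxruntime build
# against the CUDA runtime that torch was built for.
import onnxruntime as ort
import torch

print("onnxruntime:", ort.__version__)
print("built-in providers:", ort.get_available_providers())
print("torch CUDA build:", torch.version.cuda, "| CUDA available:", torch.cuda.is_available())

# Note: CUDAExecutionProvider can be listed above and still fail at session
# creation with LoadLibrary error 126 if the matching CUDA/cuDNN DLLs are not
# on PATH; the provider DLL is only resolved when a session is created.
```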

dhonta (Author) commented Mar 12, 2024

> This seems like a problem with your onnx-runtime setup. Does this issue start to happen after A1111 v1.8.0?

Thanks for your help. Yes, I have no problem with version 1.7.0; the error only appears when I use v1.8.0.

mnbv7758 commented Apr 6, 2024

I also have the same problem.

huchenlei (Collaborator) commented:

The onnxruntime issue should be solved by #2761.
