
Issue with loading models #104

Open
CraterCrater opened this issue Nov 18, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@CraterCrater

Describe the bug

Unable to load any models; the app just keeps saying "loading".

To Reproduce

Steps to reproduce the behavior:

  1. Open AI Playground
  2. Click Create
  3. Input a prompt
  4. Click Generate

Expected behavior

I expected the model to load so that I could generate an image.

Screenshots

C:\Users\rober\AppData\Local\Programs\AI Playground>
[electron-backend]: #1 try to start python API
[ai-backend]: C:\Users\rober\AppData\Local\Programs\AI Playground\resources\env\Lib\site-packages\torchvision\io\image.py:13: UserWarning: Failed to load image Python extension: 'Could not find module 'C:\Users\rober\AppData\Local\Programs\AI Playground\resources\env\Lib\site-packages\torchvision\image.pyd' (or one of its dependencies). Try using the full path with constructor syntax.'If you don't plan on using image functionality from torchvision.io, you can ignore this warning. Otherwise, there might be something wrong with your environment. Did you have libjpeg or libpng installed before building torchvision from source?
warn(

[ai-backend]: C:\Users\rober\AppData\Local\Programs\AI Playground\resources\env\Lib\site-packages\transformers\deepspeed.py:23: FutureWarning: transformers.deepspeed module is deprecated and will be removed in a future version. Please import deepspeed modules directly from transformers.integrations
warnings.warn(

[ai-backend]: 2024-11-17 19:16:58,056 - INFO - intel_extension_for_pytorch auto imported

[ai-backend]: Set ONEAPI_DEVICE_SELECTOR=*:0
workarounds applied

  * Serving Flask app 'web_api'
  * Debug mode: off

[ai-backend]: 2024-11-17 19:17:03,130 - INFO - WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.

[ai-backend]: 2024-11-17 19:17:03,384 - INFO - 127.0.0.1 - - [17/Nov/2024 19:17:03] "POST /api/init HTTP/1.1" 200 -

[ai-backend]: 2024-11-17 19:17:03,389 - INFO - 127.0.0.1 - - [17/Nov/2024 19:17:03] "POST /api/getGraphics HTTP/1.1" 200 -

[ai-backend]: 2024-11-17 19:17:38,630 - INFO - 127.0.0.1 - - [17/Nov/2024 19:17:38] "POST /api/checkModelExist HTTP/1.1" 200 -

[ai-backend]: 2024-11-17 19:17:38,747 - INFO - 127.0.0.1 - - [17/Nov/2024 19:17:38] "POST /api/sd/generate HTTP/1.1" 200 -
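The torchvision warning near the top of the log says the native image extension (`image.pyd`) or one of its DLL dependencies could not be loaded. As a quick sanity check (a hypothetical diagnostic sketch, not part of AI Playground), one can confirm whether the file is actually present in the bundled environment; if the file is missing, the warning is expected, while if it is present, the problem is more likely a missing DLL dependency:

```python
import os

def torchvision_image_ext_present(site_packages: str) -> bool:
    """Check whether torchvision's native image extension file exists.

    `site_packages` is the path to the bundled environment's
    Lib/site-packages directory (location assumed from the log above).
    """
    return os.path.isfile(
        os.path.join(site_packages, "torchvision", "image.pyd")
    )

# Example, using the site-packages path shown in the log:
# torchvision_image_ext_present(
#     r"C:\Users\rober\AppData\Local\Programs\AI Playground"
#     r"\resources\env\Lib\site-packages")
```

Note that the warning may be unrelated to the model-loading hang, since the backend still starts and answers the `/api/sd/generate` request with HTTP 200.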

Environment (please complete the following information):

  • OS: Windows 11
  • GPU: Intel Arc A770 16G
  • CPU: Intel Core i9 (10th Gen)
  • Version: 1.22.1-beta

Additional context

I was using it successfully yesterday without issues (other than the LLMs not working), and today it isn't working at all.

@CraterCrater CraterCrater added the bug Something isn't working label Nov 18, 2024
@CraterCrater CraterCrater changed the title Issue with Issue with loading models Nov 18, 2024