
Problem when setting CPU memory limit #3763

Closed
1 task done
BoomBoomPY opened this issue Aug 30, 2023 · 2 comments · Fixed by #4597
Labels: bug (Something isn't working), stale

Comments

@BoomBoomPY

Describe the bug

When trying to set a CPU memory limit, I get this error:

line 85, in convert_file_size_to_int
    raise ValueError(err_msg)
ValueError: size 64000MiBGiB is not in a valid format. Use an integer for bytes, or a string with an unit (like '5.0GB').

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

Set a CPU memory limit and try to load a model.

Screenshot

(screenshot attachment)

Logs

Traceback (most recent call last):
  File "/home/chungus/Downloads/LocalGPT/installer_files/env/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 70, in convert_file_size_to_int
    mem_size = int(float(size[:-3]) * (2**30))
ValueError: could not convert string to float: '64000MiB'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/chungus/Downloads/LocalGPT/text-generation-webui/modules/ui_model_menu.py", line 195, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "/home/chungus/Downloads/LocalGPT/text-generation-webui/modules/models.py", line 79, in load_model
    output = load_func_map[loader](model_name)
  File "/home/chungus/Downloads/LocalGPT/text-generation-webui/modules/models.py", line 320, in AutoGPTQ_loader
    return modules.AutoGPTQ_loader.load_quantized(model_name)
  File "/home/chungus/Downloads/LocalGPT/text-generation-webui/modules/AutoGPTQ_loader.py", line 57, in load_quantized
    model = AutoGPTQForCausalLM.from_quantized(path_to_model, **params)
  File "/home/chungus/Downloads/LocalGPT/installer_files/env/lib/python3.10/site-packages/auto_gptq/modeling/auto.py", line 108, in from_quantized
    return quant_func(
  File "/home/chungus/Downloads/LocalGPT/installer_files/env/lib/python3.10/site-packages/auto_gptq/modeling/_base.py", line 859, in from_quantized
    max_memory = accelerate.utils.get_balanced_memory(
  File "/home/chungus/Downloads/LocalGPT/installer_files/env/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 767, in get_balanced_memory
    max_memory = get_max_memory(max_memory)
  File "/home/chungus/Downloads/LocalGPT/installer_files/env/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 654, in get_max_memory
    max_memory[key] = convert_file_size_to_int(max_memory[key])
  File "/home/chungus/Downloads/LocalGPT/installer_files/env/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 85, in convert_file_size_to_int
    raise ValueError(err_msg)
ValueError: size 64000MiBGiB is not in a valid format. Use an integer for bytes, or a string with an unit (like '5.0GB').
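For context on the error: per the traceback, accelerate's convert_file_size_to_int strips a recognized three-character suffix such as "GiB" and converts the remainder to a number, so the doubled suffix "64000MiBGiB" leaves "64000MiB" behind and the float conversion fails. A minimal sketch of that behaviour, simplified for illustration only (this is not the library's actual code):

```python
def convert_file_size_to_int_sketch(size):
    """Simplified illustration of the suffix handling shown in the traceback
    (not accelerate's actual implementation)."""
    if isinstance(size, int):
        return size
    if size.upper().endswith("GIB"):
        # The line from the traceback: strip the 3-char suffix, parse the rest as a float.
        return int(float(size[:-3]) * (2**30))
    if size.upper().endswith("MIB"):
        return int(float(size[:-3]) * (2**20))
    raise ValueError(f"size {size} is not in a valid format.")

print(convert_file_size_to_int_sketch("64000MiB"))  # OK: 67108864000 bytes

try:
    convert_file_size_to_int_sketch("64000MiBGiB")
except ValueError as exc:
    print(exc)  # could not convert string to float: '64000MiB'
```

In other words, the value coming out of the webui already carried a MiB unit, and a second GiB suffix was appended on top of it before the string reached accelerate.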

System Info

Most recent Ubuntu, RTX 4070 (gpu0) and RTX 3070 (gpu1)
BoomBoomPY added the bug label Aug 30, 2023

8skull commented Aug 31, 2023

I'm in the same boat with the same problem.

I just worked around it by leaving the CPU memory slider alone.

github-actions bot added the stale label Oct 12, 2023
@github-actions

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.

null-dev added a commit to null-dev/text-generation-webui that referenced this issue Nov 15, 2023:
    get_max_memory_dict() was not properly formatting shared.args.cpu_memory

oobabooga added a commit that referenced this issue Nov 15, 2023:
    get_max_memory_dict() was not properly formatting shared.args.cpu_memory
    Co-authored-by: oobabooga <[email protected]>

oobabooga added a commit that referenced this issue Nov 16, 2023
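For reference, the commit message above points at get_max_memory_dict() in the webui building the max_memory value from shared.args.cpu_memory. A hypothetical sketch of the kind of normalization that message implies; the helper name format_cpu_memory and the regex are illustrative assumptions, not the actual patch (see the linked commits and #4597 for the real change):

```python
import re

def format_cpu_memory(cpu_memory):
    """Hypothetical helper: normalize the UI value (e.g. 64000, '64000', or '64000MiB')
    into a single-suffix string that accelerate's convert_file_size_to_int accepts."""
    value = str(cpu_memory).strip()
    # If the value already carries a unit (as '64000MiB' did in the log above),
    # pass it through unchanged instead of appending another suffix,
    # which is what produced the invalid '64000MiBGiB'.
    if re.fullmatch(r"\d+(\.\d+)?\s*([KMGT]i?B)", value, flags=re.IGNORECASE):
        return value
    # Bare number: assume MiB, which is what the '64000MiB' string in the log
    # suggests the webui slider uses.
    return f"{int(float(value))}MiB"

print(format_cpu_memory("64000MiB"))  # -> '64000MiB'
print(format_cpu_memory(64000))       # -> '64000MiB'
```

With a guard like this, a value that already has a unit is passed through unchanged instead of gaining a second suffix.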