Error after the latest update: "not enough space in the context's memory pool" #4964

Closed
63Razor63 opened this issue Dec 17, 2023 · 4 comments
Labels
bug Something isn't working

Comments

@63Razor63

Describe the bug

After the latest update, I can't run Goliath 120b GGUF anymore.
Before the update, I could run Goliath 120b with a 4096-token context without any issues. After updating, I get the following error:

"ggml_new_object: not enough space in the context's memory pool (needed 2298592544, available 2298592512)"

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

Run Goliath 120b GGUF using the latest version of text-generation-webui

Screenshot

No response

Logs

ggml_new_object: not enough space in the context's memory pool (needed 2298592544, available 2298592512)

System Info

Lenovo T5
Windows 11 Pro 22H2
Intel i9-12900K
NVIDIA RTX 3060
63Razor63 added the bug label on Dec 17, 2023
@TheLounger
Contributor

That error went away for me after increasing the context to 8192.
That may not be needed anymore after the merge from four days ago, ggerganov/llama.cpp#4461, but I'm not sure whether that has made it into the current llama-cpp-python yet.
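
For reference, a minimal llama-cpp-python sketch of that workaround. The model path and GPU layer count are placeholders, not values taken from this issue; n_ctx and n_gpu_layers are the parameters the webui's llama.cpp loader exposes, as far as I can tell.

# Minimal sketch of the workaround, assuming llama-cpp-python is installed.
from llama_cpp import Llama

llm = Llama(
    model_path="models/goliath-120b.Q4_K_M.gguf",  # placeholder path
    n_ctx=8192,        # raising this from 4096 is the workaround described above
    n_gpu_layers=20,   # placeholder; adjust for available VRAM
)

print(llm("Hello", max_tokens=16)["choices"][0]["text"])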

@63Razor63
Author

Thank you! That looks like the issue. Now we wait.

@63Razor63
Author

The error was resolved after the latest update.

@tonatiuhmira

Hi, I'm still having this issue in a freshly installed oobabooga, using AMD (ROCm).
Loading the same model with llama.cpp built from the current git repo works without issues:
./main -m ../text-generation-webui/models/emerhyst-20b.Q4_K_M.gguf -n -1 -ins -b 512 --temp 0.7 -ngl 30 -c 8192 --repeat_penalty 1.1 --color -i -f prompts/alpaca.txt
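For comparison, here is a rough llama-cpp-python equivalent of that command. The flag-to-parameter mapping is my own assumption, and the interactive/instruct flags of ./main have no direct counterpart here, so this only covers loading and sampling.

# Rough llama-cpp-python equivalent of the ./main invocation above
# (flag-to-parameter mapping is an assumption, not taken from this issue).
from llama_cpp import Llama

llm = Llama(
    model_path="../text-generation-webui/models/emerhyst-20b.Q4_K_M.gguf",
    n_ctx=8192,       # -c 8192
    n_batch=512,      # -b 512
    n_gpu_layers=30,  # -ngl 30
)

out = llm(
    "### Instruction:\nSay hello.\n\n### Response:\n",  # Alpaca-style stand-in for -f prompts/alpaca.txt
    temperature=0.7,      # --temp 0.7
    repeat_penalty=1.1,   # --repeat_penalty 1.1
    max_tokens=128,
)
print(out["choices"][0]["text"])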
