diff --git a/docs/lmstudio.md b/docs/lmstudio.md
index 98f21013a3..96836527e2 100644
--- a/docs/lmstudio.md
+++ b/docs/lmstudio.md
@@ -6,7 +6,7 @@
 If you see "Prompt Formatting" (inside LM Studio's "Server Options" panel), turn it **OFF**. Leaving it **ON** will break MemGPT.
 
-![image](https://github.com/cpacker/MemGPT/assets/5475622/abc8ce2d-4130-4c51-8169-83e682db625d)
+![image](https://github.com/MSZ-MGS/MemGPT/assets/65172063/e901e06f-a587-40e1-824f-90b60fe21d77)
 
 1. Download [LM Studio](https://lmstudio.ai/) and the model you want to test with
 2. Go to the "local inference server" tab, load the model and configure your settings (make sure to set the context length to something reasonable like 8k!)
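
As a supplement to the setup steps in the doc above, here is a minimal sketch for sanity-checking that LM Studio's "local inference server" is actually up and answering before pointing MemGPT at it. It assumes LM Studio's default server address of `http://localhost:1234` and its OpenAI-compatible `/v1/chat/completions` endpoint; if your server panel shows a different host or port, substitute that instead.

```python
# Sanity-check sketch (not part of the diff above).
# Assumes LM Studio's default local server address http://localhost:1234 and
# its OpenAI-compatible chat completions endpoint; adjust to match whatever
# the "local inference server" tab shows on your machine.
import requests

LMSTUDIO_BASE_URL = "http://localhost:1234/v1"  # assumed default; copy from LM Studio

response = requests.post(
    f"{LMSTUDIO_BASE_URL}/chat/completions",
    json={
        "messages": [{"role": "user", "content": "Reply with the word 'ready'."}],
        "max_tokens": 16,
        "temperature": 0.0,
    },
    timeout=30,
)
response.raise_for_status()
# Print the model's reply; any non-error response means the server is reachable.
print(response.json()["choices"][0]["message"]["content"])
```

If this request succeeds against the model you loaded in step 2, the server side is working and any remaining issues are likely on the prompt-formatting side (which is why the doc insists that "Prompt Formatting" stays **OFF**).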