diff --git a/docs/lmstudio.md b/docs/lmstudio.md
index 96836527e2..1af6d52f9e 100644
--- a/docs/lmstudio.md
+++ b/docs/lmstudio.md
@@ -6,7 +6,7 @@
 
 If you see "Prompt Formatting" (inside LM Studio's "Server Options" panel), turn it **OFF**. Leaving it **ON** will break MemGPT.
 
-![image](https://github.com/MSZ-MGS/MemGPT/assets/65172063/e901e06f-a587-40e1-824f-90b60fe21d77)
+![image](https://github.com/cpacker/MemGPT/assets/5475622/74fd5e4d-a549-482d-b9f5-44b1829f41a8)
 
 1. Download [LM Studio](https://lmstudio.ai/) and the model you want to test with
 2. Go to the "local inference server" tab, load the model and configure your settings (make sure to set the context length to something reasonable like 8k!)