From 1924708702a2513d017d44fae9989b338cbdbaaa Mon Sep 17 00:00:00 2001
From: MSZ-MGS <65172063+MSZ-MGS@users.noreply.github.com>
Date: Wed, 8 Nov 2023 23:34:45 +0300
Subject: [PATCH] Update lmstudio.md to show the Prompt Formatting Option
 (#384)

* Update lmstudio.md to show the Prompt Formatting Option

* Update lmstudio.md

Update the screenshot
---
 docs/lmstudio.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/lmstudio.md b/docs/lmstudio.md
index 98f21013a3..96836527e2 100644
--- a/docs/lmstudio.md
+++ b/docs/lmstudio.md
@@ -6,7 +6,7 @@
 
 If you see "Prompt Formatting" (inside LM Studio's "Server Options" panel), turn it **OFF**. Leaving it **ON** will break MemGPT.
 
-![image](https://github.com/cpacker/MemGPT/assets/5475622/abc8ce2d-4130-4c51-8169-83e682db625d)
+![image](https://github.com/MSZ-MGS/MemGPT/assets/65172063/e901e06f-a587-40e1-824f-90b60fe21d77)
 
 1. Download [LM Studio](https://lmstudio.ai/) and the model you want to test with
 2. Go to the "local inference server" tab, load the model and configure your settings (make sure to set the context length to something reasonable like 8k!)
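The docs lines in the hunk above walk through starting LM Studio's "local inference server". Once that server is running, a quick way to confirm it is reachable is to send it a chat request over its OpenAI-compatible HTTP API. The sketch below assumes LM Studio's default endpoint of `http://localhost:1234/v1`; the URL, prompt, and helper names here are illustrative, not part of the patched docs.

```python
# Sanity-check sketch for an LM Studio local inference server.
# Assumption: the server listens on LM Studio's default port 1234 and
# exposes the OpenAI-compatible /v1/chat/completions route; adjust
# BASE_URL if you configured a different port in the server tab.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"


def build_chat_request(prompt: str, max_tokens: int = 64) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.0,
    }


def query_server(prompt: str) -> str:
    """POST the payload to the local server and return the model's reply."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires a model to be loaded and the server started in LM Studio.
    print(query_server("Say hello in one word."))
```

If this round-trip succeeds but MemGPT still misbehaves, re-check that the "Prompt Formatting" option mentioned above is switched off, since MemGPT supplies its own prompt template.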