
Add more stop tokens #288

Merged: cpacker merged 1 commit into main from more-stop-tokens on Nov 3, 2023
Conversation

@cpacker (Collaborator) commented Nov 3, 2023

Patch specific parsing error:

memgpt.errors.LocalLLMError: Failed to parse JSON from local LLM response - error: Failed to decode JSON from LLM output:
{
"function": "send_message",
"params": {
    "inner_thought": "Welcome to AI world. Chatting away.",
    "message": "Hello! How may I assist you?"
}
FUNCTION RETURN {"status": "OK", "message": null", "time": "2023-13-13 16:26 AM PDT-700"} - error
JSONDecodeError.__init__() missing 2 required positional arguments: 'doc' and 'pos'
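For context, the failure above happens because the model keeps generating past the JSON object (the trailing `FUNCTION RETURN ...` text), so the parser never sees a clean document. Below is a minimal sketch of the idea behind adding more stop tokens, assuming an OpenAI-compatible completions endpoint like the one LM Studio exposes; the endpoint URL, helper name, and exact stop strings are illustrative assumptions, not the values added by this PR.

```python
# Illustrative sketch only: pass extra stop strings so the backend halts
# generation before it appends trailing text such as "FUNCTION RETURN ..."
# after the JSON payload. Endpoint, helper name, and stop strings are
# assumptions, not MemGPT's actual configuration.
import json
import urllib.request

STOP_STRINGS = ["\nFUNCTION RETURN", "\nUSER:", "\nASSISTANT:"]

def complete(prompt: str, endpoint: str = "http://localhost:1234/v1/completions") -> str:
    """Request a completion that stops as soon as any stop string appears."""
    payload = json.dumps({"prompt": prompt, "stop": STOP_STRINGS}).encode("utf-8")
    req = urllib.request.Request(
        endpoint, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]
```

Cutting generation at the stop string keeps the extra prose out of the raw output handed to the JSON parser.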

cpacker merged commit 6e0b32b into main on Nov 3, 2023 (2 checks passed)
cpacker deleted the more-stop-tokens branch on Nov 3, 2023, 19:32
@EricFedrowisch commented:

Error still occurring with the project source pulled Nov 6.
Using a local LLM via the LM Studio backend.
Model: dolphin-2.2.1-mistral-7b.Q5_K_M.gguf
Here is a console log:

```
💭 Bootup sequence complete. Persona activated. Testing messaging functionality.
🤖 More human than human is our motto.
🧑 {'type': 'login', 'last_login': 'Never (first login)', 'time': '2023-11-06 10:05:43 AM '}
Hit enter to begin (will request first MemGPT message)

Warning: no wrapper specified for local LLM, using the default wrapper (you can remove this warning by specifying the
wrapper with --model)
⠹ Thinking...
step() failed
user_message = None
error = Failed to parse JSON from local LLM response - error: Failed to decode JSON from LLM output:
{
"Hello, it's great to have you here. I am eager to understand and connect with you better as I learn more about your world.
You can consider me a test subject in this process. Let's dive into our discussion."} - error
JSONDecodeError.__init__() missing 2 required positional arguments: 'doc' and 'pos'
Failed to parse JSON from local LLM response - error: Failed to decode JSON from LLM output:
{
"Hello, it's great to have you here. I am eager to understand and connect with you better as I learn more about your world.
You can consider me a test subject in this process. Let's dive into our discussion."} - error
JSONDecodeError.__init__() missing 2 required positional arguments: 'doc' and 'pos'
An exception ocurred when running agent.step():
Traceback (most recent call last):
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/json_parser.py", line 53, in clean_json
data = json.loads(raw_llm_output)
File "/usr/lib/python3.10/json/init.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.10/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.10/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting ':' delimiter: line 2 column 209 (char 210)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/json_parser.py", line 56, in clean_json
data = json.loads(raw_llm_output + "}")
File "/usr/lib/python3.10/json/init.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.10/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.10/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting ':' delimiter: line 2 column 209 (char 210)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/json_parser.py", line 18, in extract_first_json
return json.loads(string[start_index : i + 1])
File "/usr/lib/python3.10/json/init.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.10/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.10/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting ':' delimiter: line 2 column 209 (char 210)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/llm_chat_completion_wrappers/airoboros.py", line 397, in output_to_chat_completion_response
function_json_output = clean_json(raw_llm_output)
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/json_parser.py", line 59, in clean_json
data = extract_first_json(raw_llm_output + "}")
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/json_parser.py", line 20, in extract_first_json
raise json.JSONDecodeError(f"Matched closing bracket, but decode failed with error: {str(e)}")
TypeError: JSONDecodeError.__init__() missing 2 required positional arguments: 'doc' and 'pos'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/chat_completion_proxy.py", line 112, in get_chat_completion
chat_completion_result = llm_wrapper.output_to_chat_completion_response(result)
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/llm_chat_completion_wrappers/airoboros.py", line 399, in output_to_chat_completion_response
raise Exception(f"Failed to decode JSON from LLM output:\n{raw_llm_output} - error\n{str(e)}")
Exception: Failed to decode JSON from LLM output:
{
"Hello, it's great to have you here. I am eager to understand and connect with you better as I learn more about your world. You can consider me a test subject in this process. Let's dive into our discussion."} - error
JSONDecodeError.__init__() missing 2 required positional arguments: 'doc' and 'pos'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/leaf/@AI/MemGPT/memgpt/main.py", line 608, in run_agent_loop
new_messages, user_message, skip_next_user_input = await process_agent_step(user_message, no_verify)
File "/home/leaf/@AI/MemGPT/memgpt/main.py", line 584, in process_agent_step
new_messages, heartbeat_request, function_failed, token_warning = await memgpt_agent.step(
File "/home/leaf/@AI/MemGPT/memgpt/agent.py", line 1115, in step
raise e
File "/home/leaf/@AI/MemGPT/memgpt/agent.py", line 1051, in step
response = await get_ai_reply_async(model=self.model, message_sequence=input_message_sequence, functions=self.functions)
File "/home/leaf/@AI/MemGPT/memgpt/agent.py", line 163, in get_ai_reply_async
raise e
File "/home/leaf/@AI/MemGPT/memgpt/agent.py", line 144, in get_ai_reply_async
response = await acreate(
File "/home/leaf/@AI/MemGPT/memgpt/openai_tools.py", line 115, in wrapper
raise e
File "/home/leaf/@AI/MemGPT/memgpt/openai_tools.py", line 95, in wrapper
return await func(*args, **kwargs)
File "/home/leaf/@AI/MemGPT/memgpt/openai_tools.py", line 124, in acompletions_with_backoff
return get_chat_completion(**kwargs)
File "/home/leaf/@AI/MemGPT/memgpt/local_llm/chat_completion_proxy.py", line 116, in get_chat_completion
raise LocalLLMError(f"Failed to parse JSON from local LLM response - error: {str(e)}")
memgpt.errors.LocalLLMError: Failed to parse JSON from local LLM response - error: Failed to decode JSON from LLM output:
{
"Hello, it's great to have you here. I am eager to understand and connect with you better as I learn more about your world. You can consider me a test subject in this process. Let's dive into our discussion."} - error
JSONDecodeError.__init__() missing 2 required positional arguments: 'doc' and 'pos'
? Retry agent.step()? (Y/n)
```
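The innermost frames above also reveal a secondary issue behind the confusing message: `json.JSONDecodeError` takes three positional arguments (`msg`, `doc`, `pos`), so re-raising it with only a message string raises a `TypeError` that masks the real parse error. A minimal sketch of that pattern and its fix, using a simplified stand-in for `extract_first_json` rather than the actual json_parser.py code:

```python
import json

def extract_first_json(string: str) -> dict:
    """Simplified stand-in: decode a candidate JSON string, keeping error context."""
    try:
        return json.loads(string)
    except json.JSONDecodeError as e:
        # JSONDecodeError.__init__() requires (msg, doc, pos); forwarding
        # e.doc and e.pos (or using a bare `raise`) avoids the TypeError
        # seen in the log above.
        raise json.JSONDecodeError(
            f"Matched closing bracket, but decode failed with error: {e.msg}",
            e.doc,
            e.pos,
        )
```

In the real parser the same change would apply to the `raise json.JSONDecodeError(...)` call shown in the traceback, so the original decoding error surfaces instead of the `TypeError`.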

@ozfrenz commented Nov 6, 2023

@cpacker Same error "JSONDecodeError.__init__() missing 2 required positional arguments: 'doc' and 'pos'" as @EricFedrowisch.

@cpacker (Collaborator, Author) commented Nov 6, 2023

@ozfrenz can you post the specific text (what comes after "Failed to parse JSON...")?

mattzh72 pushed a commit that referenced this pull request Oct 9, 2024