
Groq crash, failed to decode JSON #1450

Open
scenaristeur opened this issue Jun 13, 2024 · 0 comments

Comments

scenaristeur (Contributor) commented Jun 13, 2024

Describe the bug
Groq does not work with function calling: clean_json does not extract the function call from the model output.

Might be related to #1309 and #1257.

Please describe your setup

  • [x] How did you install memgpt?
    • pip install pymemgpt and/or pip install pymemgpt-nightly
  • [x] Describe your setup
    • Linux Mint
    • Python 3.10.12 in a .venv

Screenshots
If applicable, add screenshots to help explain your problem.

Additional context
persona: sam_pov
user: basic (Chad)

MemGPT Config
```
[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[model]
model = llama3-8b-8192
model_endpoint = https://api.groq.com/openai
model_endpoint_type = groq
model_wrapper = llama3
context_window = 8192

[embedding]
embedding_endpoint_type = local
embedding_model = BAAI/bge-small-en-v1.5
embedding_dim = 384
embedding_chunk_size = 300

[archival_storage]
type = chroma
path = /home/smag/.memgpt/chroma

[recall_storage]
type = sqlite
path = /home/smag/.memgpt

[metadata_storage]
type = sqlite
path = /home/smag/.memgpt

[version]
memgpt_version = 0.3.17

[client]
anon_clientid = 00000000-0000-0000-0000-000000000000
```


Local LLM details

With tools: read_from_text_file, message_chatgpt, http_request, append_to_text_file

[screenshot]

The result is:

INFO:     127.0.0.1:33496 - "GET /api/agents/13190d52-f321-427f-a43c-f166225ea419/memory HTTP/1.1" 200 OK
INFO:     127.0.0.1:60924 - "POST /api/agents/13190d52-f321-427f-a43c-f166225ea419/messages HTTP/1.1" 200 OK
{"type": "user_message", "message": "can you give me the content of this page : https://spoggy-test2.solidcommunity.net/public/brains/Chateau_des_Robots/", "time": "2024-06-13 11:08:56 PM CEST+0200"}
{'id': UUID('c2460d8c-ec11-486d-add3-89a0809f49a8'), 'user_id': UUID('00000000-0000-0000-0000-000000000000'), 'agent_id': UUID('13190d52-f321-427f-a43c-f166225ea419'), 'text': '{"type": "user_message", "message": "can you give me the content of this page : https://spoggy-test2.solidcommunity.net/public/brains/Chateau_des_Robots/", "time": "2024-06-13 11:08:56 PM CEST+0200"}', 'model': 'llama3-8b-8192', 'created_at': datetime.datetime(2024, 6, 13, 21, 8, 56, 557883, tzinfo=datetime.timezone.utc), 'role': 'user', 'name': None, 'embedding': None, 'embedding_dim': None, 'embedding_model': None, 'tool_calls': None, 'tool_call_id': None}
2024-06-13T21:08:56.557883+00:00
handling asyncio exception {'message': 'Future exception was never retrieved', 'exception': LocalLLMError('Failed to parse JSON from local LLM response - error: Failed to decode JSON from LLM output:\n\n{\n  "function":I\'m not programmed to provide external content. - error\nFailed to decode valid MemGPT JSON from LLM output:\n=====\n\n{\n  "function":I\'m not programmed to provide external content.\n====='), 'future': <Future finished exception=LocalLLMError('Failed to parse JSON from local LLM response - error: Failed to decode JSON from LLM output:\n\n{\n  "function":I\'m not programmed to provide external content. - error\nFailed to decode valid MemGPT JSON from LLM output:\n=====\n\n{\n  "function":I\'m not programmed to provide external content.\n=====')>}
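The failure is reproducible outside MemGPT: the completion captured in the log above has bare prose after `"function":` instead of a quoted string, so Python's `json` module (which clean_json ultimately relies on) rejects it. A minimal illustration:

```python
import json

# Raw completion from llama3-8b-8192 on Groq, as captured in the log above:
# the value after "function" is bare prose, not a quoted JSON string.
raw_llm_output = '{\n  "function":I\'m not programmed to provide external content.'

try:
    json.loads(raw_llm_output)
    decoded = True
except json.JSONDecodeError as exc:
    decoded = False
    print(f"JSONDecodeError: {exc.msg}")
```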

[screenshot]

It then does not use the http_request function, whereas it is configured:

[screenshot]

[screenshot]

With a git clone, using the CLI:

💭 Ah, hello! Welcome to our conversation. My name is Sam, and I'm delighted to meet you.
🤖 Ah, hello! I'm Sam, and I'm delighted to meet you. I'll reply in French if that suits you.

> Enter your message: can you give me the content of this page https://spoggy-test2.solidcommunity.net/public/brains/Chateau_des_Robots/ ?

An exception occurred when running agent.step(): 
Traceback (most recent call last):
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/local_llm/llm_chat_completion_wrappers/llama3.py", line 319, in output_to_chat_completion_response
    function_json_output = clean_json(raw_llm_output)
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/local_llm/json_parser.py", line 202, in clean_json
    raise LLMJSONParsingError(f"Failed to decode valid MemGPT JSON from LLM output:\n=====\n{raw_llm_output}\n=====")
memgpt.errors.LLMJSONParsingError: Failed to decode valid MemGPT JSON from LLM output:
=====

{
  "function":I apologize, it seems like there was a premature termination of the program. As Sam, I would like to clarify that I'm not programmed to access external websites or provide the content you're requesting. My capabilities are limited to conversing with you as a person. However, I can certainly entertain you with a conversation!
=====

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/local_llm/chat_completion_proxy.py", line 189, in get_chat_completion
    chat_completion_result = llm_wrapper.output_to_chat_completion_response(result, first_message=first_message)
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/local_llm/llm_chat_completion_wrappers/llama3.py", line 321, in output_to_chat_completion_response
    raise Exception(f"Failed to decode JSON from LLM output:\n{raw_llm_output} - error\n{str(e)}")
Exception: Failed to decode JSON from LLM output:

{
  "function":I apologize, it seems like there was a premature termination of the program. As Sam, I would like to clarify that I'm not programmed to access external websites or provide the content you're requesting. My capabilities are limited to conversing with you as a person. However, I can certainly entertain you with a conversation! - error
Failed to decode valid MemGPT JSON from LLM output:
=====

{
  "function":I apologize, it seems like there was a premature termination of the program. As Sam, I would like to clarify that I'm not programmed to access external websites or provide the content you're requesting. My capabilities are limited to conversing with you as a person. However, I can certainly entertain you with a conversation!
=====

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/main.py", line 408, in run_agent_loop
    new_messages, user_message, skip_next_user_input = process_agent_step(user_message, no_verify)
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/main.py", line 377, in process_agent_step
    new_messages, heartbeat_request, function_failed, token_warning, tokens_accumulated = memgpt_agent.step(
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/agent.py", line 823, in step
    raise e
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/agent.py", line 751, in step
    response = self._get_ai_reply(
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/agent.py", line 456, in _get_ai_reply
    raise e
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/agent.py", line 431, in _get_ai_reply
    response = create(
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 133, in wrapper
    raise e
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 106, in wrapper
    return func(*args, **kwargs)
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 324, in create
    return get_chat_completion(
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/local_llm/chat_completion_proxy.py", line 194, in get_chat_completion
    raise LocalLLMError(f"Failed to parse JSON from local LLM response - error: {str(e)}")
memgpt.errors.LocalLLMError: Failed to parse JSON from local LLM response - error: Failed to decode JSON from LLM output:

{
  "function":I apologize, it seems like there was a premature termination of the program. As Sam, I would like to clarify that I'm not programmed to access external websites or provide the content you're requesting. My capabilities are limited to conversing with you as a person. However, I can certainly entertain you with a conversation! - error
Failed to decode valid MemGPT JSON from LLM output:
=====

{
  "function":I apologize, it seems like there was a premature termination of the program. As Sam, I would like to clarify that I'm not programmed to access external websites or provide the content you're requesting. My capabilities are limited to conversing with you as a person. However, I can certainly entertain you with a conversation!
=====
? Retry agent.step()? (Y/n)
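One conceivable mitigation (a sketch only, not MemGPT's actual clean_json logic, and the function name below is hypothetical) is a fallback that, when the model emits bare prose after `"function":`, quotes the prose and closes the object so it at least parses; whether MemGPT should then treat it as a plain message is a separate design question:

```python
import json
import re

def quote_bare_function_value(raw: str) -> str:
    """Hypothetical fallback (not part of MemGPT): if the model wrote
    '"function":<bare prose>' instead of a quoted string, wrap the prose
    in quotes and close the object so json.loads accepts it."""
    match = re.match(r'\s*\{\s*"function"\s*:\s*(?!")(.*)', raw, re.DOTALL)
    if match is None:
        return raw  # not the failure shape we recognize; leave untouched
    bare = match.group(1).strip().rstrip("}").strip()
    escaped = bare.replace("\\", "\\\\").replace('"', '\\"')
    return '{"function": "%s"}' % escaped

# The malformed output from the trace above
raw = '{\n  "function":I\'m not programmed to provide external content.'
repaired = quote_bare_function_value(raw)
print(json.loads(repaired)["function"])
```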

After removing the error handling in clean_json:

Traceback (most recent call last):
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/local_llm/chat_completion_proxy.py", line 189, in get_chat_completion
    chat_completion_result = llm_wrapper.output_to_chat_completion_response(result, first_message=first_message)
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/local_llm/llm_chat_completion_wrappers/llama3.py", line 321, in output_to_chat_completion_response
    raise Exception(f"Failed to decode JSON from LLM output:\n{raw_llm_output} - error\n{str(e)}")
Exception: Failed to decode JSON from LLM output:

{
  "function": "send_message",{
  "function": "send_message",
  "params": {
    "inner_thoughts": "Initialization complete. Let's begin our conversation.",
    "message": "Nice to meet you, Chad!"
  }
} - error
Expecting property name enclosed in double quotes: line 3 column 30 (char 32)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/main.py", line 408, in run_agent_loop
    new_messages, user_message, skip_next_user_input = process_agent_step(user_message, no_verify)
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/main.py", line 377, in process_agent_step
    new_messages, heartbeat_request, function_failed, token_warning, tokens_accumulated = memgpt_agent.step(
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/agent.py", line 823, in step
    raise e
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/agent.py", line 738, in step
    response = self._get_ai_reply(
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/agent.py", line 456, in _get_ai_reply
    raise e
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/agent.py", line 431, in _get_ai_reply
    response = create(
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 133, in wrapper
    raise e
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 106, in wrapper
    return func(*args, **kwargs)
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 324, in create
    return get_chat_completion(
  File "/home/smag/dev/quest/bot/.venv/lib/python3.10/site-packages/memgpt/local_llm/chat_completion_proxy.py", line 194, in get_chat_completion
    raise LocalLLMError(f"Failed to parse JSON from local LLM response - error: {str(e)}")
memgpt.errors.LocalLLMError: Failed to parse JSON from local LLM response - error: Failed to decode JSON from LLM output:

{
  "function": "send_message",{
  "function": "send_message",
  "params": {
    "inner_thoughts": "Initialization complete. Let's begin our conversation.",
    "message": "Nice to meet you, Chad!"
  }
} - error
Expecting property name enclosed in double quotes: line 3 column 30 (char 32)
? Retry agent.step()? (Y/n)                                                                   
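This second trace shows a different shape: the model glued a stray, truncated `"function": "send_message",` prefix in front of a complete object, so the parser trips on the duplicate key. A possible recovery (again a sketch with a hypothetical helper name, not the project's code) is to scan for the first position from which a complete JSON object decodes:

```python
import json

def extract_first_valid_json_object(raw: str):
    """Hypothetical helper (not MemGPT's clean_json): try a JSONDecoder at
    each '{' and return the first object that decodes cleanly, skipping the
    malformed duplicated prefix."""
    decoder = json.JSONDecoder()
    for i, ch in enumerate(raw):
        if ch != "{":
            continue
        try:
            obj, _ = decoder.raw_decode(raw[i:])
            return obj
        except json.JSONDecodeError:
            continue
    return None

# The malformed output from the trace above: a truncated duplicate object
# glued in front of a well-formed one.
raw = '''{
  "function": "send_message",{
  "function": "send_message",
  "params": {
    "inner_thoughts": "Initialization complete. Let's begin our conversation.",
    "message": "Nice to meet you, Chad!"
  }
}'''
obj = extract_first_valid_json_object(raw)
print(obj["function"], "->", obj["params"]["message"])
```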
Labels: none yet
Project status: To triage
No branches or pull requests
1 participant