
[Bug]: Langfuse prompt Object of type TextPromptClient is not JSON serializable #4140

Closed
ivanviragine opened this issue Jun 12, 2024 · 5 comments · Fixed by #4275

ivanviragine commented Jun 12, 2024

What happened?

Hi everyone!
I don't know if I'm missing something, but I'm passing a Langfuse prompt retrieved as follows (per the documentation: https://langfuse.com/docs/prompts/get-started#use-prompt):

import logging

from langfuse import Langfuse
from langfuse.model import TextPromptClient  # assumed import path (Langfuse v2 SDK)

logger = logging.getLogger(__name__)
langfuse = Langfuse()  # reads LANGFUSE_* credentials from the environment

def get_langfuse_prompt(name: str) -> TextPromptClient:
    try:
        return langfuse.get_prompt(name)
    except Exception as e:
        logger.exception(e)
        raise Exception(f"Error getting prompt: {e}")

to the acompletion method (per the documentation: https://litellm.vercel.app/docs/observability/langfuse_integration#generation-specific-parameters), like this:

response = await acompletion(
    api_key=settings.LITELLM_MASTER_KEY,
    api_base=settings.LITELLM_HOST,
    model=model,
    temperature=self.TEMPERATURE,
    messages=messages,
    timeout=settings.OPENAI_API_TIMEOUT_SECONDS,
    extra_body={
        "metadata": {
            "tags": [model, self.AIProvider.name, ai_model_id],
            "prompt": prompt,
        }
    },
)

I am getting an "Object of type TextPromptClient is not JSON serializable" error. Indeed, calling json.dumps(prompt) directly reproduces the error.
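
For reference, a minimal reproduction (the prompt name here is hypothetical):

    import json

    prompt = get_langfuse_prompt("my-prompt")  # hypothetical prompt name
    # Raises: TypeError: Object of type TextPromptClient is not JSON serializable
    json.dumps(prompt)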

How can I make it work?


ivanviragine added the bug (Something isn't working) label on Jun 12, 2024
ishaan-jaff (Contributor) commented:

taking a look @ivanviragine

ivanviragine (Author) commented:

Thank you @ishaan-jaff!

I've ended up making a dict with hardcoded keys and passing the values from the object into it, but it would be better to make the object itself serializable. If you want, I can try to fix it and open a PR.
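
A rough sketch of that workaround (the attribute names name, version, and prompt are assumptions based on the Langfuse SDK docs; adjust to your SDK version):

    prompt_obj = get_langfuse_prompt("my-prompt")  # hypothetical prompt name
    prompt_dict = {
        "name": prompt_obj.name,
        "version": prompt_obj.version,
        "prompt": prompt_obj.prompt,
    }
    # ...then pass the plain dict instead of the client object:
    # extra_body={"metadata": {"prompt": prompt_dict, ...}}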

ishaan-jaff (Contributor) commented:

@ivanviragine you're making this request to the LiteLLM proxy server, correct?

ivanviragine (Author) commented:

Yes! The setting settings.LITELLM_HOST holds the LiteLLM proxy URL, but the error is raised while sending the request, i.e., in the Python library (the acompletion function).

ishaan-jaff (Contributor) commented:

Fixed here, @ivanviragine: #4275
