I've ended up making a dict with hardcoded keys and passing the values from the object into it. But it would be better to make the object itself serializable.
If you want I can try to fix it and open a PR.
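For context, a minimal sketch of the workaround described above. `FakeTextPromptClient` is a hypothetical stand-in for langfuse's `TextPromptClient` (the real class also carries fields like name, version, prompt text, and config, but is not JSON serializable as-is); the field names and values here are illustrative only:

```python
import json

# Hypothetical stand-in for langfuse's TextPromptClient; the real class
# is not JSON serializable, which is what triggers the reported error.
class FakeTextPromptClient:
    def __init__(self, name, version, prompt, config):
        self.name = name
        self.version = version
        self.prompt = prompt
        self.config = config

prompt = FakeTextPromptClient("movie-critic", 1, "Act as a {{level}} film critic.", {})

# Workaround: copy the needed fields into a plain dict with hardcoded keys.
serializable_prompt = {
    "name": prompt.name,
    "version": prompt.version,
    "prompt": prompt.prompt,
    "config": prompt.config,
}

json.dumps(serializable_prompt)  # no longer raises TypeError
```

The obvious downside of the hardcoded-keys approach is that it breaks silently whenever the upstream class adds or renames a field, which is why a proper serialization fix in the library would be preferable.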
Yes! The key settings.LITELLM_HOST holds the litellm proxy URL, but the error occurs when sending the request, that is, inside the Python library (the acompletion function).
What happened?
Hi everyone!
I don't know if I'm missing something, but when passing a Langfuse prompt retrieved like this (as per the documentation: https://langfuse.com/docs/prompts/get-started#use-prompt):

to the acompletion method (as per the documentation: https://litellm.vercel.app/docs/observability/langfuse_integration#generation-specific-parameters) like this:
I am getting an "Object of type TextPromptClient is not JSON serializable" error.
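To make the failure mode concrete, here is a self-contained sketch of the pattern. The metadata keys follow the litellm Langfuse integration docs linked above as I read them; the generation name is a placeholder, and a bare `object()` stands in for the `TextPromptClient`, since any non-serializable object triggers the same `TypeError` once the payload hits `json.dumps`:

```python
import json

# Stand-in for the TextPromptClient returned by langfuse.get_prompt();
# like the real client object, it cannot be serialized by the json module.
prompt = object()

# Generation-specific metadata in the shape the litellm docs describe;
# "test-generation" is a placeholder. Putting the prompt client object
# here is what the report does.
metadata = {
    "generation_name": "test-generation",
    "prompt": prompt,  # not JSON serializable
}

try:
    # Roughly what serializing the request body does under the hood.
    json.dumps(metadata)
except TypeError as e:
    print(e)  # Object of type object is not JSON serializable
```

With the real `TextPromptClient` in place of `object()`, the message matches the one in the report.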
Indeed, calling json.dumps(prompt) directly reproduces the error.
How can I make it work?
Relevant log output
No response