[Bug]: Logfire log not wrapped in request span #4202
Comments
@elisalimli any chance you can take a look at this one / help us with this?
Is Logfire OTel compatible? Wondering if we can just migrate to our native OTel integration - https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput-in-opentelemetry-format
It looks like it's OTel compatible, but I can't find docs on logging to it via just OTel.
@ishaan-jaff It seems like you are starting a new thread for success_handler: https://github.com/BerriAI/litellm/blob/main/litellm/utils.py#L3867 Logfire groups the spans using context managers, so I suspect the threading is the cause of this issue.
Replacing the new thread with a direct call to success_handler solved the issue. It looks like the started thread doesn't inherit the context managers.
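For anyone following along, here is a minimal sketch of the suspected mechanism (illustrative names, not LiteLLM's actual code): OpenTelemetry stores the active span in contextvars, which a freshly started thread does not inherit, so spans created in that thread lose their parent. Copying the caller's context into the thread restores the nesting:

```python
import contextvars
import threading

import logfire

logfire.configure()

def success_handler():
    # This span should nest under the caller's "request" span, but the
    # parent is looked up from the current thread's contextvars.
    with logfire.span("success_handler"):
        pass

with logfire.span("request"):
    # Broken: a new thread starts with an empty contextvars context, so
    # the span created in success_handler has no parent and is emitted
    # at the top level instead of inside "request".
    threading.Thread(target=success_handler).start()

    # Fixed: snapshot the caller's context and run the handler inside it,
    # which carries the active span into the new thread.
    ctx = contextvars.copy_context()
    threading.Thread(target=lambda: ctx.run(success_handler)).start()
```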
Would it solve the issue of the non-nested spans?
Fix here: #4254
This should be fixed in 1.40.16 @ionflow
https://docs.litellm.ai/docs/observability/logfire_integration
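For reference, the callback setup from the linked docs looks roughly like this (the token and model name are placeholders):

```python
import os

import litellm

os.environ["LOGFIRE_TOKEN"] = "<your-logfire-write-token>"  # placeholder

# Route LiteLLM request/response logs to Logfire via the callback hooks.
litellm.success_callback = ["logfire"]
litellm.failure_callback = ["logfire"]

response = litellm.completion(
    model="gpt-3.5-turbo",  # illustrative model
    messages=[{"role": "user", "content": "Hi"}],
)
```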
What happened?
When making a request using LiteLLM in a FastAPI project instrumented with Logfire, the LiteLLM logs are not wrapped in the span where the request is logged.
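A minimal reproduction sketch, assuming a setup along these lines (route, model, and handler names are illustrative):

```python
import litellm
import logfire
from fastapi import FastAPI

logfire.configure()
litellm.success_callback = ["logfire"]

app = FastAPI()
logfire.instrument_fastapi(app)  # wraps each request in a span

@app.post("/chat")
async def chat(prompt: str):
    # Expected: the Chat Completion log nests inside this request's span.
    # Observed: it is emitted as a sibling, outside the request span.
    response = await litellm.acompletion(
        model="gpt-4o",  # illustrative model
        messages=[{"role": "user", "content": prompt}],
    )
    return {"reply": response.choices[0].message.content}
```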
[Screenshot 2024-06-14 at 1:12:40 PM: the Chat Completion log appears outside the request span]
When you collapse the span, you can see the log sits outside of it.
[Screenshot 2024-06-14 at 2:42:12 PM: with the request span collapsed, the Chat Completion log remains visible outside it]
I would expect the Chat Completion log to be part of the span, as with other libraries (Instructor, the OpenAI client).
Relevant log output
No response