Python: Add OpenTelemetry to Python SK #6914
base: main
Conversation
Some work to do: the basics are there, but most importantly this should be refactored into a decorator that we can use with all completion services and functions!
python/semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion_base.py (outdated; resolved)
@@ -269,9 +276,25 @@ def _chat_message_content_to_dict(self, message: "ChatMessageContent") -> dict[s

async def _send_chat_request(self, settings: OpenAIChatPromptExecutionSettings) -> list["ChatMessageContent"]:
    """Send the chat request."""
    span = model_diagnostics.start_completion_activity(settings.ai_model_id, MODEL_PROVIDER_NAME,
We should put all of this, and the bottom part, into a single decorator so that we can universally apply it across all connectors!
I don't know how feasible or convoluted this will be to achieve, as there will be differences between connectors. I am checking right now.
Does this need to be part of this PR, or can it be in a follow-up one?
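For illustration, the refactor discussed above might look like the sketch below. The helper name `start_completion_activity`, the event name, and the fake span class are assumptions standing in for the real model_diagnostics helpers in this PR; this is not the final API.

```python
import asyncio
import functools
from typing import Any, Awaitable, Callable

# Minimal stand-ins so the sketch runs on its own; in the PR these would be
# the real helpers from model_diagnostics (names here are assumptions).
class _FakeSpan:
    def __init__(self) -> None:
        self.events: list[tuple[str, dict]] = []
        self.ended = False

    def add_event(self, name: str, attributes: dict) -> None:
        self.events.append((name, attributes))

    def end(self) -> None:
        self.ended = True


def start_completion_activity(ai_model_id: str, provider: str) -> _FakeSpan:
    return _FakeSpan()


def trace_chat_completion(model_provider_name: str):
    """Wrap a connector's _send_chat_request-style coroutine with tracing,
    so the span bookkeeping lives in one place for all connectors."""

    def decorator(func: Callable[..., Awaitable[Any]]):
        @functools.wraps(func)
        async def wrapper(self, settings, *args, **kwargs):
            span = start_completion_activity(settings.ai_model_id, model_provider_name)
            try:
                completions = await func(self, settings, *args, **kwargs)
            except Exception:
                span.end()  # end the span even when the request fails
                raise
            span.add_event("gen_ai.content.completion", {"completion": str(completions)})
            span.end()
            return completions

        return wrapper

    return decorator


# Tiny demo connector applying the decorator.
class _Settings:
    ai_model_id = "demo-model"


class _Connector:
    @trace_chat_completion("demo-provider")
    async def _send_chat_request(self, settings):
        return ["hello"]


result = asyncio.run(_Connector()._send_chat_request(_Settings()))
print(result)
```

Connector-specific differences (streaming, different settings classes) could be handled with parameters on the decorator rather than per-connector copies of the span logic.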
{COMPLETION_EVENT_COMPLETION: _messages_to_openai_format(completions)})


def _messages_to_openai_format(chat_history: list[ChatMessageContent]) -> str:
Ideally this generic function should not contain OpenAI-specific logic...
This is needed since the OTel GenAI semantic convention recommends the OpenAI format: https://github.com/open-telemetry/semantic-conventions/blob/main/docs/gen-ai/gen-ai-spans.md#events
Then let's name it _messages_to_otel_format!
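A sketch of what the renamed helper could look like: it serializes messages into the OpenAI-style role/content shape that the OTel GenAI semantic convention recommends. The ChatMessageContent class here is a simplified stand-in for the real semantic_kernel class.

```python
import json
from dataclasses import dataclass


# Simplified stand-in for semantic_kernel's ChatMessageContent;
# the real class carries more fields than role and content.
@dataclass
class ChatMessageContent:
    role: str
    content: str


def _messages_to_otel_format(chat_history: list[ChatMessageContent]) -> str:
    """Serialize messages in the OpenAI-style shape recommended by the
    OTel GenAI semantic convention for gen_ai span events."""
    return json.dumps(
        [{"role": m.role, "content": m.content} for m in chat_history]
    )


payload = _messages_to_otel_format([ChatMessageContent(role="user", content="Hi")])
print(payload)
```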
…chat_completion_base.py Co-authored-by: Tao Chen <[email protected]>
I see naming with both diagnostics (the env setting and the model_diagnostics.py file) and tracing; let's make that more consistent! And we really need to create a decorator out of the code inside openai_chat_completion_base!
@@ -279,9 +288,39 @@ def _chat_message_content_to_dict(self, message: "ChatMessageContent") -> dict[s

async def _send_chat_request(self, settings: OpenAIChatPromptExecutionSettings) -> list["ChatMessageContent"]:
    """Send the chat request."""
    response = await self._send_request(request_settings=settings)
    span = start_completion_activity(
This still needs to be put into a decorator; I don't like this code here!
@@ -301,9 +340,18 @@ async def _send_chat_stream_request(
# endregion
# region content creation


def _input_messages_to_prompt(self, messages: list[dict[str, Any]]) -> str:
Same for this: we have multiple ways of creating a list of dicts from messages, and the settings already have a list of dicts with the messages here; I would argue that that is the list we want to log!
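If the list of dicts already on the execution settings is the one to log, the prompt event could be serialized directly from it, with no extra conversion helper. The `messages` attribute and the fake settings class below are assumptions for illustration, not the real OpenAIChatPromptExecutionSettings API.

```python
import json


# Hypothetical: execution settings already carry the request messages as a
# list of dicts, so the prompt event payload can come straight from them.
class _FakeSettings:
    messages = [
        {"role": "system", "content": "You are helpful."},
        {"role": "user", "content": "Hi"},
    ]


def _prompt_event_payload(settings: _FakeSettings) -> str:
    """Serialize the messages already on the settings object for the span event."""
    return json.dumps(settings.messages)


payload = _prompt_event_payload(_FakeSettings())
print(payload)
```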
@@ -0,0 +1,160 @@
# Copyright (c) Microsoft. All rights reserved.
This probably should also move into the utils/tracing folder.
Motivation and Context
We want observability into the usage of Semantic Kernel (SK).
Description
Add OpenTelemetry to Python SK
Contribution Checklist