
Python: Add OpenTelemetry to Python SK #6914

Open · glahaye wants to merge 11 commits into main
Conversation

@glahaye (Contributor) commented Jun 23, 2024

Motivation and Context

We want observability into the usage of SK.

Description

Add OpenTelemetry to Python SK

Contribution Checklist

@glahaye requested a review from a team as a code owner (Jun 23, 2024 23:12)
@markwallace-microsoft added the python (Pull requests for the Python Semantic Kernel) label (Jun 23, 2024)
@github-actions bot changed the title from "Add OpenTelemetry to Python SK" to "Python: Add OpenTelemetry to Python SK" (Jun 23, 2024)
@markwallace-microsoft (Member) commented Jun 23, 2024

Py3.10 Test Coverage

Python 3.10 Test Coverage Report

| File | Stmts | Miss | Cover | Missing |
| --- | --- | --- | --- | --- |
| semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion_base.py | 168 | 70 | 58% | 100, 104, 124, 149–153, 178, 186, 188, 192, 210–215, 233–261, 264–275, 290–297, 308–316, 332–339, 360, 368, 374–377, 389–392, 423 |
| TOTAL | 6681 | 773 | 88% | |

Python 3.10 Unit Test Overview

| Tests | Skipped | Failures | Errors | Time |
| --- | --- | --- | --- | --- |
| 1567 | 1 💤 | 0 ❌ | 0 🔥 | 26.008s ⏱️ |

python/poetry.lock (outdated review thread, resolved)
@eavanvalkenburg (Member) left a comment
Some work to do. The basics are there, but the most important thing is to refactor this into a decorator that we can use with all completion services and functions!
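
A minimal sketch of what such a decorator could look like, assuming the decorated coroutine receives the execution settings as its first argument and returns a list of completions; the name trace_chat_completion, the span name, and the gen_ai.* attribute keys are illustrative (loosely following the OTel GenAI semantic conventions), not the final SK API.

```python
# Sketch only: trace_chat_completion and the gen_ai.* keys below are assumptions,
# not the actual Semantic Kernel implementation.
import functools

from opentelemetry import trace

tracer = trace.get_tracer("semantic_kernel")


def trace_chat_completion(model_provider: str):
    """Wrap a connector's chat-request coroutine in an OpenTelemetry span."""

    def decorator(func):
        @functools.wraps(func)
        async def wrapper(self, settings, *args, **kwargs):
            with tracer.start_as_current_span(f"chat.completions {settings.ai_model_id}") as span:
                span.set_attribute("gen_ai.system", model_provider)
                span.set_attribute("gen_ai.request.model", settings.ai_model_id)
                try:
                    completions = await func(self, settings, *args, **kwargs)
                except Exception as exc:
                    span.record_exception(exc)
                    span.set_status(trace.StatusCode.ERROR, str(exc))
                    raise
                # Each connector would map its own response type onto the response attributes.
                span.set_attribute(
                    "gen_ai.response.finish_reasons",
                    [str(completion.finish_reason) for completion in completions],
                )
                return completions

        return wrapper

    return decorator
```

A connector could then decorate its _send_chat_request with @trace_chat_completion(MODEL_PROVIDER_NAME) instead of opening and closing the span inline.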

python/semantic_kernel/utils/model_diagnostics.py (three outdated review threads, resolved)
@@ -269,9 +276,25 @@ def _chat_message_content_to_dict(self, message: "ChatMessageContent") -> dict[s

async def _send_chat_request(self, settings: OpenAIChatPromptExecutionSettings) -> list["ChatMessageContent"]:
"""Send the chat request."""
span = model_diagnostics.start_completion_activity(settings.ai_model_id, MODEL_PROVIDER_NAME,
Member:
we should put all this and the bottom part into a single decorator so that we can universally apply this across all connectors!

@glahaye (Contributor, Author):
I don't know how feasible or convoluted this would be to achieve, since there will be differences between connectors. I am looking into it right now.

Does this need to be part of this PR, or can it be in a follow-up one?

{COMPLETION_EVENT_COMPLETION: _messages_to_openai_format(completions)})


def _messages_to_openai_format(chat_history: list[ChatMessageContent]) -> str:
Member:
Ideally this generic function should not have OpenAI-specific functions...

Contributor:

This is needed since the OTel GenAI semantic conventions recommend the OpenAI format: https://github.com/open-telemetry/semantic-conventions/blob/main/docs/gen-ai/gen-ai-spans.md#events

Member:
then let's name it _messages_to_otel_format!
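
As a rough illustration of what the renamed helper could do, assuming ChatMessageContent exposes role and content attributes (the actual SK type may differ):

```python
import json


def _messages_to_otel_format(chat_history: list["ChatMessageContent"]) -> str:
    """Serialize messages into the OpenAI-style role/content shape that the
    OTel GenAI semantic conventions recommend for prompt/completion events."""
    return json.dumps(
        [
            # str(message.role) assumes the role is an enum-like value.
            {"role": str(message.role), "content": message.content}
            for message in chat_history
        ]
    )
```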

python/semantic_kernel/utils/model_diagnostics.py (outdated review thread, resolved)
@eavanvalkenburg (Member) left a comment

I see naming with both diagnostics (the env setting and the model_diagnostics.py file) and tracing; let's make that more consistent! And we really need to create a decorator out of the code inside openai_chat_completion_base!

@@ -279,9 +288,39 @@ def _chat_message_content_to_dict(self, message: "ChatMessageContent") -> dict[s

async def _send_chat_request(self, settings: OpenAIChatPromptExecutionSettings) -> list["ChatMessageContent"]:
"""Send the chat request."""
response = await self._send_request(request_settings=settings)
span = start_completion_activity(
Member:
This still needs to be put into a decorator; I don't like this code here!

@@ -301,9 +340,18 @@ async def _send_chat_stream_request(
# endregion
# region content creation

def _input_messages_to_prompt(self, messages: list[dict[str, Any]]) -> str:
Member:
Same for this: we have multiple ways of creating a list of dicts from messages, and the settings already have a list of dicts with the messages here; I would argue that that is the list we want to log!
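
A small sketch of what logging that list could look like, assuming the prompt messages are available on the settings object as a list of dicts (the attribute name settings.messages is an assumption) and using the event name from the GenAI semantic conventions linked above:

```python
import json

from opentelemetry import trace


def _record_prompt_event(settings: "OpenAIChatPromptExecutionSettings") -> None:
    """Attach the prompt messages already carried by the settings to the current span."""
    span = trace.get_current_span()
    span.add_event(
        "gen_ai.content.prompt",
        # settings.messages is assumed to already hold the role/content dicts
        # that were sent to the service.
        attributes={"gen_ai.prompt": json.dumps(settings.messages)},
    )
```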

{COMPLETION_EVENT_COMPLETION: _messages_to_openai_format(completions)})


def _messages_to_openai_format(chat_history: list[ChatMessageContent]) -> str:
Member:
then let's name it _messages_to_otel_format!

@@ -0,0 +1,160 @@
# Copyright (c) Microsoft. All rights reserved.
Member:
this probably should also move into the utils/tracing folder.

Labels: python (Pull requests for the Python Semantic Kernel)
Projects: none yet
Linked issues: none yet
5 participants