
feat(anthropic): add instrumentation for Anthropic tool calling #1372

Open · wants to merge 2 commits into main

Conversation


@peachypeachyy commented Jun 21, 2024

Fixes #879

  • I have added tests that cover my changes.
  • If adding a new instrumentation or changing an existing one, I've added screenshots from some observability platform showing the change.
  • PR name follows conventional commits format: feat(instrumentation): ... or fix(instrumentation): ....
  • (If applicable) I have updated the documentation accordingly.

@nirga changed the title from "feat(Support Anthropic Tools): Adds Functionality for Supporting Anthropic Tools" to "feat(anthropic): add functionality for Anthropic tool calling" Jun 21, 2024
@nirga changed the title from "feat(anthropic): add functionality for Anthropic tool calling" to "feat(anthropic): add instrumentation for Anthropic tool calling" Jun 21, 2024
@nirga (Member) left a comment


Thanks for this @peachypeachyy! Can you also add instrumentation for the tool use that Anthropic returns?

@peachypeachyy (Author)

@nirga The tool_use finish reason in the response is already captured here:

When you say instrumentation of tool_use, do you mean setting response attributes for the chain of thought, i.e. setting response attributes to something like this:

{
    "type": "tool_use",
    "id": "toolu_01A09q90qw90lq917835lq9",
    "name": "get_weather",
    "input": {"location": "San Francisco, CA", "unit": "celsius"}
}
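
(For reference, flattening a block like that into span attributes could look roughly like the sketch below. The gen_ai.completion.{i}.tool_calls.{j}.* names are only illustrative assumptions, not attributes that exist in the semconv package today.)

import json

# Illustrative sketch only: flatten one tool_use content block into span
# attributes. The attribute names are assumptions, not established conventions.
def set_tool_use_attributes(span, block, completion_index=0, tool_index=0):
    prefix = f"gen_ai.completion.{completion_index}.tool_calls.{tool_index}"
    span.set_attribute(f"{prefix}.id", block["id"])
    span.set_attribute(f"{prefix}.name", block["name"])
    # Span attribute values must be primitives, so serialize the input dict.
    span.set_attribute(f"{prefix}.arguments", json.dumps(block["input"]))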

@peachypeachyy (Author)

@nirga if I have to set the response attributes as mentioned above, which span attributes do I use?

I ask because the only response attribute I have within the opentelemetry-semconv-ai package is LLM_RESPONSE_MODEL.

I do see LLM_RESPONSE_FINISH_REASON and LLM_RESPONSE_ID commented out, as seen below, but I'm not sure if I can uncomment and use them, or whether I should create new ones such as LLM_RESPONSE_TOOLS, because that code lives in the opentelemetry.semconv.ai package and not in our openllmetry codebase. Correct me if I am mistaken here. 😄

class SpanAttributes:
    # Semantic Conventions for LLM requests, this needs to be removed after
    # OpenTelemetry Semantic Conventions support Gen AI.
    # Issue at https://github.com/open-telemetry/opentelemetry-python/issues/3868
    # Refer to https://github.com/open-telemetry/semantic-conventions/blob/main/docs/gen-ai/llm-spans.md
    # for more detail for LLM spans from OpenTelemetry Community.
    LLM_SYSTEM = "gen_ai.system"
    LLM_REQUEST_MODEL = "gen_ai.request.model"
    LLM_REQUEST_MAX_TOKENS = "gen_ai.request.max_tokens"
    LLM_REQUEST_TEMPERATURE = "gen_ai.request.temperature"
    LLM_REQUEST_TOP_P = "gen_ai.request.top_p"
    LLM_PROMPTS = "gen_ai.prompt"
    LLM_COMPLETIONS = "gen_ai.completion"
    LLM_RESPONSE_MODEL = "gen_ai.response.model"
    LLM_USAGE_COMPLETION_TOKENS = "gen_ai.usage.completion_tokens"
    LLM_USAGE_PROMPT_TOKENS = "gen_ai.usage.prompt_tokens"
    # To be added
    # LLM_RESPONSE_FINISH_REASON = "gen_ai.response.finish_reasons"
    # LLM_RESPONSE_ID = "gen_ai.response.id"
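
(One possible stopgap, assuming those constants are still missing from the released package: resolve them with getattr and fall back to the string values from the commented-out lines above. This is only a sketch, not how the package actually exposes them.)

from opentelemetry.semconv.ai import SpanAttributes  # package path as referenced in this thread

# Sketch of a local fallback while the constants are commented out upstream;
# the fallback strings mirror the commented-out values and are not official yet.
LLM_RESPONSE_FINISH_REASON = getattr(
    SpanAttributes, "LLM_RESPONSE_FINISH_REASON", "gen_ai.response.finish_reasons"
)
LLM_RESPONSE_ID = getattr(SpanAttributes, "LLM_RESPONSE_ID", "gen_ai.response.id")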

@nirga (Member) commented Jun 22, 2024

@peachypeachyy this should be similar to OpenAI. You're right btw, we're missing these from the semantic conventions.

@peachypeachyy (Author)

@nirga done, I have captured spans for tool_use in the response as well. I used SpanAttributes.LLM_COMPLETIONS since that is what was used in the OpenAI code.

if response.get("stop_reason"):
set_span_attribute(span, f"{SpanAttributes.LLM_COMPLETIONS}.stop_reason", response.get("stop_reason"))

if response.get("content"):
@nirga (Member)

Hey @peachypeachyy, this is already happening in _set_span_completions. What I meant is editing the code there to specify the tool selection (if a tool was selected), following similar conventions to the ones we have in the OpenAI instrumentation.
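
(For illustration, the shape being described might look roughly like the sketch below; the tool_calls attribute names are assumed from OpenAI-style conventions and are not taken from the actual openllmetry code.)

import json
from opentelemetry.semconv.ai import SpanAttributes  # path as referenced earlier in this thread

# Rough sketch, not the actual _set_span_completions implementation: walk the
# Anthropic content blocks and record any selected tool under OpenAI-style
# tool_calls attribute names (assumed here, not verified against the codebase).
def _set_span_completions_sketch(span, response):
    prefix = f"{SpanAttributes.LLM_COMPLETIONS}.0"
    if response.get("stop_reason"):
        span.set_attribute(f"{prefix}.finish_reason", response.get("stop_reason"))
    tool_index = 0
    for block in response.get("content", []):
        if block.type == "text":
            span.set_attribute(f"{prefix}.content", block.text)
        elif block.type == "tool_use":
            tool_prefix = f"{prefix}.tool_calls.{tool_index}"
            span.set_attribute(f"{tool_prefix}.name", block.name)
            span.set_attribute(f"{tool_prefix}.arguments", json.dumps(block.input))
            tool_index += 1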

@peachypeachyy (Author) commented Jun 23, 2024

Hey @nirga, let me walk you through my thought process.

Following is the request being sent:

response = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    tools=[
        {
            "name": "get_weather",
            "description": "Get the current weather in a given location",
            "input_schema": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA"
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The unit of temperature, either 'celsius' or 'fahrenheit'"
                    }
                },
                "required": ["location"]
            }
        },
        {
            "name": "get_time",
            "description": "Get the current time in a given time zone",
            "input_schema": {
                "type": "object",
                "properties": {
                    "timezone": {
                        "type": "string",
                        "description": "The IANA time zone name, e.g. America/Los_Angeles"
                    }
                },
                "required": ["timezone"]
            }
        }
    ],
    messages=[
        {
            "role": "user",
            "content": "What is the weather like right now in New York City, NY?"
        }
    ]
)

The corresponding formatted response is:

{'content': [TextBlock(text='<thinking>\nThe get_weather function is relevant to answer this question, as it provides the current weather for a given location.\n\nIt requires the "location" parameter, which in this case is provided by the user as "New York City, NY". \n\nThe "unit" parameter is optional. The user did not specify a preference, so we can use the default.\n\nSince all required parameters are available, we can proceed with calling the get_weather function to get the current weather in New York City.\n</thinking>', type='text'),
             ToolUseBlock(id='toolu_013gp9gPBaYhPVLMeq4N34yU', input={'location': 'New York City, NY'}, name='get_weather', type='tool_use')],
 'id': 'msg_01H62BKGLJBD8zstmvW21Cp4',
 'model': 'claude-3-opus-20240229',
 'role': 'assistant',
 'stop_reason': 'tool_use',
 'stop_sequence': None,
 'type': 'message',
 'usage': Usage(input_tokens=748, output_tokens=165)}

In this case, even though we passed two functions in tools, get_weather and get_time, Claude Opus determined to use get_weather, as seen in the ToolUseBlock.

I have captured the tool being used within the ToolUseBlock (get_weather) using f"{SpanAttributes.LLM_COMPLETIONS}.{i}.name".

Now, which tool is used is governed by the tool_choice parameter, which can be set in the request. I have not set it in this example; by default the value is auto. Do you need me to set spans for this within the request, something like f"{SpanAttributes.LLM_REQUEST_FUNCTIONS}.{i}.tool_choice" == "auto"?
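
(If request-side logging is wanted, a hedged sketch could look like the following. The .tool_choice attribute name is hypothetical, kwargs stands for the messages.create keyword arguments, and the set_span_attribute helper and span are the ones from the snippet earlier in this PR. Since tool_choice applies to the whole request, the sketch records it once rather than per function.)

# Hypothetical sketch only: "tool_choice" under LLM_REQUEST_FUNCTIONS is not an
# existing convention. Anthropic's tool_choice is a dict such as {"type": "auto"}.
tool_choice = kwargs.get("tool_choice") or {"type": "auto"}
set_span_attribute(
    span,
    f"{SpanAttributes.LLM_REQUEST_FUNCTIONS}.tool_choice",
    tool_choice.get("type"),
)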

@nirga (Member)

@peachypeachyy can you look at _set_span_completions? It's also setting the same attributes you're setting here. You should only add the tool choice logging, similar to OpenAI.

Successfully merging this pull request may close these issues:

🚀 Feature: Support Anthropic's Tools