Add tools to api_server for InternLM2 model #1763
base: main
Conversation
Conflicts: lmdeploy/serve/openai/api_server.py
Please fix the unit-test error and the PR test error.
docs/en/serving/api_server.md (outdated):

```python
    }
]
messages = [{"role": "user", "content": "What's the weather like in Boston today?"}]
tool_choice={"type": "function", "function": {"name": "get_current_weather"}}
```
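The doc snippet above is only a fragment. A minimal, self-contained sketch of the full request payload it implies might look like the following; the tool `description` and `parameters` fields and the model name are illustrative assumptions, not taken from the PR:

```python
# Sketch of the payload implied by the doc snippet. The schema details
# below (description, parameters, model name) are assumptions for
# illustration only.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",  # assumed
        "parameters": {  # assumed JSON schema
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name, e.g. Boston"},
            },
            "required": ["location"],
        },
    },
}]
messages = [{"role": "user", "content": "What's the weather like in Boston today?"}]
payload = {
    "model": "internlm2-chat-7b",  # hypothetical model name
    "messages": messages,
    "tools": tools,
    # Per the maintainer's comment, tool_choice supports "none" or a
    # specific tool in JSON format.
    "tool_choice": {"type": "function", "function": {"name": "get_current_weather"}},
}
```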
Do we support `tool_choice`?
It only supports `none` or specifying a particular tool in JSON format.
```python
@@ -434,10 +451,21 @@ async def chat_completions_v1(request: ChatCompletionRequest,
        stop_words=request.stop,
        skip_special_tokens=request.skip_special_tokens)

    tools = None
    if request.tools and request.tool_choice != 'none':
```
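As a rough sketch of what the branch guarded by this condition might do (assumed shape only; the real handler in lmdeploy/serve/openai/api_server.py may differ):

```python
def select_tools(request_tools, tool_choice):
    """Return the tool definitions to forward to the chat template.

    Sketch only: assumes plain dicts rather than the pydantic models
    the real server uses.
    """
    if not request_tools or tool_choice == 'none':
        return None
    if isinstance(tool_choice, dict):
        # A specific tool was requested in JSON format; keep only it.
        wanted = tool_choice.get('function', {}).get('name')
        return [t for t in request_tools
                if t.get('function', {}).get('name') == wanted]
    return list(request_tools)
```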
Shall we return an error response to the client when `request.tool_choice` is not `None`?
I've handled it in the `check_request` function.
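A hypothetical sketch of such a check (not the actual `check_request` implementation), given that only `none` or a specific tool in JSON format is supported:

```python
def check_tool_choice(tool_choice):
    """Return an error message for unsupported tool_choice values, else None.

    Hypothetical helper, in the spirit of the check_request function
    mentioned above; not the actual lmdeploy code.
    """
    if tool_choice is None or tool_choice == 'none':
        return None
    if isinstance(tool_choice, dict) and tool_choice.get('type') == 'function':
        # A specific tool requested in JSON format is accepted.
        return None
    return ('Unsupported tool_choice: only "none" or a specific tool '
            'in JSON format is accepted.')
```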
> Do we support `tool_choice`?

No, we do not yet.
@Harold-lkk, could you help review this PR?
I got this prompt string when I used the code snippet in our doc, and only internlm2-chat1_8b returned the desired output:
It turns out that we have to pass