
[Bug]: "GET /v2/model/info HTTP/1.1" 500 Internal Server Error #4143

Open
Lh111d opened this issue Jun 12, 2024 · 2 comments
Labels
bug Something isn't working

Comments


Lh111d commented Jun 12, 2024

What happened?

A bug happened! When I use the UI to view the model info function, the message {"detail":{"error":"Invalid llm model list. llm_model_list=None"}} is displayed. How do I add an LLM model?

Relevant log output

"GET /v2/model/info HTTP/1.1" 500 Internal Server Error

Twitter / LinkedIn details

No response

Lh111d added the bug (Something isn't working) label on Jun 12, 2024
@ishaan-jaff (Contributor)

hi @Lh111d

  • What litellm version are you on?
  • How did you add models to the proxy — through the config.yaml, or directly on the UI?
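For reference, models are usually registered with the proxy through a `model_list` block in config.yaml. A minimal sketch — the model alias and the environment-variable key here are placeholders, not values from this thread:

```yaml
# config.yaml — minimal example of registering a model with the litellm proxy.
# model_name is the alias clients request; litellm_params tells the proxy
# which provider/model to route it to.
model_list:
  - model_name: gpt-3.5-turbo              # alias exposed by the proxy (placeholder)
    litellm_params:
      model: openai/gpt-3.5-turbo          # underlying provider/model (placeholder)
      api_key: os.environ/OPENAI_API_KEY   # read the key from the environment
```

If the proxy starts without ever loading a config like this, `llm_model_list` stays `None`, which matches the 500 error in the report above.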

@DennyDai

I had the same issue after updating to the latest version, and I believe it's related to 3c6bc03.

Once I manually add `command: ["--config", "/app/config.yaml", "--port", "4000"]` to my docker-compose.yml, it works.
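In docker-compose terms, that means passing the CLI arguments to the container explicitly so the entrypoint actually receives the config path. A sketch of where that `command` line goes — the service name, image tag, and mount path are assumptions, only the `command` value comes from this thread:

```yaml
# docker-compose.yml — pass --config and --port to the litellm entrypoint explicitly
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest     # image tag is an assumption
    volumes:
      - ./config.yaml:/app/config.yaml             # mount the proxy config into the container
    command: ["--config", "/app/config.yaml", "--port", "4000"]
    ports:
      - "4000:4000"
```

Without the explicit `command`, the container can start the proxy with no config loaded, leaving `llm_model_list=None` and producing the 500 on `/v2/model/info`.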

3 participants