A bug about the web UI and the LLM model #1711

Open
idiotTest opened this issue Jun 24, 2024 · 0 comments

Describe the bug

I found a bug: the GPU memory usage shows that Xinference and the LLM model are still running, but I can't see the model in the web UI, and I can't use the LLM model either.

To Reproduce

At the beginning I ran qwen1.5-14b-chat on Linux with Xinference and vLLM. Then I ran a stress test to measure some indicators. The first run seemed fine, but when I ran it again I hit this bug.
The custom model registered by the user is also lost.

(Three screenshots attached.)
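For reference, here is a minimal sketch of how I check whether the server still reports the model, assuming the Xinference supervisor is on the default port 9997 and serves an OpenAI-compatible `GET /v1/models`; the exact response shape may differ between versions.

```python
import requests

# Assumption: local supervisor on the default port; adjust for your deployment.
BASE_URL = "http://127.0.0.1:9997"

def list_reported_models(base_url: str = BASE_URL):
    """Return whatever model entries the server reports via /v1/models."""
    resp = requests.get(f"{base_url}/v1/models", timeout=10)
    resp.raise_for_status()
    payload = resp.json()
    # OpenAI-style responses wrap the list in "data"; fall back to the raw payload.
    return payload.get("data", payload) if isinstance(payload, dict) else payload

if __name__ == "__main__":
    # While qwen1.5-14b-chat is running, it should show up in this list.
    print(list_reported_models())
```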

Some guesses
Could the GPU memory usage have exceeded the limit and caused the error?
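To check this guess, a quick sketch that records per-GPU memory before and after each stress-test run (assuming `nvidia-smi` is on PATH):

```python
import subprocess

def gpu_memory_mib():
    """Return (used_MiB, total_MiB) for each GPU as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [tuple(int(v) for v in line.split(",")) for line in out.strip().splitlines()]

if __name__ == "__main__":
    for idx, (used, total) in enumerate(gpu_memory_mib()):
        print(f"GPU {idx}: {used} / {total} MiB used")
```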

XprobeBot added the gpu label on Jun 24, 2024
XprobeBot modified the milestones: v0.12.2, v0.12.4 on Jun 24, 2024