Issues: xorbitsai/inference
ValueError: [address=127.0.0.1:37657, pid=3125985] User-specified max_model_len (4096) is greater than the derived max_model_len (seq_length=2048 or model_max_length=None in model's config.json). This may lead to incorrect model outputs or CUDA errors. Make sure the value is correct and within the model context size.
Labels: gpu, question (further information is requested)
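For context, the error above is raised when the requested max_model_len exceeds the context length derived from the model's config. A minimal sketch of one way to avoid it is to cap max_model_len at the derived value (2048 in the report) when launching through the xinference Python client; the endpoint URL, model name, engine choice, and the forwarding of max_model_len to the vLLM backend are assumptions for illustration, not details taken from the issue.

```python
# Sketch, not taken from the issue: launch a model with max_model_len capped
# at the context length reported in the error (seq_length=2048).
# Assumes a locally running xinference server and that extra keyword
# arguments are forwarded to the vLLM engine.
from xinference.client import Client

client = Client("http://127.0.0.1:9997")  # assumed local endpoint

model_uid = client.launch_model(
    model_name="glm4-chat",  # hypothetical model name; substitute your own
    model_engine="vllm",     # assumed engine selection parameter
    max_model_len=2048,      # keep within the model's derived context size
)
print(f"launched model: {model_uid}")
```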
[QUESTION] can't install "xinference[all]", No module named 'Cython'
Labels: question (further information is requested)
Downloading glm4-chat reports "Distant resource does not have a Content-Length"
Labels: question (further information is requested)
[QUESTION] How to load the mini cpmV 2.5 gguf model, which has two model files, with xinference?
Labels: question (further information is requested)