llama2-7b replies are incorrect and repetitive #192
Comments
I think the issue may be that we're not phrasing our prompts the right way, since the model runs on-device and offline.
I downloaded the model from the link provided by the author (model download path: https://github.com/wangzhaode/mnn-llm/releases). I'm not sure whether it's a model-format issue; I'm going to try following the steps to convert the ONNX model to MNN myself.
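For reference, the ONNX-to-MNN conversion mentioned above is normally done with MNN's MNNConvert tool. A minimal sketch is below; the file names are placeholders, and the exact flags should be double-checked against the MNN version you built:

```shell
# Sketch: convert an exported ONNX model to MNN format.
# Assumes MNN was built with the converter enabled
# (-DMNN_BUILD_CONVERTER=ON) and llm.onnx is the exported model.
./MNNConvert -f ONNX \
    --modelFile llm.onnx \
    --MNNModel llm.mnn \
    --bizCode biz
```

If the converted model behaves differently from the prebuilt release, that would point to a packaging issue in the released weights rather than the runtime.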
That's what I'm doing at the moment; I can only narrow down the problem by elimination.
Could you let me know if you make any progress or solve the problem? Thanks.
Sorry, I don't have a solution for this at the moment...
I don't have a solution either. This is a really frustrating bug.
Hello, I compiled this project and downloaded the llama2-7b-mnn model, but when running it I hit the following problems:
incorrect replies, and the model repeatedly answering itself. What could be the cause? The command I used is: ./build/cli_demo ./llama2-7b-chat