
OLLAMA Python, Why can't I use all the threads on the CPU? #167

Open
HamzaYslmn opened this issue May 23, 2024 · 1 comment

Comments

@HamzaYslmn

HamzaYslmn commented May 23, 2024

(screenshot attached)

I analyzed the problem in depth. I get faster responses when I use the terminal; something is wrong on the Python side. It only uses the E-cores, so it is too slow.

My hardware:

i9-13980HX
24 cores / 32 threads
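One thing worth checking is whether the request sets the thread count explicitly instead of relying on the server default. A minimal sketch, assuming the `llama3` model name for illustration and that the `options` parameter of `ollama.chat` accepts the standard Ollama `num_thread` model option:

```python
import ollama

# Ask the Ollama server to use an explicit thread count for this request.
# num_thread is a standard Ollama model option; 32 here is a hypothetical
# value matching the 32 hardware threads above -- tune it for your CPU.
response = ollama.chat(
    model="llama3",  # assumed model name for illustration
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    options={"num_thread": 32},
)

print(response["message"]["content"])
```

If the terminal run is already pinned to all P-cores while Python lands on E-cores, an explicit `num_thread` (or pinning the Python process affinity at the OS level) may narrow the gap.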

@HamzaYslmn
Author

When I test with the GPU, there is about a 30% speed difference between running from Python and running from the terminal. Why is this happening?
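A simple way to quantify the gap is to time the same prompt from Python and compare the tokens-per-second figure against the terminal run. A minimal sketch, assuming the `llama3` model name and that the non-streaming response exposes the server's `eval_count`/`eval_duration` metadata:

```python
import time
import ollama

prompt = "Explain the difference between P-cores and E-cores in two sentences."

start = time.perf_counter()
response = ollama.generate(model="llama3", prompt=prompt)  # assumed model name
elapsed = time.perf_counter() - start

# eval_count / eval_duration come from the Ollama server's response metadata;
# eval_duration is reported in nanoseconds.
tokens_per_second = response["eval_count"] / (response["eval_duration"] / 1e9)

print(f"wall time: {elapsed:.2f}s, generation speed: {tokens_per_second:.1f} tokens/s")
```

Comparing this tokens/s number with `ollama run --verbose` output from the terminal would show whether the slowdown is in generation itself or in the Python client overhead around it.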
