
GPU Selection & Device Specification #161

Open
nycameraguy opened this issue May 20, 2024 · 1 comment
@nycameraguy

I would like to suggest a feature that would allow specifying which GPU or GPUs to run on directly within the Ollama Python library.

This feature is crucial in shared server environments across multiple GPUs and multiple users, as it allows each Jupyter notebook to run on the corresponding GPU without conflicts. Currently, specifying GPU usage in Ollama is somewhat complex. A streamlined method to assign tasks to specific GPUs directly inside the Python program would prevent conflicts and optimize workflow. Implementing this feature would significantly improve usability and align Ollama with other machine-learning frameworks.
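For context, a minimal sketch of what the current workaround tends to look like, assuming the server honors `CUDA_VISIBLE_DEVICES` and `OLLAMA_HOST` at startup (the `gpu_env` helper name is hypothetical, not part of the library): each user runs a separate `ollama serve` pinned to one GPU and points their client at it, rather than choosing a device from Python directly.

```python
import os

def gpu_env(gpu_index: int, port: int) -> dict:
    """Build an environment for an `ollama serve` process pinned to one GPU (sketch)."""
    env = dict(os.environ)
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_index)  # restrict which GPU the server sees
    env["OLLAMA_HOST"] = f"127.0.0.1:{port}"      # one server instance per GPU/user
    return env

# Usage (per notebook/user), sketched:
#   subprocess.Popen(["ollama", "serve"], env=gpu_env(1, 11435))
#   client = ollama.Client(host="http://127.0.0.1:11435")
```

A direct device argument on the client or request, as proposed, would replace this per-server juggling with a single parameter inside the Python program.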

Thank you for considering this suggestion. I would be happy to discuss further details if needed.

@pdevine pdevine transferred this issue from ollama/ollama May 20, 2024
@gileneusz

Would be great if this could be implemented ASAP.
