
Custom client seems to not work as expected with Apple Silicon Mac #159

Open
zhengxinMLOG opened this issue May 20, 2024 · 2 comments

zhengxinMLOG commented May 20, 2024

I am trying ollama.Client to connect to a remote server for chat.

Server A: http://192.168.0.123:11434, Ollama installed with Docker, ollama-python v0.2.0
Local machine: M1 Max MacBook Pro, Ollama installed with Docker, ollama-python v0.2.0

When using the default ollama.chat, I can connect in both environments;
then I tried to connect to server A from the local machine using

ollama.Client(host='http://192.168.0.123:11434')

and I received a 503 error with an empty ('') message when running this code:

_client.chat(
    model="svjack/qwen1_5_14b:latest",
    messages=[
        {"role": "user", "content": prompt},
    ],
)
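
Putting the two fragments together, the full call from the Mac looks roughly like this; the prompt value is a placeholder, while the host and model name are the ones quoted above (a minimal sketch, not the exact script):

import ollama

# Client pointed at the remote server A rather than the default local host.
_client = ollama.Client(host='http://192.168.0.123:11434')

prompt = "hello"  # placeholder prompt; any prompt reproduces the 503

response = _client.chat(
    model="svjack/qwen1_5_14b:latest",
    messages=[
        {"role": "user", "content": prompt},
    ],
)
print(response["message"]["content"])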

At first, I thought it might have something to do with the network, so I did some tests.
I tried the same script directly on server A, and it ran OK.
Then I tried the Ollama web UI on the local machine, changed the default URL to server A, and it also ran OK. (The local machine and server A have different model lists, and the list changed correctly when I changed the config, which proves the web UI really was talking to server A.)
Then it occurred to me that the problem might have something to do with the Mac development environment,
so I tried the same script on a third machine, and it also ran OK.

Right now I can't say for sure that the issue has something to do with Apple Silicon, but it is the best guess I can make.
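
One more way to separate the Python client from the network would be to hit server A's /api/tags endpoint directly from the Mac with plain HTTP; a minimal sketch, assuming the requests package is installed:

import requests

# List the models on server A without going through ollama-python.
# A 200 response with the server's model list means plain HTTP from this
# machine reaches server A, which would point the 503 at the client side.
resp = requests.get("http://192.168.0.123:11434/api/tags", timeout=10)
print(resp.status_code)
print(resp.json())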

@mh-malekpour

How did you install Ollama with Docker on your MacBook?
Native Ollama does support Apple Silicon, but we can't run it on a Mac with Docker, since Docker doesn't support Metal Performance Shaders (MPS), and the only GPU library supported by Docker and visible as hardware to the image is the Nvidia GPU.

@zhengxinMLOG (Author)

Sorry, I made a mistake here.
I installed Ollama on the MacBook following the GitHub recommendation for Mac, i.e. the zip installer. (I ran open-webui using Docker, which is what I confused here.)
The point is, I can run ollama.chat() on my Mac properly, but it fails when it comes to ollama.Client():
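
A sketch of the two calls being contrasted (the local model name and the prompt are placeholders; the remote host and model name are the ones from this issue):

import ollama

# Module-level call: uses the default local host and works on the Mac.
local_reply = ollama.chat(
    model="llama3",  # placeholder: any model already pulled locally
    messages=[{"role": "user", "content": "hello"}],
)

# Explicit client pointed at server A: this is the call that returns the 503.
client = ollama.Client(host="http://192.168.0.123:11434")
remote_reply = client.chat(
    model="svjack/qwen1_5_14b:latest",
    messages=[{"role": "user", "content": "hello"}],
)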
