Hello :)
I'm trying to run Llama 3 locally on Ubuntu 20.04.
I installed everything and it all seems to be working.
Running `ollama run llama3:8b` lets me chat with the model, and running `ollama serve` seems to work.
I tried copying this code:
```python
import ollama

response = ollama.chat(model='llama3', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])
```
But I get an error:
```
httpx.ConnectError: [SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1131)
```
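In case it helps, here is a minimal check I ran. `WRONG_VERSION_NUMBER` usually means the client tried to speak TLS to a plain-HTTP port, and a proxy environment variable is a common cause. This sketch assumes the default Ollama endpoint `http://localhost:11434`; adjust if `ollama serve` reports a different address:

```python
import os
import urllib.request

# List any proxy variables that could make httpx attempt TLS
# against Ollama's plain-HTTP port.
proxies = {v: os.environ[v]
           for v in ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy",
                     "https_proxy", "ALL_PROXY", "all_proxy")
           if v in os.environ}
print("proxy variables set:", proxies or "none")

# Bypass any proxy and talk plain HTTP directly to the default port.
opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
try:
    print(opener.open("http://localhost:11434", timeout=5).read().decode())
except OSError as exc:  # server not running or unreachable
    print("could not reach Ollama:", exc)
```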
If you need any more information let me know.
Thank you :)