
Feature request: ollama-js adapter to work locally #15

Open

otrebu opened this issue Feb 21, 2024 · 10 comments

Comments

@otrebu commented Feb 21, 2024

Create an ollama-js adapter to have the option of using ollama instead of OpenAI.

If you are not working on this, I might give it a go.

@davidkpiano (Member)

@otrebu I would gladly welcome you to give this a try - it should be as simple as copying from the OpenAI adapter.
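
For reference, the core of such an adapter boils down to wrapping ollama-js's chat API. A minimal sketch, assuming only chat completions are needed (the wrapper function and its signature are hypothetical, not the actual adapter interface):

```ts
import ollama from 'ollama';

// Hypothetical wrapper: an adapter would translate the agent's messages into
// a call against a locally running ollama server (default http://127.0.0.1:11434).
async function ollamaChatCompletion(
  model: string,
  messages: { role: 'system' | 'user' | 'assistant'; content: string }[]
): Promise<string> {
  const response = await ollama.chat({ model, messages });
  return response.message.content;
}
```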

@otrebu (Author) commented Feb 26, 2024

I started on it, but I have a question: do you have any suggestions on how to deal with tools? Ollama doesn't currently seem to support the same functionality/API for that.

@davidkpiano (Member)

> I started on it, but I have a question: do you have any suggestions on how to deal with tools? Ollama doesn't currently seem to support the same functionality/API for that.

Let's just stub this for now.
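
A minimal sketch of such a stub, assuming the adapter exposes a single tool-resolution hook (the function name is hypothetical):

```ts
// Hypothetical stub: ollama had no tool/function-calling API at the time,
// so the adapter simply reports that tools are unsupported.
function resolveToolCalls(): never {
  throw new Error('Tool calls are not supported by the ollama adapter yet');
}
```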

@otrebu (Author) commented Mar 17, 2024

I have done my best so far, but I'm a bit stuck on some TypeScript types.

What is the best thing to do if I need a little hand? Should I still open a pull request?

This is the commit on the fork: otrebu@334b077

I haven't tested the code yet.

@davidkpiano (Member)

Yes please @otrebu, open a pull request and I will gladly work on this with you.

@otrebu (Author) commented Mar 17, 2024

Amazing, thanks @davidkpiano.

Hopefully I created it correctly: #24 (my first one ever 😄)

@otrebu (Author) commented May 19, 2024

@davidkpiano did you have a chance to have a look? Otherwise I will try again after I rebase.

@davidkpiano (Member)

> @davidkpiano did you have a chance to have a look? Otherwise I will try again after I rebase.

I have recently added the Vercel AI SDK, which abstracts over multiple models. I will see if/how ollama can be used with that; it's a much more scalable solution than trying to build our own adapters 😅

@otrebu (Author) commented May 19, 2024

Oh ok, thank you! Sounds good.

@airtonix commented Jun 28, 2024

Yep, we don't need this. This exists:

```ts
import { createAgent } from '@statelyai/agent';
import { createOllama } from 'ollama-ai-provider';

const ollama = createOllama({ baseURL: process.env.MY_AMAZEBALLS_OLLAMA_BASEURL });

// found with `ollama list`
const myCustomOllamaModelName = process.env.MY_AMAZEBALLS_OLLAMA_MODELNAME || 'mixtral';

const model = ollama(myCustomOllamaModelName);

const agent = createAgent({
  name: 'my-foo-bar-agent',
  model,
  events: {
    // ... my amazing events
  },
});
```
