
A simple example of how to use Ollama and get a response in Laravel and Livewire. The logic lives in resources/views/pages/index.blade.php


Laravel Ollama

This app uses Laravel, Livewire, and Volt to create a simple interface that generates a response from an AI model using Ollama.

Laravel Ollama Screenshot

First, download and install Ollama. Then pull any model, for example:

ollama pull codellama

This application retrieves the response in Laravel by hitting the following endpoint, which Ollama exposes:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "codellama",
  "prompt": "Write me a function that outputs the fibonacci sequence"
}'
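A minimal sketch of how that call might look from the Laravel side, using the framework's HTTP client. This is an illustration, not the actual code from resources/views/pages/index.blade.php; the `generate` function name is hypothetical. Note that `/api/generate` streams newline-delimited JSON chunks by default, so the sketch sets `stream` to `false` to receive one complete JSON object instead:

```php
<?php

// Hypothetical helper illustrating the request above with Laravel's HTTP
// client. The real logic lives in resources/views/pages/index.blade.php.
use Illuminate\Support\Facades\Http;

function generate(string $prompt): string
{
    $response = Http::post('http://localhost:11434/api/generate', [
        'model'  => 'codellama',
        'prompt' => $prompt,
        // Without this, Ollama streams newline-delimited JSON chunks.
        'stream' => false,
    ]);

    // With streaming disabled, the generated text is returned in the
    // "response" field of the JSON body.
    return $response->json('response');
}
```

Calling `generate('Write me a function that outputs the fibonacci sequence')` would then return the model's full reply as a single string, which a Livewire/Volt component can render directly.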

For testing purposes you may also use the CLI to get a response:

ollama run codellama "Write me a function that outputs the fibonacci sequence"
