Ollama-Laravel Package

Ollama-Laravel is a Laravel package that provides seamless integration with the Ollama API. It includes functionality for model management, prompt generation, format setting, and more, making it a good fit for developers who want to leverage the Ollama API in their Laravel applications.

If you use Laravel 10.x, please use version v1.0.5:

https://github.com/cloudstudio/ollama-laravel/releases/tag/v1.0.5

Installation

composer require cloudstudio/ollama-laravel

Configuration

php artisan vendor:publish --tag="ollama-laravel-config"

Published config file:

return [
    'model' => env('OLLAMA_MODEL', 'llama2'),
    'url' => env('OLLAMA_URL', 'http://127.0.0.1:11434'),
    'default_prompt' => env('OLLAMA_DEFAULT_PROMPT', 'Hello, how can I assist you today?'),
    'connection' => [
        'timeout' => env('OLLAMA_CONNECTION_TIMEOUT', 300),
    ],
];
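
The published config reads everything from environment variables, so the defaults can be overridden in your application's .env file. For example, using the keys from the config above:

OLLAMA_MODEL=llama2
OLLAMA_URL=http://127.0.0.1:11434
OLLAMA_DEFAULT_PROMPT="Hello, how can I assist you today?"
OLLAMA_CONNECTION_TIMEOUT=300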

Usage

Basic Usage

use Cloudstudio\Ollama\Facades\Ollama;

$response = Ollama::agent('You are a weather expert...')
    ->prompt('Why is the sky blue?')
    ->model('llama2')
    ->options(['temperature' => 0.8])
    ->stream(false)
    ->ask();
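
The shape of the value returned by ask() is not shown here; assuming it mirrors the decoded JSON of Ollama's /api/generate endpoint (worth verifying against your installed version), the generated text would typically be read like this:

// Assumption: $response is the decoded /api/generate payload,
// where the generated text lives under the "response" key.
$answer = $response['response'] ?? null;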

Vision Support

$response = Ollama::model('llava:13b')
    ->prompt('What is in this picture?')
    ->image(public_path('images/example.jpg')) 
    ->ask();

// "The image features a close-up of a person's hand, wearing bright pink fingernail polish and blue nail polish. In addition to the colorful nails, the hand has two tattoos – one is a cross and the other is an eye."

Chat Completion

$messages = [
    ['role' => 'user', 'content' => 'My name is Toni Soriano and I live in Spain'],
    ['role' => 'assistant', 'content' => 'Nice to meet you, Toni Soriano'],
    ['role' => 'user', 'content' => 'Where do I live?'],
];

$response = Ollama::agent('You know me really well!')
    ->model('llama2')
    ->chat($messages);

// "You mentioned that you live in Spain."

Show Model Information

$response = Ollama::model('llama2')->show();
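
Ollama's /api/show endpoint reports details such as the modelfile, parameters, and template; assuming show() returns that payload as a decoded array, you could inspect it like this:

// Assumption: $response is the decoded /api/show payload.
$template = $response['template'] ?? null;
$parameters = $response['parameters'] ?? null;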

Copy a Model

Ollama::model('llama2')->copy('NewModel');

Delete a Model

Ollama::model('llama2')->delete();

Generate Embeddings

$embeddings = Ollama::model('llama2')->embeddings('Your prompt here');
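
Ollama's /api/embeddings endpoint returns the vector under an "embedding" key; assuming the package hands back that decoded payload, a sketch of using it:

// Assumption: $embeddings is the decoded /api/embeddings payload.
$vector = $embeddings['embedding'] ?? [];

// e.g. persist the vector alongside the source text for similarity search.
echo count($vector) . ' dimensions';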

Testing

pest

Changelog, Contributing, and Security

Please see the changelog, contributing guidelines, and security policy in the repository for more information.

Credits

License

MIT License