NAME

Langertha::Engine::Ollama - Ollama API

VERSION

version 0.305

SYNOPSIS

use Langertha::Engine::Ollama;

my $ollama = Langertha::Engine::Ollama->new(
    url           => $ENV{OLLAMA_URL},
    model         => 'llama3.3',
    system_prompt => 'You are a helpful assistant',
    context_size  => 2048,
    temperature   => 0.5,
);

print $ollama->simple_chat('Say something nice');

my $embedding = $ollama->embedding($content);

# Get OpenAI-compatible API access to Ollama
my $ollama_openai = $ollama->openai;

# List available models
my $models = $ollama->simple_tags;

# Show running models
my $running = $ollama->simple_ps;

DESCRIPTION

Provides access to Ollama, which runs large language models locally. Ollama supports many popular open-source models including llama3.3 (default), qwen2.5, deepseek-coder-v2, mixtral, and mxbai-embed-large (default embedding model).

Supports chat, embeddings, streaming, MCP tool calling (OpenAI-compatible format), and an OpenAI-compatible API via "openai". Not all models support tool calling; known working models include qwen3:8b and llama3.2:3b.

For Hermes-format tool calling in models without API-level tool support, compose Langertha::Role::HermesTools. See Langertha::Role::HermesTools for details.

THIS API IS WORK IN PROGRESS

openai

my $oai = $ollama->openai;
my $oai = $ollama->openai(model => 'different_model');

Returns a Langertha::Engine::OllamaOpenAI instance configured for Ollama's /v1 OpenAI-compatible endpoint, inheriting the current model, embedding model, system prompt, and temperature settings. Supports streaming, embeddings, and MCP tool calling.
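The returned engine can then be used like the native engine, a minimal sketch (assumes a running Ollama server and that OllamaOpenAI provides the same simple_chat interface as the native engine):

    my $oai = $ollama->openai;

    # Same conversational interface, routed through Ollama's /v1 endpoint
    print $oai->simple_chat('Say something nice');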

new_openai

my $oai = Langertha::Engine::Ollama->new_openai(
    url   => 'http://localhost:11434',
    model => 'llama3.3',
    tools => \@mcp_tools,
);

Class method. Constructs a native Ollama engine and immediately returns a Langertha::Engine::OllamaOpenAI instance from its openai() method. The optional tools list is passed through to openai().

json_format

When set to a true value, passes format => 'json' to the Ollama API, requesting JSON-formatted output from the model. Defaults to 0.
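A sketch of requesting and decoding JSON output (assumes a running Ollama server; uses the core JSON::PP module for decoding):

    use JSON::PP qw( decode_json );

    my $json_ollama = Langertha::Engine::Ollama->new(
        url         => $ENV{OLLAMA_URL},
        model       => 'llama3.3',
        json_format => 1,
    );

    # With format => 'json' set, the reply should be a JSON document
    my $reply = $json_ollama->simple_chat('List three colors as a JSON array');
    my $data  = decode_json($reply);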

tags

my $request = $ollama->tags;

Returns an HTTP request object for the Ollama GET /api/tags endpoint. Execute it with simple_tags or pass it to an async HTTP client.

simple_tags

my $models = $ollama->simple_tags;
# Returns: [{name => 'llama3.3', model => 'llama3.3', ...}, ...]

Synchronously fetches and returns the list of locally available models from the Ollama /api/tags endpoint. Also updates the engine's models list.
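For instance, to print the name of every locally installed model (a sketch; assumes a reachable Ollama instance):

    for my $model (@{ $ollama->simple_tags }) {
        print $model->{name}, "\n";
    }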

ps

my $request = $ollama->ps;

Returns an HTTP request object for the Ollama GET /api/ps endpoint which lists currently loaded (running) models.

simple_ps

my $running = $ollama->simple_ps;
# Returns: [{name => 'llama3.3', ...}, ...]

Synchronously fetches and returns the list of models currently loaded in Ollama's memory from the /api/ps endpoint.

list_models

my $model_ids = $ollama->list_models;
my $models    = $ollama->list_models(full => 1);
my $models    = $ollama->list_models(force_refresh => 1);

Fetches locally available models from Ollama via "simple_tags" with caching. Returns an ArrayRef of model name strings by default, or full model objects when full => 1 is passed. Results are cached for models_cache_ttl seconds (default: 3600).
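The cache makes repeated calls cheap, while force_refresh bypasses it. A sketch of the three modes:

    my $names = $ollama->list_models;                      # hits /api/tags, caches result
    my $again = $ollama->list_models;                      # served from cache while fresh
    my $fresh = $ollama->list_models(force_refresh => 1);  # re-fetches from Ollama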

SEE ALSO

SUPPORT

Issues

Please report bugs and feature requests on GitHub at https://github.com/Getty/langertha/issues.

CONTRIBUTING

Contributions are welcome! Please fork the repository and submit a pull request.

AUTHOR

Torsten Raudssus <torsten@raudssus.de> https://raudss.us/

COPYRIGHT AND LICENSE

This software is copyright (c) 2026 by Torsten Raudssus.

This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself.