NAME

Langertha::Engine::OllamaOpenAI - Ollama via OpenAI-compatible API

VERSION

version 0.202

SYNOPSIS

use Langertha::Engine::OllamaOpenAI;

# Direct construction (url with /v1 suffix is required)
my $ollama_oai = Langertha::Engine::OllamaOpenAI->new(
    url   => 'http://localhost:11434/v1',
    model => 'llama3.3',
);

print $ollama_oai->simple_chat('Hello!');

# Streaming
$ollama_oai->simple_chat_stream(sub {
    print shift->content;
}, 'Tell me about Perl');

# Preferred: create via Ollama's openai() method (appends /v1 automatically)
use Langertha::Engine::Ollama;

my $ollama = Langertha::Engine::Ollama->new(
    url   => 'http://localhost:11434',
    model => 'llama3.3',
);
my $oai = $ollama->openai;
print $oai->simple_chat('Hello via OpenAI format!');

DESCRIPTION

This module provides access to Ollama's OpenAI-compatible /v1 API endpoint. It composes Langertha::Role::OpenAICompatible to speak the standard OpenAI wire format.

url is required and must include the /v1 path prefix (e.g., http://localhost:11434/v1). When using "openai" in Langertha::Engine::Ollama, the /v1 suffix is appended automatically. The API key defaults to 'ollama' since Ollama does not require authentication.

Supports chat completions with SSE streaming, embeddings (default embedding model: mxbai-embed-large), MCP tool calling, and dynamic model listing. Transcription is not supported.
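As a sketch of the embeddings capability mentioned above: the example below assumes Langertha's common engine interface exposes an embedding method returning an array reference of floats, and that a local Ollama server is running on the default port. Method name and return shape are assumptions, not confirmed by this document.

```perl
use strict;
use warnings;
use Langertha::Engine::OllamaOpenAI;

# The /v1 suffix is required for direct construction.
my $engine = Langertha::Engine::OllamaOpenAI->new(
    url   => 'http://localhost:11434/v1',
    model => 'llama3.3',
);

# embedding() is assumed here from Langertha's common engine interface;
# the default embedding model is mxbai-embed-large.
my $vector = $engine->embedding('The quick brown fox');
print scalar(@$vector), " dimensions\n";
```

This requires a running Ollama instance with the embedding model pulled (e.g. `ollama pull mxbai-embed-large`).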

For the native Ollama API with keep_alive, seed, context_size, NDJSON streaming, and Hermes tool calling, use Langertha::Engine::Ollama.

THIS API IS WORK IN PROGRESS

SEE ALSO

Langertha::Engine::Ollama - native Ollama API engine

Langertha::Role::OpenAICompatible - shared OpenAI-compatible behavior

SUPPORT

Issues

Please report bugs and feature requests on GitHub at https://github.com/Getty/langertha/issues.

CONTRIBUTING

Contributions are welcome! Please fork the repository and submit a pull request.

AUTHOR

Torsten Raudssus <torsten@raudssus.de> https://raudss.us/

COPYRIGHT AND LICENSE

This software is copyright (c) 2026 by Torsten Raudssus.

This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself.