NAME

Langertha::Engine::AKI - AKI.IO native API

VERSION

version 0.202

SYNOPSIS

use Langertha::Engine::AKI;

my $aki = Langertha::Engine::AKI->new(
    api_key => $ENV{AKI_API_KEY},
    model   => 'llama3_8b_chat',
);

print $aki->simple_chat('Hello from Perl!');

# Get OpenAI-compatible API access
my $aki_openai = $aki->openai;
print $aki_openai->simple_chat('Hello via OpenAI format!');

DESCRIPTION

Provides access to AKI.IO's native API for running LLM inference. AKI.IO is a European AI model hub based in Germany; all inference runs on EU infrastructure, fully GDPR-compliant with no data leaving the EU.

The native API sends the API key in a "key" field of the JSON request body (not as an HTTP header). It supports synchronous chat, temperature and sampling controls, dynamic endpoint listing, and OpenAI-compatible access via "openai".

Streaming is not yet supported in the native API. For streaming, use the OpenAI-compatible endpoint via $aki->openai.

Get your API key at https://aki.io/ and set LANGERTHA_AKI_API_KEY.

THIS API IS WORK IN PROGRESS

api_key

The AKI.IO API key. If not provided, it is read from the LANGERTHA_AKI_API_KEY environment variable. The key is sent in a "key" field of the JSON request body (not as an HTTP header). Required.

top_k

top_k => 40

Top-K sampling parameter. At each generation step, sampling is restricted to the K highest-probability tokens.

top_p

top_p => 0.9

Top-P (nucleus) sampling parameter. At each generation step, sampling is restricted to the smallest set of tokens whose cumulative probability reaches P.

max_gen_tokens

max_gen_tokens => 1000

Maximum number of tokens to generate in the response.
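The sampling attributes above can be combined at construction time. A minimal sketch; the model name and parameter values are illustrative, not recommendations:

```perl
use Langertha::Engine::AKI;

# Sketch: constructor with the sampling controls documented above.
my $aki = Langertha::Engine::AKI->new(
    api_key        => $ENV{LANGERTHA_AKI_API_KEY},
    model          => 'llama3_8b_chat',
    temperature    => 0.7,
    top_k          => 40,    # sample from the 40 most probable tokens
    top_p          => 0.9,   # nucleus sampling threshold
    max_gen_tokens => 500,   # cap the response length
);

print $aki->simple_chat('Summarize nucleus sampling in one sentence.');
```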

list_models

my $endpoints = $aki->list_models;
my $endpoints = $aki->list_models(force_refresh => 1);

Fetches available endpoint names from the AKI.IO GET /api/endpoints API. Returns an ArrayRef of endpoint names. Results are cached for models_cache_ttl seconds (default: 3600).
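A short usage sketch, assuming an $aki object constructed as in the SYNOPSIS:

```perl
# Sketch: list the available endpoints (served from the cache
# for models_cache_ttl seconds after the first call).
my $endpoints = $aki->list_models;
print "$_\n" for @{ $endpoints };

# Bypass the cache, e.g. after new endpoints were deployed.
my $fresh = $aki->list_models(force_refresh => 1);
```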

endpoint_details

my $details = $aki->endpoint_details('llama3_8b_chat');
# Returns hashref with name, title, description, workers, parameter_description, etc.

Fetches detailed information about a specific endpoint from the AKI.IO GET /api/endpoints/{name} API. Returns worker info, model metadata, and parameter descriptions.
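A sketch of inspecting one endpoint, assuming an $aki object as in the SYNOPSIS. The hashref keys used here are the ones named above (name, title, description):

```perl
# Sketch: fetch and print metadata for a single endpoint.
my $details = $aki->endpoint_details('llama3_8b_chat');

printf "%s: %s\n", $details->{name}, $details->{title};
print  $details->{description}, "\n";
```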

chat_request

my $request = $aki->chat_request($messages, %extra);

Generates a native AKI.IO chat request. Posts to /api/call/{model} with messages encoded as JSON in the chat_context field. Includes key, temperature, top_k, top_p, max_gen_tokens, and wait_for_result parameters as configured. Returns an HTTP request object.

chat_response

my $response = $aki->chat_response($http_response);

Parses a native AKI.IO chat response. Dies with an API error message if success is false. Returns a Langertha::Response with content, model, timing, and raw.
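The two methods above can be combined into a manual request/response cycle, bypassing simple_chat. A sketch, assuming chat_request returns an HTTP::Request that LWP::UserAgent can send, and that messages are role/content hashrefs:

```perl
use LWP::UserAgent;

# Sketch: low-level round trip through the native API.
my $messages = [ { role => 'user', content => 'Hello from Perl!' } ];
my $request  = $aki->chat_request($messages);

my $ua            = LWP::UserAgent->new;
my $http_response = $ua->request($request);

# Dies with the API error message if the call did not succeed.
my $response = $aki->chat_response($http_response);
print $response->content;
```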

openai

my $oai = $aki->openai;
my $oai = $aki->openai(model => 'llama3-chat-8b');

Returns a Langertha::Engine::AKIOpenAI instance configured with the same API key, system prompt, and temperature. Supports streaming and MCP tool calling.

Note: The native AKI model name is not carried over automatically because the /v1 endpoint uses different model identifiers. If no model is passed, the AKIOpenAI default model is used and a warning is emitted. Pass model => '...' explicitly with a valid /v1 model name to suppress the warning.
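Passing the model explicitly avoids the default-model warning. A sketch; the model identifier is the one shown above and must be valid on the /v1 endpoint:

```perl
# Sketch: OpenAI-compatible access with an explicit /v1 model name,
# suppressing the default-model warning described above.
my $oai = $aki->openai(model => 'llama3-chat-8b');

print $oai->simple_chat('Hello via the OpenAI-compatible endpoint!');
```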

SEE ALSO

SUPPORT

Issues

Please report bugs and feature requests on GitHub at https://github.com/Getty/langertha/issues.

CONTRIBUTING

Contributions are welcome! Please fork the repository and submit a pull request.

AUTHOR

Torsten Raudssus <torsten@raudssus.de> https://raudss.us/

COPYRIGHT AND LICENSE

This software is copyright (c) 2026 by Torsten Raudssus.

This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself.