NAME

Langertha::Engine::HuggingFace - HuggingFace Inference Providers API

VERSION

version 0.301

SYNOPSIS

use Langertha::Engine::HuggingFace;

my $hf = Langertha::Engine::HuggingFace->new(
    api_key => $ENV{HF_TOKEN},
    model   => 'Qwen/Qwen2.5-7B-Instruct',
);

print $hf->simple_chat('Hello from Perl!');

# Access many models through one API
my $llama = Langertha::Engine::HuggingFace->new(
    api_key => $ENV{HF_TOKEN},
    model   => 'meta-llama/Llama-3.3-70B-Instruct',
);

DESCRIPTION

Provides access to HuggingFace Inference Providers, a unified API gateway for open-source models hosted on the HuggingFace Hub. The endpoint at https://router.huggingface.co/v1 is fully OpenAI-compatible.
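
Because the endpoint is OpenAI-compatible, it can also be called directly without this module. A minimal sketch using only core Perl modules (HTTP::Tiny and JSON::PP), assuming the standard OpenAI /chat/completions request shape and an HF_TOKEN environment variable:

use HTTP::Tiny;
use JSON::PP qw( encode_json decode_json );

my $res = HTTP::Tiny->new->post(
    'https://router.huggingface.co/v1/chat/completions', {
        headers => {
            'Authorization' => "Bearer $ENV{HF_TOKEN}",
            'Content-Type'  => 'application/json',
        },
        content => encode_json({
            model    => 'Qwen/Qwen2.5-7B-Instruct',
            messages => [{ role => 'user', content => 'Hello from Perl!' }],
        }),
    },
);
die "HTTP $res->{status}: $res->{reason}" unless $res->{success};
print decode_json($res->{content})->{choices}[0]{message}{content};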

Model names use the org/model format (e.g. Qwen/Qwen2.5-7B-Instruct or meta-llama/Llama-3.3-70B-Instruct). No default model is set; the model must be specified explicitly.

Supports chat, streaming, and MCP tool calling. Embeddings and transcription are not supported.
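
Streaming can likewise be exercised against the raw endpoint. A simplified sketch, assuming the OpenAI-style stream parameter and server-sent events (the SSE parsing below ignores edge cases such as CRLF delimiters and multi-line data fields):

use HTTP::Tiny;
use JSON::PP qw( encode_json decode_json );

my $buf = '';
HTTP::Tiny->new->request( POST => 'https://router.huggingface.co/v1/chat/completions', {
    headers => {
        'Authorization' => "Bearer $ENV{HF_TOKEN}",
        'Content-Type'  => 'application/json',
    },
    content => encode_json({
        model    => 'Qwen/Qwen2.5-7B-Instruct',
        messages => [{ role => 'user', content => 'Count to three.' }],
        stream   => JSON::PP::true,
    }),
    # data_callback receives each chunk as it arrives
    data_callback => sub {
        $buf .= $_[0];
        while ($buf =~ s/^data: ([^\n]*)\n\n//) {   # one SSE event at a time
            my $event = $1;
            last if $event eq '[DONE]';
            print decode_json($event)->{choices}[0]{delta}{content} // '';
        }
    },
});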

Get your API token at https://huggingface.co/settings/tokens and set LANGERTHA_HUGGINGFACE_API_KEY in your environment.
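
Assuming the engine falls back to that environment variable when no api_key is passed (as the text above suggests), construction reduces to:

# In your shell first:
#   export LANGERTHA_HUGGINGFACE_API_KEY=hf_...

use Langertha::Engine::HuggingFace;

# No api_key argument needed if the environment variable is set
my $hf = Langertha::Engine::HuggingFace->new(
    model => 'Qwen/Qwen2.5-7B-Instruct',
);
print $hf->simple_chat('Hello!');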

THIS API IS WORK IN PROGRESS

SEE ALSO

SUPPORT

Issues

Please report bugs and feature requests on GitHub at https://github.com/Getty/langertha/issues.

CONTRIBUTING

Contributions are welcome! Please fork the repository and submit a pull request.

AUTHOR

Torsten Raudssus <torsten@raudssus.de> https://raudss.us/

COPYRIGHT AND LICENSE

This software is copyright (c) 2026 by Torsten Raudssus.

This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself.