NAME
Langertha::Engine::Cerebras - Cerebras Inference API
VERSION
version 0.302
SYNOPSIS
use Langertha::Engine::Cerebras;
my $cerebras = Langertha::Engine::Cerebras->new(
api_key => $ENV{CEREBRAS_API_KEY},
model => 'llama-3.3-70b',
);
print $cerebras->simple_chat('Hello from Perl!');
DESCRIPTION
Provides access to Cerebras Inference, a platform built on custom wafer-scale chips that delivers extremely fast inference. Composes Langertha::Role::OpenAICompatible with the Cerebras endpoint (https://api.cerebras.ai/v1) and API key handling.
Available models include llama3.1-8b (the default), qwen-3-235b-a22b-instruct-2507, and gpt-oss-120b.
Supports chat, streaming, and MCP tool calling. Embeddings and transcription are not supported.
Get your API key at https://cloud.cerebras.ai/ and set LANGERTHA_CEREBRAS_API_KEY in your environment.
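A minimal complete script tying the points above together: the key is taken from the LANGERTHA_CEREBRAS_API_KEY environment variable rather than passed explicitly, and the model is the llama3.1-8b default listed above. The simple_chat call follows the SYNOPSIS; treat this as a sketch, not a verified invocation.

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use Langertha::Engine::Cerebras;

# Assumes LANGERTHA_CEREBRAS_API_KEY is set in the environment,
# so no api_key parameter is needed here.
die "LANGERTHA_CEREBRAS_API_KEY is not set\n"
    unless $ENV{LANGERTHA_CEREBRAS_API_KEY};

my $cerebras = Langertha::Engine::Cerebras->new(
    model => 'llama3.1-8b',   # the default model named above
);

# simple_chat sends a single user message and returns the reply text.
print $cerebras->simple_chat('Say hello in one short sentence.'), "\n";
```

Requests count against your Cerebras account, so run this only with a valid key from https://cloud.cerebras.ai/.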
THIS API IS WORK IN PROGRESS
SEE ALSO
https://status.cerebras.ai/ - Cerebras service status
https://inference-docs.cerebras.ai/ - Cerebras Inference documentation
Langertha::Role::OpenAICompatible - OpenAI API format role
SUPPORT
Issues
Please report bugs and feature requests on GitHub at https://github.com/Getty/langertha/issues.
CONTRIBUTING
Contributions are welcome! Please fork the repository and submit a pull request.
AUTHOR
Torsten Raudssus <torsten@raudssus.de> https://raudss.us/
COPYRIGHT AND LICENSE
This software is copyright (c) 2026 by Torsten Raudssus.
This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself.