NAME
Langertha::Engine::LMStudioOpenAI - LM Studio via OpenAI-compatible API
VERSION
version 0.305
SYNOPSIS
use Langertha::Engine::LMStudioOpenAI;
my $lm_oai = Langertha::Engine::LMStudioOpenAI->new(
url => 'http://localhost:1234/v1',
model => 'qwen2.5-7b-instruct-1m',
);
print $lm_oai->simple_chat('Hello from OpenAI-compatible endpoint');
DESCRIPTION
Adapter for LM Studio's OpenAI-compatible local endpoint (/v1/chat/completions, /v1/models, /v1/embeddings).
Authentication is optional. If api_key (or the LANGERTHA_LMSTUDIO_API_KEY environment variable) is set, it is sent as a Bearer token in the Authorization header.
THIS API IS WORK IN PROGRESS
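A minimal sketch of the authentication behaviour described above. This assumes a local LM Studio server listening on the default port 1234; the token value is purely illustrative:

    use Langertha::Engine::LMStudioOpenAI;

    # Explicit bearer token, sent as "Authorization: Bearer ...":
    my $engine = Langertha::Engine::LMStudioOpenAI->new(
      url     => 'http://localhost:1234/v1',
      model   => 'qwen2.5-7b-instruct-1m',
      api_key => 'my-secret-token',
    );

    # Alternatively, omit api_key and rely on the environment fallback:
    # $ENV{LANGERTHA_LMSTUDIO_API_KEY} = 'my-secret-token';

If neither is set, requests are still sent; LM Studio's local endpoint does not require authentication by default.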
api_key
Optional bearer token for LM Studio's OpenAI-compatible endpoint. If not provided, the value is read from the LANGERTHA_LMSTUDIO_API_KEY environment variable, and otherwise defaults to the placeholder value lmstudio.
model
Chat model name; defaults to the placeholder value default. For real requests, set it to a model key actually loaded in LM Studio (for example qwen2.5-7b-instruct-1m).
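To discover valid model keys, you can query the /v1/models endpoint mentioned above directly. A sketch using only core Perl modules (HTTP::Tiny and JSON::PP), assuming LM Studio is running on the default port:

    use HTTP::Tiny;
    use JSON::PP qw( decode_json );

    my $res = HTTP::Tiny->new->get('http://localhost:1234/v1/models');
    die "Request failed: $res->{status}\n" unless $res->{success};

    # OpenAI-compatible shape: { "object": "list", "data": [ { "id": "..." }, ... ] }
    my $list = decode_json($res->{content});
    print "$_->{id}\n" for @{ $list->{data} };

Any id printed here can be passed as the model attribute.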
SEE ALSO
Langertha::Engine::LMStudio - Native LM Studio API
Langertha::Engine::OpenAIBase - Base class for OpenAI-compatible engines
SUPPORT
Issues
Please report bugs and feature requests on GitHub at https://github.com/Getty/langertha/issues.
CONTRIBUTING
Contributions are welcome! Please fork the repository and submit a pull request.
AUTHOR
Torsten Raudssus <torsten@raudssus.de> https://raudss.us/
COPYRIGHT AND LICENSE
This software is copyright (c) 2026 by Torsten Raudssus.
This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself.