NAME
Langertha::Engine::vLLM - Langertha engine for the vLLM inference server
VERSION
version 0.002
SYNOPSIS
  use Langertha::Engine::vLLM;

  my $vllm = Langertha::Engine::vLLM->new(
    url => $ENV{VLLM_URL},
    model => $ENV{VLLM_MODEL},
    system_prompt => 'You are a helpful assistant',
  );

  print($vllm->simple_chat('Say something nice'));
DESCRIPTION
THIS API IS WORK IN PROGRESS; interfaces may change without notice.
HOW TO INSTALL VLLM
https://docs.vllm.ai/en/latest/getting_started/installation.html
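The linked documentation covers installation in detail. As a rough sketch only (assuming a Linux machine with a recent Python and a supported GPU; the model name below is an arbitrary example, not a requirement of this module), installing and starting a server might look like:

```shell
# Install vLLM into the current Python environment.
# See the installation guide above for CUDA/hardware-specific instructions.
pip install vllm

# Start an OpenAI-compatible inference server on port 8000.
# Substitute whatever model you actually want to serve.
vllm serve Qwen/Qwen2.5-1.5B-Instruct --port 8000
```

With a server running locally, the VLLM_URL environment variable from the SYNOPSIS would then point at something like http://localhost:8000, and VLLM_MODEL at the model name you passed to the server.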
SUPPORT
Source Code
The code is open to the world, and available for you to hack on. Please feel free to browse it and play with it, or whatever. If you want to contribute patches, please send me a diff or prod me to pull from your repository :)
https://github.com/Getty/langertha
git clone https://github.com/Getty/langertha.git
AUTHOR
Torsten Raudssus <torsten@raudssus.de> https://raudss.us/
COPYRIGHT AND LICENSE
This software is copyright (c) 2024 by Torsten Raudssus.
This is free software; you can redistribute it and/or modify it under the same terms as the Perl 5 programming language system itself.