vllm

package
v1.19.6
Published: Mar 4, 2025 License: Apache-2.0 Imports: 5 Imported by: 0

Documentation

Overview

Package vllm is the vllm LLM provider.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Provider

type Provider struct {
	// APIEndpoint is the vllm OpenAI-compatible API endpoint
	APIEndpoint string
	// APIKey is the API key for vllm
	APIKey string
	// Model is the model for vllm
	// e.g. "meta-llama/Llama-3.2-7B-Instruct"
	Model string
	// contains filtered or unexported fields
}

Provider is the provider for vllm

func NewProvider

func NewProvider(apiEndpoint string, apiKey string, model string) *Provider

NewProvider creates a new vllm AI provider.
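As a hedged sketch of how the constructor is used: the snippet below re-declares the exported surface of `Provider` locally (a stand-in, since the real import path is not shown on this page), and the `"vllm"` return value of `Name` is an assumption rather than something this page states.

```go
package main

import "fmt"

// Provider is a local stand-in mirroring the exported fields
// of the package's Provider type.
type Provider struct {
	APIEndpoint string // vllm OpenAI-compatible API endpoint
	APIKey      string // API key for vllm
	Model       string // e.g. "meta-llama/Llama-3.2-7B-Instruct"
}

// NewProvider mirrors the package constructor's signature:
// NewProvider(apiEndpoint, apiKey, model string) *Provider.
func NewProvider(apiEndpoint, apiKey, model string) *Provider {
	return &Provider{APIEndpoint: apiEndpoint, APIKey: apiKey, Model: model}
}

// Name mirrors the Name method; "vllm" is an assumed return value.
func (p *Provider) Name() string { return "vllm" }

func main() {
	p := NewProvider(
		"http://127.0.0.1:8000/v1",
		"token-abc123",
		"meta-llama/Llama-3.2-7B-Instruct",
	)
	fmt.Println(p.Name(), p.Model)
}
```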

func (*Provider) GetChatCompletions

func (p *Provider) GetChatCompletions(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (openai.ChatCompletionResponse, error)

GetChatCompletions implements ai.LLMProvider.

func (*Provider) GetChatCompletionsStream

func (p *Provider) GetChatCompletionsStream(ctx context.Context, req openai.ChatCompletionRequest, _ metadata.M) (provider.ResponseRecver, error)

GetChatCompletionsStream implements ai.LLMProvider.

func (*Provider) Name

func (p *Provider) Name() string

Name returns the name of the provider.
