openmolt / OpenAIProvider

Class: OpenAIProvider

Defined in: providers/OpenAIProvider.ts:23

OpenAI LLM provider. Supports chat-completion models (GPT-4o, GPT-3.5, etc.) and reasoning models (o1, o3 family).

Extends

BaseProvider
Constructors

Constructor

new OpenAIProvider(apiKey, baseUrl?): OpenAIProvider

Defined in: providers/OpenAIProvider.ts:30

Parameters

apiKey: string — OpenAI API key.

baseUrl?: string — Override for the API base URL (useful for proxies / Azure).

Returns

OpenAIProvider

Overrides

BaseProvider.constructor
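A minimal construction sketch. The import path from the `openmolt` package root is an assumption; adjust it to wherever the package actually exports the class.

```typescript
import { OpenAIProvider } from "openmolt"; // import path is an assumption

// Standard OpenAI endpoint: only the API key is required.
const provider = new OpenAIProvider(process.env.OPENAI_API_KEY!);

// Proxy or Azure-compatible deployment: pass an explicit base URL.
const proxied = new OpenAIProvider(
  process.env.OPENAI_API_KEY!,
  "https://my-proxy.example.com/v1", // hypothetical endpoint
);
```

When baseUrl is omitted, requests go to the default OpenAI API endpoint.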

Methods

generate()

generate(systemPrompt, userMessage, model, config?): Promise<LLMResponse>

Defined in: providers/OpenAIProvider.ts:39

Send a prompt to the underlying LLM and return a normalised response.

Parameters

systemPrompt: string — The Maestro system prompt (static across iterations).

userMessage: string — The per-iteration input-state message.

model: string — Provider-specific model identifier (e.g. gpt-4o).

config?: ModelConfig — Optional model-level tuning parameters.

Returns

Promise<LLMResponse>

Overrides

BaseProvider.generate
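A usage sketch for generate(). The import path and the fields accepted by ModelConfig are assumptions not confirmed by this page; the call itself requires a valid API key at runtime.

```typescript
import { OpenAIProvider } from "openmolt"; // import path is an assumption

const provider = new OpenAIProvider(process.env.OPENAI_API_KEY!);

// systemPrompt stays constant across iterations; userMessage carries
// the per-iteration input state.
const response = await provider.generate(
  "You are Maestro, an orchestration agent.",
  "Current input state: step 1 of 3.",
  "gpt-4o",
  { temperature: 0.2 }, // ModelConfig fields are an assumption
);

console.log(response); // an LLMResponse, normalised across providers
```

Because the method returns a Promise<LLMResponse>, it must be awaited (or chained with .then) inside an async context.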