Data source: OpenAI
1 Presentation
OpenAI provides large language models (LLMs) accessible via an API, enabling text generation, question answering, translation, summarization, and more.
The \Temma\Datasources\OpenAi data source provides easy access to OpenAI's Chat Completions API.
If you have correctly configured the OpenAI connection parameters, Temma automatically creates an object of type \Temma\Datasources\OpenAi, with which you can query language models. By convention, we'll assume that you've named this connection openai in the etc/temma.php file (see configuration documentation).
In controllers, the OpenAI connection is available by writing:
$openai = $this->openai;
In other objects managed by the dependency injection component, the OpenAI connection is accessible by writing:
$openai = $loader->dataSources->openai;
$openai = $loader->dataSources['openai'];
2 Configuration
To use the OpenAI API, you need an API key.
In the etc/temma.php file (see configuration documentation), you declare the DSN (Data Source Name) used to connect to OpenAI.
The OpenAI connection DSN is written as:
openai://chat/MODEL/API_KEY
where MODEL is the identifier of the language model to use (e.g. gpt-4o, gpt-4-turbo) and API_KEY is your OpenAI API key.
Configuration example in etc/temma.php:
<?php
return [
    'application' => [
        'dataSources' => [
            'openai' => 'openai://chat/gpt-4o/sk-proj-xxxxxxxxxxxxx',
        ],
    ],
];
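To make the DSN layout concrete, here is a small, framework-independent sketch that splits such a string into its three components. The parseOpenAiDsn function is ours, written for illustration only; it is not Temma's internal parsing code.

```php
<?php
// Hypothetical helper: splits an OpenAI DSN of the form
// openai://chat/MODEL/API_KEY into its components.
// Illustration of the DSN layout only, not Temma's own parser.
function parseOpenAiDsn(string $dsn): array
{
    if (!preg_match('#^openai://([^/]+)/([^/]+)/(.+)$#', $dsn, $matches))
        throw new InvalidArgumentException("Invalid OpenAI DSN: $dsn");
    return [
        'type'  => $matches[1],  // 'chat'
        'model' => $matches[2],  // e.g. 'gpt-4o'
        'key'   => $matches[3],  // the API key
    ];
}

$parts = parseOpenAiDsn('openai://chat/gpt-4o/sk-proj-xxxxxxxxxxxxx');
// $parts['model'] is 'gpt-4o'
```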
3 Unified calls
3.1 Array-like access
// send a prompt and get the response
$response = $openai['What is the capital of France?'];
3.2 Advanced method
// send a prompt and get the response
$response = $openai->read('What is the capital of France?');
// with a default value in case of error
$response = $openai->read(
    'Translate to French: Hello world',
    'Bonjour le monde'
);
// with a callback function in case of error
$response = $openai->read(
    'Translate to French: Hello world',
    function() {
        return 'Fallback value';
    }
);
4 Options
The read() and get() methods accept a third parameter $options, an associative array to configure the call:
- system: (string) System prompt defining the assistant's behavior.
- messages: (array) Array of previous messages for multi-turn conversation (see next section).
- temperature: (float) Sampling temperature, between 0 and 2. Lower values make responses more deterministic, higher values make them more random.
- max_tokens: (int) Maximum number of tokens in the response.
Example:
$response = $openai->read('Explain photosynthesis', null, [
    'system' => 'You are a biology teacher. Answer concisely.',
    'temperature' => 0.3,
    'max_tokens' => 500,
]);
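If several parts of an application build such option arrays, it can be convenient to centralize the construction and enforce the documented bounds. Here is a minimal sketch of such a helper; the buildOpenAiOptions function is ours, not part of Temma.

```php
<?php
// Hypothetical helper: builds an $options array for read(), clamping the
// values to the documented bounds. Not part of Temma; illustration only.
function buildOpenAiOptions(?string $system = null, ?float $temperature = null,
                            ?int $maxTokens = null): array
{
    $options = [];
    if ($system !== null)
        $options['system'] = $system;
    if ($temperature !== null)
        $options['temperature'] = max(0.0, min(2.0, $temperature));  // clamp to [0, 2]
    if ($maxTokens !== null && $maxTokens > 0)
        $options['max_tokens'] = $maxTokens;
    return $options;
}

$options = buildOpenAiOptions('You are a biology teacher.', 0.3, 500);
// $options can then be passed as the third parameter of read()
```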
5 Multi-turn conversation
It is possible to carry out multi-turn conversations by providing the history of previous messages via the messages option. Each message is an associative array containing the keys role (user or assistant) and content.
Example:
$response = $openai->read('And the capital of Italy?', null, [
    'system' => 'You are a geography assistant.',
    'messages' => [
        ['role' => 'user', 'content' => 'What is the capital of France?'],
        ['role' => 'assistant', 'content' => 'The capital of France is Paris.'],
    ],
]);
// $response contains "The capital of Italy is Rome."
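In an interactive application, this history has to be accumulated turn after turn. Here is a small, framework-independent sketch of such an accumulator; the ConversationHistory class is ours, shown only to illustrate the format expected by the messages option.

```php
<?php
// Hypothetical helper: accumulates the message history expected by the
// 'messages' option. Not part of Temma; illustration only.
class ConversationHistory
{
    private array $_messages = [];

    /** Records one question/answer exchange. */
    public function addTurn(string $question, string $answer): void
    {
        $this->_messages[] = ['role' => 'user', 'content' => $question];
        $this->_messages[] = ['role' => 'assistant', 'content' => $answer];
    }

    /** Returns the history in the format expected by the 'messages' option. */
    public function getMessages(): array
    {
        return $this->_messages;
    }
}

$history = new ConversationHistory();
$history->addTurn('What is the capital of France?', 'The capital of France is Paris.');
// $history->getMessages() can then be passed as the 'messages' option:
// $openai->read('And the capital of Italy?', null, [
//     'messages' => $history->getMessages(),
// ]);
```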