This PHP library provides a simple wrapper for the PerplexityAI API, allowing you to easily integrate PerplexityAI into your PHP projects.
- Easy integration with PerplexityAI API
- Supports all PerplexityAI API endpoints
- Streaming support for real-time responses in chat completions
- Utilizes PSR-17 and PSR-18 compliant HTTP clients and factories for making API requests
To use this library, you will need:
- PHP 8.1 or higher
- A PSR-17 HTTP factory implementation (e.g., guzzlehttp/psr7 or nyholm/psr7)
- A PSR-18 HTTP client implementation (e.g., guzzlehttp/guzzle or symfony/http-client)
You can install the library via Composer:
```bash
composer require softcreatr/php-perplexity-ai-sdk
```
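If your project does not already include PSR-17 and PSR-18 implementations, you can install them the same way. As one example setup (any compliant packages work equally well), Guzzle covers both requirements:

```bash
# guzzlehttp/guzzle provides a PSR-18 HTTP client; guzzlehttp/psr7 provides PSR-17 factories
composer require guzzlehttp/guzzle guzzlehttp/psr7
```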
First, include the library in your project:
```php
<?php

require_once 'vendor/autoload.php';
```
Then, create an instance of the `PerplexityAI` class with your API key, an HTTP client, an HTTP request factory, a stream factory, and a URI factory:
```php
use SoftCreatR\PerplexityAI\PerplexityAI;

$apiKey = 'your_api_key';

// Replace these lines with your chosen PSR-17 and PSR-18 compatible HTTP client and factories
$httpClient = new YourChosenHttpClient();
$requestFactory = new YourChosenRequestFactory();
$streamFactory = new YourChosenStreamFactory();
$uriFactory = new YourChosenUriFactory();

$pplx = new PerplexityAI($requestFactory, $streamFactory, $uriFactory, $httpClient, $apiKey);
```
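For illustration only, here is a minimal sketch of the snippet above using Guzzle, assuming you installed guzzlehttp/guzzle (a PSR-18 client) and guzzlehttp/psr7 (whose `HttpFactory` implements the PSR-17 request, stream, and URI factory interfaces). Any other compliant implementations can be wired in the same way:

```php
<?php

require_once 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Psr7\HttpFactory;
use SoftCreatR\PerplexityAI\PerplexityAI;

$apiKey = 'your_api_key';

// Guzzle's HttpFactory implements the PSR-17 request, stream, and URI factories,
// so the same instance can be passed for all three factory arguments.
$httpFactory = new HttpFactory();
$httpClient = new Client();

$pplx = new PerplexityAI($httpFactory, $httpFactory, $httpFactory, $httpClient, $apiKey);
```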
Now you can call any supported PerplexityAI API endpoint using the magic method `__call`:
```php
$response = $pplx->createChatCompletion([
    'model' => 'llama-3.1-sonar-small-128k-online',
    'messages' => [
        [
            'role' => 'system',
            'content' => 'Be precise and concise.',
        ],
        [
            'role' => 'user',
            'content' => 'How many stars are there in our galaxy?',
        ],
    ],
]);

// Process the API response
if ($response->getStatusCode() === 200) {
    $responseObj = json_decode($response->getBody()->getContents(), true);

    print_r($responseObj);
} else {
    echo "Error: " . $response->getStatusCode();
}
```
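If you only need the generated answer rather than the full payload, here is a minimal sketch, assuming the decoded response follows the OpenAI-compatible chat completion layout (the `choices[0]['message']['content']` path is an assumption here, analogous to the `choices[0]['delta']['content']` path used by the streaming callback below):

```php
// Assumption: the non-streaming response mirrors the OpenAI-compatible structure,
// with the assistant's reply at choices[0]['message']['content'].
$answer = $responseObj['choices'][0]['message']['content'] ?? null;

if ($answer !== null) {
    echo $answer, PHP_EOL;
}
```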
You can enable real-time streaming for chat completions:
```php
$streamCallback = static function ($data) {
    if (isset($data['choices'][0]['delta']['content'])) {
        echo $data['choices'][0]['delta']['content'];
    }
};

$pplx->createChatCompletion(
    [
        'model' => 'llama-3.1-sonar-small-128k-online',
        'messages' => [
            [
                'role' => 'user',
                'content' => 'Tell me a story about a brave knight.',
            ],
        ],
        'stream' => true,
    ],
    $streamCallback
);
```
For more details on how to use each endpoint, refer to the PerplexityAI API documentation and the examples provided in the repository.
The following method is currently supported:
- Create Chat Completion: `createChatCompletion(array $options = [])` (see the Create Chat Completion example in the repository)
For a detailed list of changes and updates, please refer to the CHANGELOG.md file. We adhere to Semantic Versioning and document notable changes for each release.
Streaming is now supported for real-time token generation in chat completions. Please make sure you are handling streams correctly using a callback, as demonstrated in the examples.
This library is licensed under the ISC License. See the LICENSE file for more information.
Maintainer: Sascha Greuel