
Laravel AI Gateway

Unified AI gateway for Laravel with a fluent API, provider switching, normalized DTOs, and resilient logging. Supports OpenAI, Gemini, DeepSeek, Claude, and Ollama.

Features

  • Fluent API: AI::provider()->model()->systemPrompt()->userPrompt()->send().
  • Lower-level manager API for service-layer integration.
  • Standardized DTOs for requests, responses, and model metadata.
  • Provider factory pattern for clean provider resolution.
  • Database logging through Eloquent with isolated log failure behavior.
  • Consistent exception hierarchy across all providers.

Requirements

  • PHP 8.3+
  • Laravel 10 / 11 / 12
  • HTTP requests made through Laravel's Http facade (no extra client dependency)

Architecture

The package uses a compact, SOLID-friendly architecture:

  • Contracts: provider contract, model-listable capability, logger contract
  • DTOs: request/response/model data objects
  • Factory: provider resolution by key
  • Services: manager, builder, executor, log service
  • Providers: adapter class per AI provider
  • Repository: log persistence through Eloquent model
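
The factory piece can be pictured with a short sketch. This is illustrative only: the real ProviderFactory ships with the package and its constructor and return types may differ; the class and map below are stand-ins.

```php
<?php

// Illustrative sketch of provider resolution: a provider key maps to an
// adapter class, and unknown keys fail fast with a clear error.
final class ProviderFactorySketch
{
    /** @param array<string, class-string> $map provider key => adapter class */
    public function __construct(private array $map) {}

    public function make(string $key): object
    {
        if (!isset($this->map[$key])) {
            throw new InvalidArgumentException("Unknown AI provider: {$key}");
        }

        return new $this->map[$key]();
    }
}
```

Keeping the key-to-class map in one place is what makes adding a provider a single mapping change (see "Adding new providers" below).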

Installation

composer require fiachehr/laravel-ai-gateway
php artisan vendor:publish --tag=aigateway-config
php artisan vendor:publish --tag=aigateway-migrations
php artisan migrate

Configuration

Main file: config/aigateway.php

return [
    'default_provider' => env('AI_GATEWAY_DEFAULT_PROVIDER', 'openai'),
    'default_model' => env('AI_GATEWAY_DEFAULT_MODEL', 'gpt-5-mini'),
    'timeout' => (int) env('AI_GATEWAY_TIMEOUT', 60),
    'providers' => [
        'openai' => [
            'base_url' => env('OPENAI_BASE_URL', 'https://api.openai.com/v1'),
            'api_key' => env('OPENAI_API_KEY'),
        ],
        'gemini' => [
            'base_url' => env('GEMINI_BASE_URL', 'https://generativelanguage.googleapis.com/v1beta'),
            'api_key' => env('GEMINI_API_KEY'),
        ],
        // deepseek, claude, ollama...
    ],
];
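
The environment variables referenced by the config above can be set in .env (values below are placeholders):

```shell
AI_GATEWAY_DEFAULT_PROVIDER=openai
AI_GATEWAY_DEFAULT_MODEL=gpt-5-mini
AI_GATEWAY_TIMEOUT=60

OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_API_KEY=your-openai-key

GEMINI_API_KEY=your-gemini-key
```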

Usage

Fluent API

use Fiachehr\AiGateway\Facades\AI;

$response = AI::provider('openai')
    ->model('gpt-5-mini')
    ->systemPrompt('You are a concise technical assistant.')
    ->userPrompt('Write a short welcome message.')
    ->send();

$text = $response->responseText;

Legacy prompt() method

The older prompt() method still works alongside the fluent setters:

$response = AI::provider('gemini')
    ->model('gemini-2.5-flash')
    ->systemPrompt('Respond in Persian.')
    ->prompt('Summarize this text.')
    ->send();

Service API

use Fiachehr\AiGateway\Services\AIManager;

$response = app(AIManager::class)->send(
    provider: 'deepseek',
    model: 'deepseek-chat',
    prompt: 'Explain SOLID briefly.',
    systemPrompt: 'Keep it short and practical.'
);

Model listing

$models = AI::listModels('openai');

If a provider does not support model listing, the package throws a capability exception.
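
The capability check behind this can be sketched as follows. This is not the package's actual code: the real ModelListableContract lives in the package's Contracts directory and its signature may differ, and the exception class shown is a generic stand-in.

```php
<?php

// Stand-in interface illustrating the capability gate: model listing is only
// attempted when the provider implements the listing contract.
interface ModelListable
{
    public function listModels(): array;
}

function listModelsFor(object $provider): array
{
    if (!$provider instanceof ModelListable) {
        // The package throws its own capability exception at this point.
        throw new RuntimeException('Provider does not support model listing');
    }

    return $provider->listModels();
}
```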

Logging

Every request and response is logged to the ai_logs table, with the following columns:

  • provider, model, prompt
  • request_payload, response_payload, response_text
  • status, error_message, request_id
  • input_tokens, output_tokens, total_tokens
  • estimated_cost, latency_ms, metadata

Important: Logging errors are isolated and do not interrupt the main AI request flow.
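
That isolation guarantee can be sketched as below. The function name and callables are made up for the example; only the pattern (swallow log failures, return the response) reflects the documented behavior.

```php
<?php

// Sketch of isolated logging: a failure while persisting the log entry is
// caught and swallowed, so the caller still receives the AI response.
function sendWithIsolatedLogging(callable $doRequest, callable $writeLog): mixed
{
    $response = $doRequest();

    try {
        $writeLog($response);
    } catch (Throwable $e) {
        // In a Laravel app this would be report($e); the error is never rethrown.
    }

    return $response;
}
```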

Error handling

Use package exceptions for unified handling:

use Fiachehr\AiGateway\Exceptions\AIException;
use Fiachehr\AiGateway\Exceptions\ProviderRequestException;

try {
    $response = AI::provider('claude')
        ->model('claude-3-5-sonnet')
        ->prompt('Hello')
        ->send();
} catch (ProviderRequestException $e) {
    // Upstream API failure (HTTP error, malformed provider response).
    report($e);
} catch (AIException $e) {
    // Catch-all for the rest of the package's exception hierarchy.
    report($e);
}

Adding new providers

  1. Create a provider adapter implementing AIProviderContract.
  2. Optionally implement ModelListableContract if model listing is supported.
  3. Add mapping in ProviderFactory.
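
A new adapter might start like the sketch below. The interface is a stand-in for the package's real AIProviderContract (check the Contracts directory for the actual method names and signatures), and the provider name and payload shape are illustrative assumptions. In the real package, the adapter would also perform the HTTP call via Laravel's Http facade and map the result into the response DTO.

```php
<?php

// Assumed shape of the provider contract; the real interface may differ.
interface ProviderContractSketch
{
    public function buildPayload(string $model, string $prompt, ?string $systemPrompt): array;
}

// Hypothetical adapter for a new chat-style provider: it translates the
// gateway's prompt/systemPrompt inputs into the provider's message format.
final class ExampleProvider implements ProviderContractSketch
{
    public function buildPayload(string $model, string $prompt, ?string $systemPrompt): array
    {
        $messages = [];

        if ($systemPrompt !== null) {
            $messages[] = ['role' => 'system', 'content' => $systemPrompt];
        }

        $messages[] = ['role' => 'user', 'content' => $prompt];

        return ['model' => $model, 'messages' => $messages];
    }
}
```

Once the adapter exists, registering it is a one-line mapping in ProviderFactory (step 3 above).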

Testing

composer install
./vendor/bin/phpunit -c phpunit.xml

Security

  • Store all API keys in .env only.
  • Avoid sending highly sensitive raw data in prompts.
  • Use HTTPS endpoints and set sensible timeout values (see the timeout option in config/aigateway.php).