Gemini Provider

An addon that integrates Google's Gemini AI models into AI Core.

Features

  • Google Gemini Integration - Full integration with Google's Gemini AI models via the Generative Language API
  • Model Selection - Choose from Gemini 3, 2.5, and 2.0 series models
  • Multi-Modal Support - Support for text, image, and document inputs
  • Streaming Responses - Real-time streaming for long-form content

Requirements

| Requirement  | Details                                      |
| ------------ | -------------------------------------------- |
| Dependencies | AI Core                                      |
| PHP Version  | 8.2+                                         |
| Google Cloud | Valid Google API key with Gemini API access  |

Installation

Enable via Admin Panel

  1. Log in as administrator
  2. Navigate to Settings > Addons
  3. Find Gemini Provider and click Enable

Enable via Command Line

```shell
php artisan module:enable GeminiAIProvider
php artisan migrate
```
Note: AI Core must be installed and enabled before enabling this module.

Configuration

After enabling the module, configure the Gemini provider through AI Core > Providers. The module registers Google Gemini as an available provider with its models.

  • API Key - Your Google API key with Gemini access
  • Default Model - Select default Gemini model for requests

Supported Models

Gemini 3 Series (Latest)

| Model               | Context   | Description                                      |
| ------------------- | --------- | ------------------------------------------------ |
| gemini-3-pro        | 1M tokens | Flagship model with frontier intelligence        |
| gemini-3-flash      | 1M tokens | Default model with fast performance and thinking |
| gemini-3-deep-think | 1M tokens | Advanced reasoning for complex problems          |

Gemini 2.5 Series (Stable)

| Model                 | Context   | Description                         |
| --------------------- | --------- | ----------------------------------- |
| gemini-2.5-pro        | 1M tokens | Superior reasoning with thinking mode |
| gemini-2.5-flash      | 1M tokens | Price-performance optimized         |
| gemini-2.5-flash-lite | 1M tokens | Most cost-efficient option          |

Gemini 2.0 Series

| Model            | Context   | Description           |
| ---------------- | --------- | --------------------- |
| gemini-2.0-flash | 1M tokens | Fast multimodal model |
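As a reference for how these model ids map onto the Generative Language API, here is a minimal sketch. The URL pattern (v1beta `generateContent` / `streamGenerateContent`) is Google's documented format; the helper function itself is hypothetical and not part of this module's API.

```php
<?php
// Hypothetical helper: builds the Generative Language API endpoint for a
// given model id. Pass $stream = true to target the streaming endpoint.
function geminiEndpoint(string $model, string $apiKey, bool $stream = false): string
{
    $method = $stream ? 'streamGenerateContent' : 'generateContent';

    return sprintf(
        'https://generativelanguage.googleapis.com/v1beta/models/%s:%s?key=%s',
        rawurlencode($model),
        $method,
        rawurlencode($apiKey)
    );
}

// Example: endpoint for gemini-2.5-flash
echo geminiEndpoint('gemini-2.5-flash', 'YOUR_API_KEY');
```

Any of the model ids in the tables above can be substituted for `gemini-2.5-flash`; the module selects the id based on your Default Model setting.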

Usage

This module is a provider addon for AI Core. Once enabled, it registers Google Gemini as an available AI provider and seeds its models automatically. There is no separate UI for this module; all management is done through AI Core.

Setup

  1. After enabling the module, navigate to AI Core > Providers
  2. Gemini will appear as an available provider
  3. Add or edit the Gemini provider entry and enter your Google API key
  4. Click Test Connection to verify your API key is valid
  5. Navigate to AI Core > Module Configuration to assign Gemini models to specific modules

How It Works

  • The module registers a GeminiProviderService that handles communication with the Google Generative Language API
  • Chat requests are translated from the standard message format to Gemini's contents format automatically
  • Token usage (prompt and completion tokens) is reported back to AI Core for usage tracking
  • Streaming responses are supported via the streamGenerateContent endpoint
  • The module can list available models from your Google API account

Notes

  • Requires a valid Google API key with Gemini API access
  • Usage is tracked through AI Core's usage monitoring
  • The module registers as a provider addon in AI Core and seeds its models automatically
  • Supports Gemini API features including function calling, code execution, and search grounding
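For context on the function-calling feature mentioned above, a request body that declares a callable tool looks roughly like this. The `get_weather` function is invented purely for illustration; the surrounding `tools`/`functionDeclarations` structure follows the Gemini API's documented format.

```php
<?php
// Hypothetical example: a Gemini request body declaring one callable
// function. The "get_weather" tool is made up for illustration.
$request = [
    'contents' => [
        ['role' => 'user', 'parts' => [['text' => 'What is the weather in Paris?']]],
    ],
    'tools' => [
        [
            'functionDeclarations' => [
                [
                    'name'        => 'get_weather',
                    'description' => 'Look up current weather for a city',
                    'parameters'  => [
                        'type'       => 'object',
                        'properties' => [
                            'city' => ['type' => 'string'],
                        ],
                        'required'   => ['city'],
                    ],
                ],
            ],
        ],
    ],
];

echo json_encode($request, JSON_PRETTY_PRINT);
```

When the model decides to call the declared function, its response contains a `functionCall` part instead of text; handling that round trip is managed by AI Core's request pipeline.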

Changelog: View version history