Gemini Provider
An addon for AI Core that integrates Google's Gemini AI models.
Features
- Google Gemini Integration - Full integration with Google's Gemini AI models via the Generative Language API
- Model Selection - Choose from Gemini 3, 2.5, and 2.0 series models
- Multi-Modal Support - Accepts text, image, and document inputs
- Streaming Responses - Real-time streaming for long-form content
Requirements
| Requirement | Details |
|---|---|
| Dependencies | AICore |
| PHP Version | 8.2+ |
| Google Cloud | Valid Google API key with Gemini API access |
Installation
Enable via Admin Panel
- Log in as administrator
- Navigate to Settings > Addons
- Find Gemini Provider and click Enable
Enable via Command Line
php artisan module:enable GeminiAIProvider
php artisan migrate
Note: AI Core must be installed and enabled before enabling this module.
Configuration
After enabling the module, configure the Gemini provider through AI Core > Providers. The module registers Google Gemini as an available provider with its models.
- API Key - Your Google API key with Gemini access
- Default Model - Select default Gemini model for requests
Supported Models
Gemini 3 Series (Latest)
| Model | Context | Description |
|---|---|---|
| gemini-3-pro | 1M tokens | Most intelligent model with frontier intelligence |
| gemini-3-flash | 1M tokens | Default model with fast performance and thinking |
| gemini-3-deep-think | 1M tokens | Advanced reasoning for complex problems |
Gemini 2.5 Series (Stable)
| Model | Context | Description |
|---|---|---|
| gemini-2.5-pro | 1M tokens | Superior reasoning with thinking mode |
| gemini-2.5-flash | 1M tokens | Price-performance optimized |
| gemini-2.5-flash-lite | 1M tokens | Most cost-efficient option |
Gemini 2.0 Series
| Model | Context | Description |
|---|---|---|
| gemini-2.0-flash | 1M tokens | Fast multimodal model |
Usage
This module is a provider addon for AI Core. Once enabled, it registers Google Gemini as an available AI provider and seeds its models automatically. There is no separate UI for this module; all management is done through AI Core.
Setup
- After enabling the module, navigate to AI Core > Providers
- Gemini will appear as an available provider
- Add or edit the Gemini provider entry and enter your Google API key
- Click Test Connection to verify your API key is valid
- Navigate to AI Core > Module Configuration to assign Gemini models to specific modules
How It Works
- The module registers a `GeminiProviderService` that handles communication with the Google Generative Language API
- Chat requests are translated from the standard message format to Gemini's `contents` format automatically
- Token usage (prompt and completion tokens) is reported back to AI Core for usage tracking
- Streaming responses are supported via the `streamGenerateContent` endpoint
- The module can list available models from your Google API account
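The message translation described above can be sketched as follows. This is an illustrative Python sketch of the mapping, not the module's PHP internals: Gemini uses the role "model" for assistant turns, wraps text in a "parts" list, and takes system prompts via a separate systemInstruction field.

```python
# Hedged sketch of converting standard {role, content} chat messages into
# Gemini's "contents" format. Function name is illustrative.

def to_gemini_contents(messages):
    """Map chat messages to Gemini contents plus an optional system instruction."""
    contents = []
    system_instruction = None
    for msg in messages:
        if msg["role"] == "system":
            # System prompts are carried separately, not in contents
            system_instruction = {"parts": [{"text": msg["content"]}]}
            continue
        role = "model" if msg["role"] == "assistant" else "user"
        contents.append({"role": role, "parts": [{"text": msg["content"]}]})
    return contents, system_instruction

messages = [
    {"role": "system", "content": "You are concise."},
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi!"},
]
contents, system = to_gemini_contents(messages)
```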
Notes
- Requires a valid Google API key with Gemini API access
- Usage is tracked through AI Core's usage monitoring
- The module registers as a provider addon in AI Core and seeds its models automatically
- Supports Gemini API features including function calling, code execution, and search grounding
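For the function-calling feature mentioned above, tools are declared in the request body sent to the generateContent endpoint. A minimal sketch of such a payload, using the REST field names of the Generative Language API (the weather tool below is a made-up example, not something the module ships):

```python
# Illustrative generateContent request body with a function declaration.
# Field names follow the Generative Language REST API; the tool is hypothetical.

def build_function_call_request(prompt, declarations):
    """Assemble a generateContent payload declaring callable tools."""
    return {
        "contents": [{"role": "user", "parts": [{"text": prompt}]}],
        "tools": [{"functionDeclarations": declarations}],
    }

weather_tool = {
    "name": "get_weather",  # hypothetical tool name
    "description": "Get current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

payload = build_function_call_request("Weather in Paris?", [weather_tool])
```

When the model decides to call a tool, the response contains a functionCall part instead of plain text, which the caller executes and feeds back in a follow-up turn.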
Changelog: View version history