AI Providers

Connect Fleet to Claude, OpenAI, Gemini, and other AI services.

Overview

Fleet connects to multiple AI providers. Bring your own API keys or run models locally.


Cloud Providers (API Key)

GLM Coding Plan (Recommended)

The GLM Coding Plan from z.ai is optimized for coding tools like Fleet, Claude Code, and Cline.

  • Setup: Subscribe at z.ai
  • Models: GLM-4.6, GLM-4.5, and other GLM models tuned for coding
  • Best for: Coding tools, AI-assisted development
  • Cost: Starting at $3/month

Anthropic API

Direct API access to Claude models.

  • Setup: Get API key from console.anthropic.com
  • Models: Claude Opus 4.5, Sonnet 4.5, Haiku 4.5
  • Best for: High-volume usage, enterprise integrations
  • Cost: Pay-per-token
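For reference, a Messages API call is a single JSON POST with a couple of required headers. A minimal sketch in Python (the key is a placeholder, and the model id shown is an assumption — check console.anthropic.com for the current ids):

```python
import json

# Placeholder key — generate a real one at console.anthropic.com.
API_KEY = "sk-ant-api03-..."

# Headers required by the Anthropic Messages API.
headers = {
    "x-api-key": API_KEY,
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
}

# Request body; the model id here is an assumed example.
body = {
    "model": "claude-sonnet-4-5",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize this diff in one sentence."}
    ],
}

payload = json.dumps(body)
```

Fleet sends requests of this shape on your behalf once the key is entered in Settings.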

OpenAI

Access to GPT models.

  • Setup: Get API key from platform.openai.com
  • Models: GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo
  • Best for: Fast responses, function calling
  • Cost: Pay-per-token

Google Gemini

Access to Google's Gemini models.

  • Setup: Get API key from aistudio.google.com
  • Models: Gemini Pro, Gemini Flash
  • Best for: Multimodal tasks, long context
  • Cost: Pay-per-token (generous free tier)

OpenRouter

Access to 100+ models from multiple providers through a single API.

  • Setup: Get API key from openrouter.ai
  • Models: Claude, GPT-4, Llama, Mistral, and many more
  • Best for: Experimenting with different models, comparing outputs
  • Cost: Pay-per-token (varies by model)
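OpenRouter speaks the OpenAI-compatible chat-completions protocol, with model ids namespaced as "provider/model". A sketch of the request shape (the key and model id are placeholder examples — see openrouter.ai/models for real ids):

```python
import json

# OpenRouter exposes an OpenAI-compatible endpoint.
BASE_URL = "https://openrouter.ai/api/v1"
API_KEY = "sk-or-..."  # placeholder

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# Swapping the "model" field is all it takes to compare providers.
payload = json.dumps({
    "model": "meta-llama/llama-3-70b-instruct",  # assumed example id
    "messages": [
        {"role": "user", "content": "Compare mergesort and quicksort."}
    ],
})

url = f"{BASE_URL}/chat/completions"
```

Because only the "model" string changes between providers, this is the easiest way to run the same prompt against several models.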

Local Providers (Free)

Run models locally on your machine. No API keys, no costs, complete privacy.

Note: Local providers are for advanced users. Running capable AI models locally requires significant system resources (16GB+ RAM, modern GPU recommended).

Ollama

Easy local model management with a simple CLI.

  • Setup: Install from ollama.ai, then ollama pull llama3
  • Models: Llama 3, Mistral, CodeLlama, Phi, and more
  • Best for: Privacy, offline use, no API costs
  • Requirements: Mac with 8GB+ RAM (16GB recommended)
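Once a model is pulled, Ollama also serves a local HTTP API on port 11434, which is what Fleet talks to. A sketch of a request to it (the prompt is arbitrary; uncomment the last lines once Ollama is running):

```python
import json
import urllib.request

# Ollama's local generate endpoint (default port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = json.dumps({
    "model": "llama3",   # any model you've pulled with `ollama pull`
    "prompt": "Write a one-line docstring for a merge sort function.",
    "stream": False,     # single JSON response instead of a stream
}).encode()

req = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

# Uncomment once Ollama is running locally:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Nothing leaves your machine: the request goes to localhost and the model runs entirely on your hardware.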

LM Studio

Desktop app for running local models with a friendly UI.

  • Setup: Install from lmstudio.ai
  • Models: Download models through the app
  • Best for: Easy local model management, model comparison
  • Requirements: Mac with 8GB+ RAM

Custom OpenAI-Compatible

Connect to any OpenAI-compatible API endpoint.

  • Setup: Configure base URL and API key in Settings
  • Use for: Self-hosted models, enterprise deployments, other compatible services
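Because many local and hosted services speak the same chat-completions protocol, switching providers is usually just a base-URL change. A sketch, using LM Studio's usual local default as the example (the model id and internal gateway URL in the comments are hypothetical placeholders):

```python
import json
import urllib.request

# Any OpenAI-compatible server accepts the same request shape; only the
# base URL (and key, if the server requires one) changes. Examples:
#   LM Studio local server: http://localhost:1234/v1        (its usual default)
#   Self-hosted gateway:    https://llm.internal.example/v1 (hypothetical)
BASE_URL = "http://localhost:1234/v1"
API_KEY = "not-needed-locally"  # many local servers ignore the key

payload = json.dumps({
    "model": "local-model",  # placeholder; ask the server's /v1/models for real ids
    "messages": [{"role": "user", "content": "Hello!"}],
}).encode()

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=payload,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Uncomment once a compatible server is running at BASE_URL:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

In Fleet, the same two values — base URL and API key — are all you enter in Settings for this provider type.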

Setting Up a Provider

  1. Open Settings (Cmd + ,)
  2. Go to the General tab
  3. Click the provider you want to configure
  4. Enter your API key
  5. Toggle Enable

Choosing a Provider

Use Case             Recommended Provider
Getting started      GLM Coding Plan
Coding tools         GLM Coding Plan
Claude models        Anthropic API or OpenRouter
OpenAI models        OpenRouter or OpenAI API
Heavy usage          OpenRouter
Privacy/offline      Ollama or LM Studio
Experimenting        OpenRouter (access to many models)
Budget-conscious     GLM Coding Plan ($3/month) or Gemini (free tier)