Providers Hub

Outcome

Pick a first provider with the shortest path to a verified useful outcome.

If you are undecided, do not optimize for the perfect long-term setup yet. Optimize for the shortest verified first run, then change providers later if needed.

Start with one provider only

Run cara setup --provider <provider> when you already know which provider you want, or plain cara setup if you want the wizard to ask. If you are unsure, choose local-chat as the first outcome and add channels only after cara verify --outcome auto passes. In headless or scripted environments, pass --provider; non-interactive cara setup now errors instead of writing a providerless config.
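For scripted runs, a small guard can surface that failure mode with a clearer message before setup is invoked. This is an illustrative sketch, not part of the cara CLI; the require_provider helper is an assumption of this page:

```shell
# Illustrative helper (not part of cara): fail fast in headless scripts
# when no provider was chosen, mirroring the non-interactive setup error.
require_provider() {
  if [ -z "$1" ]; then
    echo "error: no provider chosen; pass one explicitly, e.g. --provider anthropic" >&2
    return 1
  fi
  return 0
}

# Usage in a script:
#   require_provider "$PROVIDER" && cara setup --provider "$PROVIDER"
```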

Anthropic / OpenAI API key (fastest cloud path)

Pick one of these, not both:

export ANTHROPIC_API_KEY='...'
cara setup --provider anthropic

Or:

export OPENAI_API_KEY='...'
cara setup --provider openai

Anthropic also supports a setup-token-backed auth profile:

export CARAPACE_CONFIG_PASSWORD='...'
export ANTHROPIC_SETUP_TOKEN='...'
cara setup --provider anthropic --auth-mode setup-token


Codex (OpenAI subscription login)

export CARAPACE_CONFIG_PASSWORD='...'
export OPENAI_OAUTH_CLIENT_ID='...'
export OPENAI_OAUTH_CLIENT_SECRET='...'
cara setup --provider codex


Ollama (fastest fully local path)

export OLLAMA_BASE_URL='http://127.0.0.1:11434'
cara setup --provider ollama

If your Ollama endpoint requires auth, the wizard will also offer an optional API key prompt and can write providers.ollama.apiKey from either direct input or ${OLLAMA_API_KEY}.
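Before pointing setup at a local endpoint, a quick reachability probe can save a failed wizard run. This sketch assumes curl is available; /api/tags is Ollama's model-listing endpoint:

```shell
# Probe an Ollama endpoint (sketch, assumes curl is installed): returns 0
# when the server answers on its model-listing endpoint, non-zero otherwise.
ollama_reachable() {
  curl -fsS --max-time 3 "${1:-http://127.0.0.1:11434}/api/tags" >/dev/null 2>&1
}

# Usage:
#   ollama_reachable "$OLLAMA_BASE_URL" || echo "Ollama is not reachable" >&2
```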

Vertex AI

Vertex AI supports Google Gemini models and third-party models from Anthropic, Meta, Mistral, and Nvidia. Authentication uses gcloud CLI credentials or the GCE metadata server.

Prerequisite: authenticate with gcloud auth application-default login so Carapace can obtain access tokens.

export VERTEX_PROJECT_ID='my-gcp-project'
export VERTEX_LOCATION='us-central1'   # optional, defaults to us-central1
cara setup --provider vertex

Gemini models use the short form in agent config:

// agents.defaults.model or agents.list[].model
{ "model": "vertex:gemini-2.5-flash" }

Third-party models use the full publisher path from the Vertex AI Model Garden. You must enable the model's API in your GCP project first.

// agents.defaults.model or agents.list[].model
vertex:publishers/anthropic/models/claude-sonnet-4-20250514
vertex:publishers/meta/models/llama-3.1-405b-instruct-maas
vertex:publishers/mistral/models/mistral-large-2411
vertex:publishers/nvidia/models/llama-3.1-nemotron-70b-instruct
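An agent entry pinned to one of these publisher-path models uses the same shape as the short-form Gemini example; the id and system prompt below are illustrative:

```json5
// agents.list[] entry using a full publisher path
{
  "id": "sonnet-agent",
  "model": "vertex:publishers/anthropic/models/claude-sonnet-4-20250514",
  "system": "You are a careful analyst."
}
```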

Gemini / Bedrock / Venice

The setup wizard now supports these providers directly. If multiple provider env vars are already set, pass an explicit --provider flag so setup does not fall back to the interactive default.

export GOOGLE_API_KEY='...'
cara setup --provider gemini --auth-mode api-key

Gemini also supports Google sign-in:

export GOOGLE_OAUTH_CLIENT_ID='...'
export GOOGLE_OAUTH_CLIENT_SECRET='...'
cara setup --provider gemini --auth-mode oauth


export AWS_REGION='us-east-1'
export AWS_ACCESS_KEY_ID='...'
export AWS_SECRET_ACCESS_KEY='...'
cara setup --provider bedrock

Or, for Venice:

export VENICE_API_KEY='...'
cara setup --provider venice

If GOOGLE_API_KEY is only for other Google APIs and not for Gemini, unset it before running cara setup. If you need to override the default Gemini or Venice endpoint, the wizard will offer an optional base URL override.
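To see which of these credentials are already set before running setup, a small audit loop helps. This is a sketch; the variable list simply mirrors the examples on this page:

```shell
# Print which provider-related env vars are currently set (values are
# never shown). Helps predict which provider the wizard would default to.
list_provider_vars() {
  for v in ANTHROPIC_API_KEY OPENAI_API_KEY GOOGLE_API_KEY \
           OLLAMA_BASE_URL OLLAMA_API_KEY VERTEX_PROJECT_ID \
           AWS_ACCESS_KEY_ID VENICE_API_KEY; do
    [ -n "$(printenv "$v")" ] && echo "$v"
  done
  return 0
}
```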


Provider Routing

Carapace automatically routes your requests to the correct AI provider based on the model string configured in your agent (see agent.model).

Here is an example carapace.json5 snippet locking agents onto specific providers using prefixes:

{
  "agents": {
    "list": [
      {
        "id": "researcher",
        "model": "vertex:gemini-2.5-flash",
        "system": "You are a specialized research assistant."
      },
      {
        "id": "local-coder",
        "model": "ollama:qwen2.5-coder",
        "system": "You are a local coding assistant."
      }
    ]
  }
}
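The routing convention above can be sketched in shell terms: everything before the first colon names the provider, and the remainder is the provider-side model id. This is an illustration of the prefix convention, not cara's actual parser:

```shell
# Split a model string on its first ":" (illustrative, not cara's code).
model_provider() { printf '%s\n' "${1%%:*}"; }
model_name()     { printf '%s\n' "${1#*:}"; }

# model_provider "vertex:gemini-2.5-flash"  -> vertex
# model_name "ollama:qwen2.5-coder"         -> qwen2.5-coder
```

Note that publisher-path models keep their full path after the prefix, e.g. model_name on a vertex:publishers/... string yields the whole publishers/... path.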

Common first-run mistakes

When in doubt:

  1. choose one provider
  2. choose local-chat
  3. run cara setup --provider <provider>
  4. start cara
  5. run cara verify --outcome auto --port 18789
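Step 5 assumes nothing else is already listening on port 18789. A bash-only probe (sketch, relies on bash's /dev/tcp redirection) can confirm that before starting:

```shell
# Check whether a local TCP port is free (bash sketch using /dev/tcp).
# A refused connection means nothing is listening there.
port_free() {
  ! (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

# Usage:
#   port_free 18789 || echo "port 18789 is busy; pick another --port" >&2
```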

Capability matrix

Use the full support matrix for channels, providers, and platforms.

Need help choosing?

Next paths