Providers Hub
Outcome
Pick a first provider with the shortest path to a verified useful outcome.
Recommended first choices
- Fastest cloud start: Anthropic or OpenAI
  - One API key, common setup path, best fit if your immediate goal is "get to first reply quickly."
- Subscription-login path: Codex
  - Best fit if you want OpenAI subscription-backed usage separated cleanly from API-key OpenAI.
- Fastest local-only start: Ollama
  - Best fit if your immediate goal is "keep everything local and verify the basic loop first."
- Existing cloud standardization: Gemini or Bedrock
  - Good if your environment is already centered on Google Cloud or AWS.
- OpenAI-compatible alternative path: Venice
  - Good if you specifically want Venice's endpoint and API shape.
If you are undecided, do not optimize for the perfect long-term setup yet. Optimize for the shortest verified first run, then change providers later if needed.
Start with one provider only
Run `cara setup --provider <provider>` when you already know which provider you want, or plain `cara setup` if you want the wizard to ask. If you are unsure, choose `local-chat` as the first outcome and add channels only after `cara verify --outcome auto` passes. In headless or scripted environments, pass `--provider`; non-interactive `cara setup` now errors instead of writing a providerless config.
Anthropic / OpenAI API key (fastest cloud path)
Pick one of these, not both:
```shell
export ANTHROPIC_API_KEY='...'
cara setup --provider anthropic
```

Or:

```shell
export OPENAI_API_KEY='...'
cara setup --provider openai
```

Codex (OpenAI subscription login)
```shell
export CARAPACE_CONFIG_PASSWORD='...'
export OPENAI_OAUTH_CLIENT_ID='...'
export OPENAI_OAUTH_CLIENT_SECRET='...'
cara setup --provider codex
```

Notes:
- Codex is separate from API-key `openai`.
- Codex sign-in is interactive-only in the CLI because it completes through a loopback callback on a local port.
- Control UI also supports Codex sign-in.
- The resulting config uses `codex.authProfile` and defaults the agent model to `codex:default`. `CARAPACE_CONFIG_PASSWORD` is required so the stored auth profile is encrypted at rest.
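As a rough sketch of where those two values land in the config (the exact layout of the generated file may differ; only `codex.authProfile` and the `codex:default` model are confirmed above, and the profile name here is a placeholder):

```json5
{
  agent: {
    model: "codex:default",
  },
  codex: {
    // Name of the encrypted auth profile written during setup;
    // the real value is generated by the wizard.
    authProfile: "...",
  },
}
```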
Ollama (fastest fully local path)
```shell
export OLLAMA_BASE_URL='http://127.0.0.1:11434'
cara setup --provider ollama
```

If your Ollama endpoint requires auth, the wizard will also offer an optional API key prompt and can write `providers.ollama.apiKey` from either direct input or `${OLLAMA_API_KEY}`.
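If the wizard does write an API key, the stored entry might look roughly like this (a sketch only; any surrounding keys beyond `providers.ollama.apiKey` are assumptions):

```json5
{
  providers: {
    ollama: {
      // Written from direct input, or referenced from the
      // OLLAMA_API_KEY environment variable as shown here.
      apiKey: "${OLLAMA_API_KEY}",
    },
  },
}
```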
Gemini / Bedrock / Venice
These providers are supported directly by the setup wizard now. If multiple provider env vars are already set, prefer the explicit provider flag so setup does not rely on the interactive default.
```shell
export GOOGLE_API_KEY='...'
cara setup --provider gemini --auth-mode api-key
```

Gemini also supports Google sign-in:

```shell
export GOOGLE_OAUTH_CLIENT_ID='...'
export GOOGLE_OAUTH_CLIENT_SECRET='...'
cara setup --provider gemini --auth-mode oauth
```

Notes:
- Gemini OAuth is interactive-only in the CLI because it completes through a loopback callback on a local port.
- Control UI also supports Gemini onboarding with either Google sign-in or API key.
- Gemini onboarding stores the Google OAuth client secret with the auth profile; it is not written into `config.json5`.
- Gemini Google sign-in requires `CARAPACE_CONFIG_PASSWORD` so the stored auth profile is encrypted at rest.
```shell
export AWS_REGION='us-east-1'
export AWS_ACCESS_KEY_ID='...'
export AWS_SECRET_ACCESS_KEY='...'
cara setup --provider bedrock
```

```shell
export VENICE_API_KEY='...'
cara setup --provider venice
```

If `GOOGLE_API_KEY` is only for other Google APIs and not for Gemini, unset it before running `cara setup`. If you need to override the default Gemini or Venice endpoint, the wizard will offer an optional base URL override.
Supported env vars:
- `ANTHROPIC_API_KEY`
- `OPENAI_API_KEY`
- `OPENAI_OAUTH_CLIENT_ID` / `OPENAI_OAUTH_CLIENT_SECRET` (Codex OpenAI sign-in)
- `GOOGLE_API_KEY`
- `GOOGLE_API_BASE_URL` (Gemini override)
- `GOOGLE_OAUTH_CLIENT_ID` / `GOOGLE_OAUTH_CLIENT_SECRET` (Gemini Google sign-in)
- `OLLAMA_API_KEY` (optional Ollama auth)
- `OLLAMA_BASE_URL` (if non-default)
- `AWS_REGION` or `AWS_DEFAULT_REGION`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` (Bedrock)
- `AWS_SESSION_TOKEN` (optional Bedrock session token)
- `VENICE_API_KEY`
- `VENICE_BASE_URL` (Venice override)
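When several of these are exported at once, a quick shell check (using a subset of the variable names from the list above; trim or extend it as needed) shows which ones are set, so you can unset the extras before running `cara setup`:

```shell
# Print which provider-related env vars are currently set in this shell.
check_provider_env() {
  for v in ANTHROPIC_API_KEY OPENAI_API_KEY GOOGLE_API_KEY \
           OLLAMA_BASE_URL AWS_ACCESS_KEY_ID VENICE_API_KEY; do
    # printenv exits nonzero for unset vars; only report set ones.
    [ -n "$(printenv "$v")" ] && echo "set: $v"
  done
  return 0
}
check_provider_env
```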
Provider Routing
Carapace automatically routes your requests to the correct AI provider based on the model string configured in your agent (see `agent.model`).
- Explicit Provider Prefixes: You can force routing to a specific provider by using the explicit forms that provider currently supports. Examples: `vertex:gemini-2.5-pro` and `vertex/gemini-2.5-flash`; `bedrock:anthropic.claude-3-sonnet` and `bedrock/anthropic.claude-3-sonnet`; `ollama:llama3` and `ollama/llama3`; `codex:gpt-5.4` and `codex/gpt-5.4`; `venice:llama-3.3-70b`; plus Gemini-specific forms like `gemini/gemini-2.5-flash` and `models/gemini-2.5-flash`. OpenAI and Anthropic currently do not have explicit provider prefixes; they route via bare model-name matching and fallback behavior.
- Implicit Fallbacks: If you don't use a prefix, Carapace maps the bare model name:
  - `gemini-*` routes to Gemini (or Vertex AI if Gemini is not configured).
  - `gpt-*`, `o1`, `o1-*`, `o3`, `o3-*`, and `chatgpt-*` route to OpenAI.
  - `anthropic.claude-*`, `amazon.titan-*`, and `meta.llama*` route to Bedrock.
  - `claude-*` and all other unrecognized models default to Anthropic.
  - If the exact system default model (`claude-sonnet-4-20250514`) is requested and Anthropic is not configured, it falls back to Vertex AI (`vertex:default`).
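The implicit fallback rules above can be sketched as a shell `case` statement. This is illustrative only; the real routing is implemented inside Carapace, and the `claude-sonnet-4-20250514` Vertex AI special case is omitted here:

```shell
# Sketch of the bare-model-name fallback rules described above.
route_model() {
  case "$1" in
    gemini-*) echo "gemini (or vertex if Gemini is not configured)" ;;
    gpt-*|o1|o1-*|o3|o3-*|chatgpt-*) echo "openai" ;;
    anthropic.claude-*|amazon.titan-*|meta.llama*) echo "bedrock" ;;
    # claude-* and anything unrecognized defaults to Anthropic.
    *) echo "anthropic" ;;
  esac
}
route_model 'o3-mini'          # prints "openai"
route_model 'meta.llama3-70b'  # prints "bedrock"
```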
Here is an example `carapace.json5` snippet locking agents onto specific providers using prefixes:

```json5
{
  "agents": {
    "list": [
      {
        "id": "researcher",
        "model": "vertex:gemini-2.5-flash",
        "system": "You are a specialized research assistant."
      },
      {
        "id": "local-coder",
        "model": "ollama:qwen2.5-coder",
        "system": "You are a local coding assistant."
      }
    ]
  }
}
```
Common first-run mistakes
- Multiple provider env vars are set, but you are not sure which one the setup path should use.
- The API key is exported in a different shell than the one running `cara setup` or `cara`.
- You start with a remote channel before local chat and provider verification work.
- You try to solve network exposure, channels, and provider choice all at once.

When in doubt:

- choose one provider
- choose `local-chat`
- run `cara setup --provider <provider>`
- start `cara`
- run `cara verify --outcome auto --port 18789`
Capability matrix
Use the full support matrix for channels/providers/platforms: