# Use Cara with VS Code, JetBrains, chat UIs, and scripts
## Outcome
Run Cara locally with OpenAI-style endpoints enabled, then connect:
- VS Code / JetBrains (via Continue)
- Open WebUI
- LibreChat
- Scripts and automations (`curl`)
## Prerequisites

- `cara` installed.
- `ANTHROPIC_API_KEY` or `OPENAI_API_KEY` set.
- Optional client app: Continue, Open WebUI, or LibreChat.
## 1) Create config
Export a gateway token:
```shell
export CARAPACE_GATEWAY_TOKEN="$(openssl rand -hex 32)"
```

Windows (PowerShell) alternative:

```powershell
$bytes = [byte[]]::new(32)
[System.Security.Cryptography.RandomNumberGenerator]::Fill($bytes)
$env:CARAPACE_GATEWAY_TOKEN = [System.BitConverter]::ToString($bytes).Replace('-', '').ToLower()
```

Save this as `carapace.json5`:

```json5
{
  "gateway": {
    "bind": "loopback",
    "port": 18789,
    "auth": {
      "mode": "token",
      "token": "${CARAPACE_GATEWAY_TOKEN}"
    },
    "openai": {
      "chatCompletions": true,
      "responses": true
    }
  },
  "anthropic": {
    "apiKey": "${ANTHROPIC_API_KEY}"
  }
}
```
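The `${...}` placeholders in the config above are assumed to be expanded from the environment when Cara loads the file; check Cara's documentation for its actual substitution rules. A minimal Python sketch of that kind of expansion, for intuition only:

```python
import os
import re

def expand_env_placeholders(text: str) -> str:
    """Replace ${VAR} placeholders with values from the environment.

    A rough sketch of the substitution this recipe assumes Cara performs
    when reading carapace.json5; unset variables become empty strings here.
    """
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), text)

# Example: the token line from the config above
os.environ["CARAPACE_GATEWAY_TOKEN"] = "abc123"
print(expand_env_placeholders('"token": "${CARAPACE_GATEWAY_TOKEN}"'))
# → "token": "abc123"
```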
## 2) Start Cara
```shell
CARAPACE_CONFIG_PATH=./carapace.json5 cara
```

## 3) Connect your app
Use these values in your app's model/provider settings:
- Provider type: OpenAI / OpenAI-compatible
- Base URL: `http://127.0.0.1:18789/v1`
- API key/token: same value as `CARAPACE_GATEWAY_TOKEN`
- Model: `carapace` (this is the fixed model name Cara exposes; the actual backend model is determined by your provider config)
### VS Code / JetBrains (Continue)
In Continue model/provider settings, point the model to the values above.
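One possible shape for a Continue `config.json` model entry using the values above. The field names here are illustrative assumptions; Continue's config schema varies by version, so check its documentation for the current format:

```json
{
  "models": [
    {
      "title": "Cara (local)",
      "provider": "openai",
      "model": "carapace",
      "apiBase": "http://127.0.0.1:18789/v1",
      "apiKey": "<your CARAPACE_GATEWAY_TOKEN value>"
    }
  ]
}
```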
### Open WebUI / LibreChat
Add a custom OpenAI-compatible connection and use the same values above. Field names vary by UI, but the required pieces are always: base URL, API key, and model.
## 4) Script smoke test (curl)
Chat Completions:
```shell
curl -sS \
  -H "Authorization: Bearer ${CARAPACE_GATEWAY_TOKEN}" \
  -H "Content-Type: application/json" \
  -X POST http://127.0.0.1:18789/v1/chat/completions \
  -d '{
    "model": "carapace",
    "messages": [
      {"role":"user","content":"Say hello in one sentence."}
    ]
  }'
```

Responses:
```shell
curl -sS \
  -H "Authorization: Bearer ${CARAPACE_GATEWAY_TOKEN}" \
  -H "Content-Type: application/json" \
  -X POST http://127.0.0.1:18789/v1/responses \
  -d '{
    "model": "carapace",
    "input": "List three secure defaults for a local AI assistant."
  }'
```

## 5) Verify
- Your client app returns assistant output through Cara.
- `curl` calls return `200` JSON responses with assistant output.
- Unauthorized calls return `401` (auth enforcement working).
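The curl smoke tests above can also be scripted from Python's standard library. A sketch that builds the Chat Completions request with the same values (the send is commented out so it can run without Cara up; uncomment it once Cara is running):

```python
import json
import os
import urllib.request

BASE_URL = "http://127.0.0.1:18789/v1"  # same base URL as above
TOKEN = os.environ.get("CARAPACE_GATEWAY_TOKEN", "example-token")

payload = {
    "model": "carapace",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# With Cara running, send the request and print the JSON reply:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```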
## Common failures and fixes
- Symptom: `404` on `/v1/chat/completions` or `/v1/responses`.
  - Fix: Ensure `gateway.openai.chatCompletions` / `gateway.openai.responses` are enabled.
- Symptom: app says endpoint not found.
  - Fix: Confirm base URL is `http://127.0.0.1:18789/v1` (include `/v1`).
- Symptom: `401 Unauthorized`.
  - Fix: Confirm bearer token matches `gateway.auth.token`.
- Symptom: `500` provider error.
  - Fix: Verify provider key/model config and check logs with `cara logs -n 200`.
- Symptom: app on another device cannot connect.
  - Fix: This recipe is local-only (`127.0.0.1`). If you need remote access, use a LAN/Tailnet setup with TLS and auth hardening.