LLM Connections let you add multiple AI provider configurations and switch between them. Each session locks to a specific connection after the first message, and workspaces can define their own default connection.

Location

LLM connections are stored in:
~/.craft-agent/config.json
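A rough sketch of the file's shape, assuming connections live in an llmConnections array (that key name is illustrative only, since this page does not name it; check the actual file for the exact key). The global defaultLlmConnection points at a connection slug, and each connection object follows the schema described below:
{
  "defaultLlmConnection": "anthropic-api",
  "llmConnections": [
    {
      "slug": "anthropic-api",
      "name": "Anthropic (API Key)",
      "providerType": "anthropic",
      "authType": "api_key",
      "createdAt": 1737451800000
    }
  ]
}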

How Connections Are Used

Connections resolve in this order:
  1. Session connection (locked after first message)
  2. Workspace default connection (defaults.defaultLlmConnection)
  3. Global default connection (defaultLlmConnection)
  4. First connection in the list (fallback)
Each session locks to a connection after the first message. To change connections, start a new session.
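A workspace opts into a different default by setting defaults.defaultLlmConnection to a connection slug. A minimal sketch of that block, shown in isolation (where workspace settings are stored is not covered on this page):
{
  "defaults": {
    "defaultLlmConnection": "bedrock"
  }
}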

Connection Schema

{
  "slug": "anthropic-api",
  "name": "Anthropic (API Key)",
  "providerType": "anthropic",
  "authType": "api_key",
  "baseUrl": "https://api.anthropic.com",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}

Fields

| Field | Required | Description |
| --- | --- | --- |
| slug | Yes | URL-safe identifier (e.g., anthropic-api, codex) |
| name | Yes | Display name shown in the UI |
| providerType | Yes | Provider backend (see list below) |
| authType | Yes | Auth mechanism (see list below) |
| baseUrl | No | Custom base URL for compatible providers |
| models | No | Explicit model list. Accepts strings ("gpt-5.4") or objects with optional contextWindow and supportsImages overrides (see Custom Endpoint Capabilities). |
| customEndpoint | No | Custom endpoint protocol config. Use api to select the wire format and optional supportsImages to opt an entire endpoint into image input. |
| midStreamBehavior | No | How mid-stream user sends are handled ("steer" or "queue"). Defaults to "queue" for anthropic, "steer" for Pi-backed providers. See Mid-stream Behavior. |
| defaultModel | No | Default model for this connection |
| codexPath | No | Path to Codex binary (OpenAI/Codex only) |
| awsRegion | No | AWS region for Bedrock |
| gcpProjectId | No | GCP project for Vertex |
| gcpRegion | No | GCP region for Vertex |
| createdAt | Yes | Timestamp (ms) when created |
| lastUsedAt | No | Timestamp (ms) when last used |

providerType Values

| Value | Description |
| --- | --- |
| anthropic | Direct Anthropic API |
| anthropic_compat | Anthropic‑compatible endpoints (OpenRouter, Vercel AI Gateway, custom) |
| openai | OpenAI via Codex app‑server |
| openai_compat | OpenAI‑compatible endpoints |
| bedrock | AWS Bedrock |
| vertex | Google Vertex AI |

authType Values

| Value | Description |
| --- | --- |
| api_key | API key only |
| api_key_with_endpoint | API key + custom endpoint |
| oauth | OAuth login (Claude Max / Codex / OpenAI) |
| iam_credentials | AWS IAM credentials (Bedrock) |
| service_account_file | GCP service account JSON (Vertex) |
| environment | Uses environment variables |
| none | No auth required |

Examples

Anthropic (API Key)

{
  "slug": "anthropic-api",
  "name": "Anthropic (API Key)",
  "providerType": "anthropic",
  "authType": "api_key",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}

Claude Max (OAuth)

{
  "slug": "claude-max",
  "name": "Claude Max",
  "providerType": "anthropic",
  "authType": "oauth",
  "defaultModel": "claude-opus-4-7",
  "createdAt": 1737451800000
}

OpenRouter (Anthropic‑compatible)

{
  "slug": "openrouter",
  "name": "OpenRouter",
  "providerType": "anthropic_compat",
  "authType": "api_key_with_endpoint",
  "baseUrl": "https://openrouter.ai/api",
  "models": [
    "anthropic/claude-opus-4.7",
    "anthropic/claude-haiku-4.5"
  ],
  "defaultModel": "anthropic/claude-opus-4.7",
  "createdAt": 1737451800000
}

OpenRouter (OpenAI‑compatible)

{
  "slug": "openrouter-openai",
  "name": "OpenRouter (OpenAI‑compat)",
  "providerType": "openai_compat",
  "authType": "api_key_with_endpoint",
  "baseUrl": "https://openrouter.ai/api/v1",
  "models": [
    "openai/gpt-5.2-codex",
    "openai/gpt-5.1-codex-mini"
  ],
  "defaultModel": "openai/gpt-5.2-codex",
  "createdAt": 1737451800000
}

AWS Bedrock (IAM Credentials)

{
  "slug": "bedrock",
  "name": "AWS Bedrock",
  "providerType": "bedrock",
  "authType": "iam_credentials",
  "awsRegion": "us-east-1",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}
With iam_credentials, your AWS Access Key ID, Secret Access Key, and optional Session Token are stored securely and injected into the subprocess environment at runtime. Use this when you want to configure credentials directly in the UI.

AWS Bedrock (Environment)

{
  "slug": "bedrock",
  "name": "AWS Bedrock",
  "providerType": "bedrock",
  "authType": "environment",
  "awsRegion": "us-east-1",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}
With environment, the subprocess inherits your shell’s AWS credential chain — ~/.aws/credentials, AWS_PROFILE, IAM roles, SSO sessions, and environment variables all work. No credentials are stored in Craft Agents.
To set up Bedrock in the UI: Settings → AI → Add Connection → I use other provider → Amazon Bedrock. You can choose between IAM Credentials and Environment (AWS CLI) authentication.
Set awsRegion to the region where you have Bedrock model access enabled (e.g., us-east-1, us-west-2, eu-west-1).

Codex / OpenAI (OAuth)

{
  "slug": "codex",
  "name": "OpenAI (Codex)",
  "providerType": "openai",
  "authType": "oauth",
  "codexPath": "/Applications/Craft Agents.app/Contents/Resources/vendor/codex/darwin-arm64/codex",
  "defaultModel": "codex-mini-latest",
  "createdAt": 1737451800000
}
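
Google Vertex AI (Service Account)

The schema lists gcpProjectId and gcpRegion for Vertex but there is no Vertex example above, so this is a hedged sketch assembled from those fields. The project and region values are placeholders, and the service account JSON referenced by service_account_file is assumed to be provided during connection setup rather than stored in this object.
{
  "slug": "vertex",
  "name": "Google Vertex AI",
  "providerType": "vertex",
  "authType": "service_account_file",
  "gcpProjectId": "my-gcp-project",
  "gcpRegion": "us-east5",
  "defaultModel": "claude-sonnet-4-6",
  "createdAt": 1737451800000
}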

Custom Endpoint Capabilities

Custom endpoints default to a 128K context window and text-only input. If your model supports a larger context window or accepts image input, set those capabilities explicitly in the connection config.
To toggle per-model image support specifically, the chat input model picker exposes an inline image icon on each row of a custom-endpoint connection — one click flips models[i].supportsImages and persists the change. See API Providers → Image Input for Custom Endpoints. The JSON forms below remain useful for automation and for setting customEndpoint.supportsImages (the endpoint-wide default), which has no UI surface.
Use model objects when an endpoint hosts a mix of text-only and multimodal models:
{
  "slug": "ollama",
  "name": "Ollama",
  "providerType": "pi_compat",
  "authType": "none",
  "baseUrl": "http://localhost:11434/v1",
  "customEndpoint": { "api": "openai-completions" },
  "models": [
    { "id": "gemma4", "contextWindow": 262144, "supportsImages": true },
    { "id": "qwen3-coder", "contextWindow": 131072 }
  ],
  "defaultModel": "gemma4",
  "createdAt": 1737451800000
}

Whole-endpoint opt-in

If every model behind the endpoint is multimodal, you can opt in at the endpoint level:
{
  "customEndpoint": {
    "api": "openai-completions",
    "supportsImages": true
  }
}
Craft Agents does not auto-detect image support for arbitrary endpoints. Custom endpoints stay text-only unless you explicitly set supportsImages: true at the endpoint or model level. Plain string model entries continue to use the default 128K context window and text-only input.
For models like Gemma 4 served through Ollama, vLLM, or another OpenAI-compatible proxy, prefer the per-model form so only the vision-capable model opts into image input.

Mid-stream Behavior

When you send a message while the agent is still streaming a previous turn, Craft Agents needs to decide whether to deliver it into the in-flight turn or hold it for the next one. The midStreamBehavior field on the connection controls this.
| Value | Behavior |
| --- | --- |
| "steer" | Deliver the message into the in-flight turn (Pi's native .steer() or Claude's PreToolUse hook injection). |
| "queue" | Hold the message; let the current turn finish naturally; replay as a new turn afterwards. |

{
  "slug": "anthropic-api",
  "providerType": "anthropic",
  "authType": "api_key",
  "midStreamBehavior": "queue",
  "createdAt": 1737451800000
}

Defaults by backend

If midStreamBehavior is omitted (legacy connections, fresh setups), the default depends on the backend:
  • anthropic: "queue". Claude's emulated steer relies on the model invoking a tool before the turn ends. If no tool fires, the steer goes undelivered and Craft Agents falls back to a re-queue anyway — with the original turn's tokens already paid. Defaulting to queue avoids the wasted round-trip.
  • pi / pi_compat: "steer". Pi's native steer is non-destructive: your message is delivered after the current tool finishes, with the full conversation context preserved.
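An explicit midStreamBehavior on the connection takes precedence over these defaults. As a sketch, reusing the Ollama connection from the Custom Endpoint Capabilities example to force queue behavior on a Pi-backed endpoint:
{
  "slug": "ollama",
  "providerType": "pi_compat",
  "authType": "none",
  "baseUrl": "http://localhost:11434/v1",
  "customEndpoint": { "api": "openai-completions" },
  "midStreamBehavior": "queue",
  "createdAt": 1737451800000
}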

Changing it from the UI

Open Settings → AI, scroll to the Connections section, click the menu on the relevant connection (or right-click the row), and choose Mid-stream sends → Steer immediately or Queue until ready. The selection is saved on the connection and takes effect immediately for the next mid-stream send.
This is a per-connection setting, not per-session. Two sessions sharing the same connection will both use the same mode.
For the user-facing concept and UI walkthrough, see Interactions → Sending while the agent is responding.

Managing Connections

Connections are managed in Settings → AI:
  • Add/edit/delete connections
  • Set a global default connection
  • Validate connection status
  • Set per‑workspace defaults
If you only need a single provider, keep one connection and set it as default.