Settings

Validation warnings (settings were saved)

General

is not installed.

This backend requires an external tool:

Documentation

Also requires an external CLI:

API Key Required

Uses Claude Code CLI with Anthropic API. No API key needed if logged into Claude Code.
Runs via Ollama's Anthropic Messages API compatibility (Claude Code v0.14.0+).
Use any model on OpenRouter with full CLI tool access via their Anthropic-compatible API.
Connect to any Anthropic-compatible endpoint (LiteLLM, vLLM, etc.).
Route through LiteLLM proxy for HuggingFace, vLLM, Together AI, Groq, and 100+ providers.
Run ollama list to see available models.
Get your key at openrouter.ai/keys
Browse models at openrouter.ai/models
Use provider/model format (e.g. huggingface/meta-llama/Llama-3-70b)
Set to match your model's limit (e.g. 8192 for DeepSeek). 0 = provider default.
Leave empty to let Claude Code auto-select the best model
Max agent tool-use loops per query (default: 25)
Uses Ollama's OpenAI-compatible API. No API key needed.
Use any model on OpenRouter. Free and paid models available.
Connect to any OpenAI-compatible endpoint (NVIDIA NIM, vLLM, etc.).
Route through LiteLLM proxy for HuggingFace, vLLM, Together AI, Groq, and 100+ providers.
Get your key at openrouter.ai/keys
Browse models at openrouter.ai/models
Use provider/model format (e.g. huggingface/meta-llama/Llama-3-70b)
Set to match your model's limit (e.g. 8192 for DeepSeek). 0 = provider default.

Requires Google ADK. Install with: pip install 'pocketpaw[google-adk]'

Route through LiteLLM proxy. Use provider/model format (e.g. openai/gpt-4o).
Use provider/model format (e.g. huggingface/meta-llama/Llama-3-70b)
Set to match your model's limit (e.g. 8192 for DeepSeek). 0 = provider default.

Requires Codex CLI installed. Install with: npm install -g @openai/codex

Requires the copilot CLI. Install the SDK with: pip install github-copilot-sdk

Requires pip install deepagents. Uses LangChain's init_chat_model for multi-provider support.

Format: provider:model. Examples: anthropic:claude-sonnet-4-6, openai:gpt-4o, ollama:llama3.2, google_genai:gemini-2.0-flash, litellm:anthropic/claude-sonnet-4-6
Use provider/model format after litellm: (e.g. litellm:huggingface/meta-llama/Llama-3-70b)
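To make the two prefix conventions above concrete, here is a hypothetical helper (not part of pocketpaw) showing how a provider:model string splits; for the litellm: prefix, the model part itself uses provider/model form:

```python
def parse_model_spec(spec: str) -> tuple[str, str]:
    """Split 'provider:model' at the first colon only, since the
    model part may contain slashes (e.g. 'huggingface/meta-llama/...')."""
    provider, sep, model = spec.partition(":")
    if not sep:
        raise ValueError(f"expected provider:model, got {spec!r}")
    return provider, model

print(parse_model_spec("anthropic:claude-sonnet-4-6"))
# → ('anthropic', 'claude-sonnet-4-6')
print(parse_model_spec("litellm:huggingface/meta-llama/Llama-3-70b"))
# → ('litellm', 'huggingface/meta-llama/Llama-3-70b')
```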

Requires a running OpenCode server. Start it with: opencode --server

API Keys

Keys are encrypted at rest. Only keys needed for your current backend and provider are shown.

Required for Current Backend

Required for Claude SDK backend — get a key at console.anthropic.com
Used for Gemini LLM and image generation. Free at AI Studio.


Tools & Services

Search

Used by both web search (parallel provider) and URL extract

Voice & Media

Indian language TTS (Bulbul), STT (Saaras), OCR (Vision), Translation (Mayura)

Google OAuth


Spotify


Behavior & Safety

Comma-separated list of tools that require approval

Safety

Action:

Memory

LLM used for fact extraction and memory consolidation

Search & Services


Voice

System

Web Server

Set host to 0.0.0.0 to allow access from other devices on your network

Click Restart Server after changing the host or port to apply the changes
Cron expression (default: 3 AM daily)
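A minimal illustration, assuming the field uses standard five-field cron syntax; the "3 AM daily" default would then be written as:

```
0 3 * * *
│ │ │ │ └── day of week (any)
│ │ │ └──── month (any)
│ │ └────── day of month (any)
│ └──────── hour (3)
└────────── minute (0)
```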
Security Audit

Run 7 security checks on your configuration

/ checks passed
Self-Audit Reports

Health and security check history

No reports yet. Click "Run Now" or enable the daemon.
Report: / passed
Name advertised in the Agent Card to other agents
Maximum time to wait for an agent response before failing the task
One URL per line. Only these agents can be used for outbound delegation (SSRF protection)
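As an illustration of how such an allow-list guards outbound delegation, here is a generic sketch (not pocketpaw's actual implementation; the URLs are placeholders):

```python
from urllib.parse import urlparse

# Placeholder allow-list; in the app this would come from the
# one-URL-per-line setting above.
ALLOWED_AGENTS = {
    "https://agent-a.example.com",
    "https://agent-b.example.com",
}

def is_allowed(url: str) -> bool:
    # Compare only scheme://host, so lookalike hosts such as
    # agent-a.example.com.evil.net are rejected.
    parsed = urlparse(url)
    base = f"{parsed.scheme}://{parsed.netloc}"
    return base in ALLOWED_AGENTS

print(is_allowed("https://agent-a.example.com/task"))  # True
print(is_allowed("https://evil.example.net/task"))     # False
```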

Soul Protocol

Persistent AI identity with psychology-informed memory, OCEAN personality, and emotional state.

e.g. "The Coding Expert", "The Compassionate Creator"
How often to auto-save soul state, in seconds (0 = disabled; default 300 = 5 minutes)
Import a .soul, .yaml, or .json file. Export saves the current soul to disk.