
Configuration

Configure AI providers, integrations, security modes, notifications, and system settings. Two paths — environment variables for initial setup, Settings UI for everything after.

Talome has two configuration paths that serve different purposes. Understanding when to use each one saves time and avoids confusion.

Environment variables (.env file or Docker Compose environment block) — used for initial bootstrap. These are read once at container startup. Change them by editing the file and restarting the container.

Settings UI (dashboard sidebar > Settings) — used for everything after first boot. These are persisted in SQLite and take effect immediately, no restart needed. The AI can also modify settings through the set_setting tool.

Most users set 2-3 environment variables at install time, then manage everything else through the Settings UI.
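A minimal Docker Compose bootstrap might look like this (values are illustrative; only TALOME_SECRET is required, and the API key can also be entered later in the Settings UI):

```yaml
services:
  talome:
    # ...image, ports, volumes...
    environment:
      - TALOME_SECRET=<64-char hex string>   # required
      - ANTHROPIC_API_KEY=sk-ant-...         # optional; can also be set in the UI
      - DEFAULT_LLM_PROVIDER=anthropic       # optional; anthropic is the default
```

Everything else — integrations, notifications, security mode — can wait for the Settings UI.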


AI Provider Setup

The AI provider is the most important configuration choice. It determines the quality, speed, cost, and privacy characteristics of every conversation.

Anthropic's Claude is Talome's default provider. All 230+ tools are tested primarily against Claude models.

Setup:

  1. Create an account at console.anthropic.com
  2. Generate an API key (starts with sk-ant-)
  3. In Talome, go to Settings > AI Provider and paste the key
  4. Select your default model

Model recommendations:

| Model | Best for | Speed | Cost |
| --- | --- | --- | --- |
| Claude Haiku | Routine tasks (install, status, list) | Fast (~1-2s) | ~$0.01/conversation |
| Claude Sonnet | Complex reasoning (diagnostics, multi-step) | Medium (~3-5s) | ~$0.03/conversation |

A typical session with 5-10 messages costs $0.01-0.05. Even heavy daily use rarely exceeds $1/month.

Privacy: Your messages and tool results are sent to Anthropic's API for processing. No data is stored by Anthropic after processing (per their API terms). Talome stores conversations locally in SQLite.

OpenAI

Setup:

  1. Create an API key at platform.openai.com
  2. Paste it into Settings > AI Provider > OpenAI API Key
  3. Set DEFAULT_LLM_PROVIDER to openai in your environment, or select OpenAI in Settings

GPT-5.3 is the recommended model for a good balance of capability and cost. GPT-5.4 is available for heavier reasoning tasks. All Talome tools are compatible.

The codebase default for OpenAI is gpt-4o-mini, which was retired. After selecting OpenAI as your provider, go to Settings > AI Provider and set the model to gpt-5.3 or another current model.

Ollama (Local, Free, Private)

For fully local AI with zero cloud dependency:

Setup:

  1. Install Ollama: curl -fsSL https://ollama.ai/install.sh | sh
  2. Pull a model: ollama pull llama3.1 (8B parameters, ~4.7GB)
  3. In Talome Settings, set Ollama URL to http://host.docker.internal:11434
  4. Select Ollama as the default provider

Important: Use host.docker.internal (not localhost) because Talome runs inside a Docker container and needs to reach Ollama on the host machine.
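On Linux, host.docker.internal is not defined automatically. If the hostname does not resolve from inside the Talome container, map it to the host gateway in your Compose file (supported on Docker 20.10+):

```yaml
services:
  talome:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```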

Model recommendations:

| Model | RAM Required | Quality |
| --- | --- | --- |
| llama3.1:8b | 8 GB | Good for basic tasks |
| llama3.1:70b | 48 GB | Comparable to cloud providers |
| mistral:7b | 8 GB | Fast, good for simple tasks |

Local models handle single-tool tasks well (install an app, check status, read logs). Multi-step operations that require chaining 5+ tools may produce inconsistent results compared to Claude or GPT-5.3. Start with Anthropic if you want the best experience, then try Ollama once you are familiar with the system.

Cost Comparison

| Provider | Monthly cost (moderate use) | Privacy | Quality |
| --- | --- | --- | --- |
| Anthropic | $1-5 | API-processed, not stored | Best |
| OpenAI | $1-5 | API-processed, not stored | Very good |
| Ollama | $0 (electricity only) | Fully local | Good (depends on model) |

Security Modes

Talome classifies every tool into one of three tiers based on its impact:

| Tier | What it means | Examples |
| --- | --- | --- |
| Read | Only reads data, no side effects | list_containers, get_system_stats, recall |
| Modify | Creates or changes resources | install_app, start_container, wire_apps, remember |
| Destructive | Deletes or irreversibly changes resources | uninstall_app, prune_resources, apply_change |

The security mode determines how the gateway handles each tier:

Permissive

All tools execute freely. No confirmations, no restrictions. The AI has full access.

Best for: Power users who trust the AI and want maximum speed. Every action happens immediately.

Cautious (Default)

Read and modify tools execute normally. Destructive tools require the AI to explicitly confirm by calling the tool a second time with confirmed: true.

In practice, this means:

  • Installing an app? Happens immediately.
  • Uninstalling an app? The AI tells you what will happen and waits for your approval.
  • Deleting Docker volumes? Blocked until you say yes.

Best for: Most users. You get the speed of AI-first management with a safety net for irreversible actions.
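Under the hood, the confirmation round-trip looks roughly like this (the payload shape shown is illustrative, not Talome's exact wire format):

```
1st call: {"tool": "uninstall_app", "arguments": {"app": "jellyfin"}}                     -> blocked, confirmation requested
2nd call: {"tool": "uninstall_app", "arguments": {"app": "jellyfin", "confirmed": true}}  -> executes
```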

Locked

Only read-tier tools work. All modify and destructive tools return an error. The AI can look at everything but change nothing.

Best for: Shared environments where you want monitoring without risk, or when you want to explore Talome without accidentally modifying anything.

Changing the mode:

Go to Settings > Security and select the mode. Or ask the AI:

"Switch to permissive mode"

The AI calls set_setting with security_mode: "permissive". The change takes effect on the next tool call.


Custom System Prompt

You can append custom instructions to the AI's system prompt. This lets you shape its behavior without modifying source code.

Settings > AI Provider > Custom System Prompt

Examples:

  • Always respond in Spanish.
  • When installing apps, always set timezone to Europe/London.
  • Never use ports in the 9000 range — they conflict with my other services.
  • You are managing a home lab for a family of four. Prioritize stability over cutting-edge features. Always suggest backups before making changes.

Custom instructions are injected after Talome's core prompt and are clearly marked as user-supplied. They cannot override safety rules (security modes, audit logging, destructive confirmation).


Notification Channels

Talome can send notifications through multiple channels. Automations, health alerts, and download completions can all trigger notifications.

Telegram

  1. Create a bot via @BotFather and copy the token
  2. Get your chat ID by messaging @userinfobot
  3. In Settings > Notifications, add a Telegram channel with the bot token and chat ID
  4. Click Test to send a test message

Discord

  1. Create a webhook in your Discord server (Server Settings > Integrations > Webhooks)
  2. Copy the webhook URL
  3. Add a Discord channel in Settings with the webhook URL
  4. Click Test

Webhook (Generic)

Send notifications to any HTTP endpoint. Configure the URL, method (POST/PUT), headers, and body template. Useful for connecting to Slack, Ntfy, Gotify, Matrix, or custom services.
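For example, a generic webhook channel pointed at an ntfy topic might look like this (the topic is illustrative, and the {{message}} placeholder syntax is an assumption — check the Settings page for the template variables Talome actually supports):

```yaml
url: https://ntfy.sh/my-homelab-alerts
method: POST
headers:
  Title: "Talome alert"
body: "{{message}}"
```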

Browser Push

Enable browser push notifications in Settings. The dashboard will request notification permission from your browser. Works without any external service.

Testing Channels

Every channel has a Test button that sends a sample notification immediately. If it does not arrive, the Settings page shows the error response from the target service.


Integration Configuration

Each external app integration follows the same pattern:

  1. Install the app (via the app store or Docker Compose)
  2. Get the app's URL and API key
  3. Enter them in Settings > Integrations
  4. Talome instantly activates the corresponding tools — no restart needed

Finding API Keys

| App | Where to find the API key |
| --- | --- |
| Sonarr | Settings > General > API Key |
| Radarr | Settings > General > API Key |
| Prowlarr | Settings > General > API Key |
| Jellyfin | Dashboard > API Keys > Create |
| Plex | Account > Authorized Devices (or use X-Plex-Token) |
| qBittorrent | No API key — uses username/password authentication |
| Overseerr | Settings > General > API Key |
| Home Assistant | Profile > Long-Lived Access Tokens > Create |
| Pi-hole | Settings > API > Show API token |
| Audiobookshelf | Settings > Users > click user > Token |
| Vaultwarden | Admin panel > General Settings > Admin Token |

Auto-Activation

When you save an integration URL and API key, the tool registry detects the new settings and activates the domain's tools. The AI's next message will have access to those tools without any restart or reload.

For example, saving sonarr_url and sonarr_api_key activates 27 arr tools + 5 media tools + 9 optimization tools — 41 new capabilities from two settings.

Verifying an Integration

After saving, ask the AI:

"Check if Sonarr is connected"

It will call arr_get_status and report the connection status, version, and any issues.


Data Storage

All Talome state lives in a single SQLite database. By default it is stored at /app/data/talome.db inside the container, which maps to the talome-data Docker volume.

What is stored in SQLite

| Table | Contents |
| --- | --- |
| users | Admin account and password hash |
| conversations | Chat history with the AI |
| memories | User preferences, facts, corrections, context |
| automations | Automation definitions and schedules |
| automation_runs | Execution history for automations |
| widgets | Dashboard widget layout and configuration |
| settings | All runtime settings (API keys, integration URLs, preferences) |
| audit | Audit trail of every tool execution |
| apps | Installation records for managed apps |
| changes | Self-improvement change history |
| issues | Tracked issues and feature requests |
| notifications | Notification history |
| notification_channels | Configured notification channels |

Backup

The simplest backup is copying the database file:

docker cp talome:/app/data/talome.db ./talome-backup.db

Or ask the AI:

"Back up the database"

For automated backups, create an automation:

"Create an automation that backs up the database every night at 2am
 and keeps the last 7 copies"
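The retention step such an automation would run can be sketched in shell like this (the backup directory and talome-YYYY-MM-DD.db naming are illustrative; dated names sort chronologically, so the newest seven survive):

```shell
# Simulate a backup directory with ten dated copies.
BACKUP_DIR=$(mktemp -d)
for i in $(seq -w 1 10); do touch "$BACKUP_DIR/talome-2025-01-$i.db"; done

# Keep the 7 newest backups (by name, which sorts chronologically here);
# delete everything older.
ls -1 "$BACKUP_DIR"/talome-*.db | sort -r | tail -n +8 | xargs -r rm --

ls -1 "$BACKUP_DIR" | wc -l   # 7 backups remain
```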

Environment Variable Reference

These variables are read at container startup. Most are optional — the installer sets sensible defaults.

| Variable | Default | Description |
| --- | --- | --- |
| TALOME_SECRET | (required) | 64-char hex string for session encryption |
| NODE_ENV | production | Runtime environment |
| CORE_PORT | 4000 | Hono API server port |
| CORE_HOST | 0.0.0.0 | API server bind address |
| DOCKER_SOCKET | /var/run/docker.sock | Path to Docker socket |
| DATABASE_PATH | ./data/talome.db | SQLite database location |
| ANTHROPIC_API_KEY | (unset) | Anthropic API key (can also set in UI) |
| OPENAI_API_KEY | (unset) | OpenAI API key (can also set in UI) |
| OLLAMA_BASE_URL | (unset) | Ollama server URL (can also set in UI) |
| DEFAULT_LLM_PROVIDER | anthropic | AI provider: anthropic, openai, ollama |
| DEFAULT_MODEL | claude-sonnet-4-20250514 | Default model identifier |
| NEXT_PUBLIC_CORE_URL | http://localhost:4000 | API URL for the dashboard |
| NEXT_PUBLIC_APP_NAME | Talome | Display name in the dashboard header |
| LOG_LEVEL | info | Backend log level: debug, info, warn, error |

API keys set via environment variables are used as fallbacks. Keys set in the Settings UI take precedence. If you set both, the UI value wins.
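One way to generate the required TALOME_SECRET, assuming openssl is installed (any source of 32 random bytes rendered as hex works equally well):

```shell
# Generate a 64-character hex secret for session encryption.
TALOME_SECRET=$(openssl rand -hex 32)
echo "TALOME_SECRET=$TALOME_SECRET"
echo "${#TALOME_SECRET}"   # 64
```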


Network Configuration

Binding Address

By default, Talome binds to 0.0.0.0, making it accessible from any device on your network. To restrict to localhost only:

environment:
  - CORE_HOST=127.0.0.1

Running Behind a Reverse Proxy

If you run Talome behind nginx, Caddy, or Traefik, set the public URL so the dashboard knows where to find the API:

environment:
  - NEXT_PUBLIC_CORE_URL=https://talome.example.com/api

Example nginx configuration:

server {
    listen 443 ssl;
    server_name talome.example.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    location /api/ {
        proxy_pass http://localhost:4000/;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}

The WebSocket Upgrade headers are required for real-time streaming of AI responses and container stats. Without them, the chat will fall back to polling, which is slower.
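If you prefer Caddy, an equivalent sketch looks like this (same ports as the nginx example above; Caddy handles WebSocket upgrades and TLS automatically, and handle_path strips the /api prefix the way the nginx trailing-slash proxy_pass does):

```
talome.example.com {
    handle_path /api/* {
        reverse_proxy localhost:4000
    }
    handle {
        reverse_proxy localhost:3000
    }
}
```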

Talome's Built-In Reverse Proxy

Talome includes a Caddy-based reverse proxy that can expose your apps with automatic TLS. No external proxy needed:

"Set up a reverse proxy route for Jellyfin at media.example.com"

The AI calls proxy_add_route and proxy_configure_tls to set up the route with Let's Encrypt certificates. See the Networking guide for details.

Local DNS with mDNS

For LAN access without a domain, Talome can set up local DNS via CoreDNS:

"Enable mDNS so I can access apps at *.talome.local"

Apps become reachable at jellyfin.talome.local, sonarr.talome.local, etc., with self-signed HTTPS certificates. No public domain or port forwarding needed.

