# Quickstart

This guide takes you from zero to a working ClawDesk gateway in under five minutes. By the end, you'll have an AI agent responding to messages.
## 1. Initialize a New Project

Create a new ClawDesk project with sensible defaults:

```bash
mkdir my-agent && cd my-agent
clawdesk-cli init
```

```text
🚀 Initializing ClawDesk project...
Created clawdesk.toml
Created agents/default.toml
Created data/
Created logs/
✅ Project initialized. Edit clawdesk.toml to configure your setup.
```
This generates the following structure:

```text
my-agent/
├── clawdesk.toml      # Main configuration
├── agents/
│   └── default.toml   # Default agent definition
├── data/              # SochDB data directory
└── logs/              # Log output
```
You can also pass `--template minimal` or `--template full` to `clawdesk-cli init` for different starting configurations.
## 2. Configure Your First Provider

ClawDesk needs at least one AI provider to route messages to. Choose the option that fits your setup:
### Ollama (local, no API key)

For a fully local setup with no API keys, use Ollama:

```bash
# Install Ollama (macOS)
brew install ollama

# Pull a model
ollama pull llama3.1

# Start the Ollama server (if not already running)
ollama serve
```
Edit `clawdesk.toml` and add the Ollama provider:

```toml
[providers.ollama]
type = "ollama"
base_url = "http://localhost:11434"
default_model = "llama3.1"
enabled = true
```
### Anthropic

Get an API key from the Anthropic Console:

```toml
[providers.anthropic]
type = "anthropic"
api_key = "${ANTHROPIC_API_KEY}"
default_model = "claude-sonnet-4-20250514"
enabled = true
```

Set the environment variable:

```bash
export ANTHROPIC_API_KEY="sk-ant-..."
```
Never commit API keys to version control. Use environment variables or a `.env` file (and add it to `.gitignore`).
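As a sketch of that `.env` workflow (the file names are the usual conventions, not anything ClawDesk-specific, and the key value is a placeholder):

```bash
# Keep the key in a local .env file instead of hard-coding it (illustrative placeholder value)
cat > .env <<'EOF'
ANTHROPIC_API_KEY=sk-ant-your-key-here
EOF

# Make sure git never sees it
echo ".env" >> .gitignore

# Export everything in .env into the current shell before launching the gateway
set -a
. ./.env
set +a

echo "Key loaded: ${ANTHROPIC_API_KEY:+yes}"
# → Key loaded: yes
```

Because the config references `${ANTHROPIC_API_KEY}`, the gateway picks the key up from the environment at startup.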
### OpenAI

Get an API key from the OpenAI Platform:

```toml
[providers.openai]
type = "openai"
api_key = "${OPENAI_API_KEY}"
default_model = "gpt-4o"
enabled = true
```

```bash
export OPENAI_API_KEY="sk-..."
```
### Google

Get an API key from Google AI Studio:

```toml
[providers.google]
type = "google"
api_key = "${GOOGLE_API_KEY}"
default_model = "gemini-2.0-flash"
enabled = true
```

```bash
export GOOGLE_API_KEY="AI..."
```
## 3. Start the Gateway

Launch the ClawDesk gateway server:

```bash
clawdesk-cli gateway
```

```text
╔══════════════════════════════════════════════╗
║  ClawDesk Gateway v0.1.0                     ║
╠══════════════════════════════════════════════╣
║  HTTP API:   http://127.0.0.1:18789          ║
║  WebSocket:  ws://127.0.0.1:18789/ws         ║
║  Web UI:     http://127.0.0.1:18789/ui       ║
║  Provider:   ollama (llama3.1)               ║
║  Channels:   0 active                        ║
╚══════════════════════════════════════════════╝
INFO clawdesk_gateway > Gateway started on 127.0.0.1:18789
INFO clawdesk_gateway > SochDB storage initialized
INFO clawdesk_gateway > Agent "default" loaded
```
The gateway is now running and accepting requests on port 18789.
The gateway binds to `127.0.0.1` by default. To expose it on all interfaces (e.g., in Docker), set `gateway.bind = "0.0.0.0"` in your config.
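For example, a gateway section that listens on all interfaces would change only the `bind` key (same keys as the full config shown later in this guide):

```toml
[gateway]
bind = "0.0.0.0"   # listen on all interfaces, e.g. inside a container
port = 18789
log_level = "info"
```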
## 4. Send a Test Message

Open a new terminal and send a message to your agent:

```bash
clawdesk-cli message "Hello ClawDesk! What can you do?"
```

```text
🤖 Agent (default) via ollama/llama3.1:

Hello! I'm your ClawDesk AI agent. I can help you with:

- Answering questions and having conversations
- Processing messages from multiple channels (Telegram, Discord, Slack, etc.)
- Running scheduled tasks via cron
- Executing plugins and custom workflows

How can I help you today?
```
You can also interact via the HTTP API directly:

```bash
curl -X POST http://127.0.0.1:18789/api/v1/message \
  -H "Content-Type: application/json" \
  -d '{
    "content": "Hello ClawDesk!",
    "agent": "default"
  }'
```

```json
{
  "id": "msg_01abc123",
  "agent": "default",
  "content": "Hello! I'm your ClawDesk AI agent...",
  "provider": "ollama",
  "model": "llama3.1",
  "tokens": { "input": 12, "output": 87 },
  "latency_ms": 342
}
```
## 5. Access the Web UI

ClawDesk ships with a built-in web interface. Open your browser and navigate to `http://127.0.0.1:18789/ui`.
The Web UI provides:
- Chat interface — Interact with agents in real time
- Channel dashboard — Monitor connected channels
- Agent configuration — Edit agents and system prompts
- Message history — Browse past conversations (stored in SochDB)
- Plugin manager — Install, enable, and configure plugins
## Putting It All Together

Here's a minimal complete `clawdesk.toml` for a working setup:

```toml
[gateway]
bind = "127.0.0.1"
port = 18789
log_level = "info"

[providers.ollama]
type = "ollama"
base_url = "http://localhost:11434"
default_model = "llama3.1"
enabled = true

[agents.default]
name = "default"
provider = "ollama"
system_prompt = "You are a helpful AI assistant powered by ClawDesk."

[session]
timeout_minutes = 30
max_history = 100

[storage]
engine = "sochdb"
path = "./data"
```
## Common Issues

| Problem | Solution |
|---|---|
| Port 18789 already in use | Kill the existing process: `lsof -ti :18789 \| xargs kill -9` |
| Provider connection failed | Verify Ollama is running (`ollama list`) or the API key is set |
| SochDB not found | Ensure `sochdb` is a sibling directory and has been built |
| Config parse error | Run `clawdesk-cli config validate` to see detailed errors |
## Next Steps
Now that your gateway is running, dive deeper:
- Configuration → — Master the TOML config file, environment variables, and hot-reload.
- First Channel → — Connect Telegram, Discord, or Slack to your agent.
- Agent Pipeline — Create custom agents with specialized system prompts.
- Plugin System — Extend ClawDesk with custom functionality.