## Why Integrate with OpenClaw?
OpenClaw has built-in web search via the `web_search` tool. By default it uses the Brave Search API — which works, but:

- **Costs money** — At $0.003 per query, heavy agent use adds up
- **Logs queries** — Even "privacy-focused" Brave logs API usage
- **External dependency** — If Brave is down, your agent can't search
SearXNG gives you free, private, self-hosted search that works even when the internet is being weird.
## Step 1: Deploy SearXNG
If you don't already have SearXNG running, spin it up:
```bash
# Create directory
mkdir -p ~/searxng && cd ~/searxng

# Create docker-compose.yml
cat > docker-compose.yml << 'EOF'
services:
  searxng:
    image: searxng/searxng:latest
    container_name: searxng
    ports:
      - "8080:8080"
    environment:
      - SEARXNG_BASE_URL=http://localhost:8080
    volumes:
      - ./config:/etc/searxng:rw
    restart: unless-stopped
EOF

# Create config directory
mkdir -p config

# Start it
docker-compose up -d
```

Verify it's running:
```bash
curl "http://localhost:8080/search?q=hello&format=json" | jq '.results[0].title'
```

Note: a fresh instance returns HTTP 403 for `format=json` until you enable it in Step 2, so an error here is expected at first.

## Step 2: Enable JSON API
SearXNG needs JSON output enabled for agent integration. Edit `config/settings.yml` (the container creates it on first start):

```yaml
use_default_settings: true

search:
  formats:
    - html
    - json  # This is what agents need

server:
  secret_key: "change-me"  # paste the output of `openssl rand -hex 32` — shell substitution won't run inside YAML
  limiter: false  # Disable rate limiting for local use
```

Restart after config changes:
```bash
docker-compose restart searxng
```

## Step 3: Create the OpenClaw Skill
OpenClaw uses skills (plugins) to extend functionality. Create a SearXNG skill:
```bash
# Create skill directory
mkdir -p ~/.openclaw/skills/searxng
```

Create the skill file:
````markdown
# SearXNG Search Skill

Self-hosted private search via SearXNG.

## Usage

```bash
# Basic search
~/.openclaw/skills/searxng/search.sh "your query"

# With result limit
~/.openclaw/skills/searxng/search.sh "your query" 10
```

## Configuration

Set the `SEARXNG_URL` environment variable to override the default:

```bash
export SEARXNG_URL="http://localhost:8080"
```

## When to Use

Use this skill for web searches when privacy is important or
when you want to avoid Brave API costs.
````

Create the search script:
```bash
#!/bin/bash
# SearXNG search skill for OpenClaw
# Usage: ./search.sh "query" [limit]
set -e

QUERY="$1"
LIMIT=${2:-5}
SEARXNG_URL=${SEARXNG_URL:-"http://localhost:8080"}

if [ -z "$QUERY" ]; then
  echo "Usage: $0 <query> [limit]"
  echo "Example: $0 'kubernetes best practices' 5"
  exit 1
fi

# Query SearXNG
RESULTS=$(curl -sS "$SEARXNG_URL/search" \
  -G --data-urlencode "q=$QUERY" \
  --data "format=json" 2>/dev/null)

# Check if we got results
if [ -z "$RESULTS" ] || [ "$RESULTS" = "null" ]; then
  echo "No results or SearXNG unavailable"
  exit 1
fi

# Format output for agent consumption
echo "$RESULTS" | jq -r --argjson limit "$LIMIT" \
  '.results[:$limit][] | "## \(.title)\nURL: \(.url)\n\(.content // "No snippet")\n---"'
```

Make it executable:
```bash
chmod +x ~/.openclaw/skills/searxng/search.sh
```

## Step 4: Test the Skill
```bash
~/.openclaw/skills/searxng/search.sh "OpenClaw AI agent"
```

Expected output:

```
## OpenClaw - AI Agent Framework
URL: https://openclaw.ai/
The open-source framework for building AI agents...
---

## GitHub - openclaw/openclaw
URL: https://github.com/openclaw/openclaw
OpenClaw is a CLI and gateway for AI agents...
---
```

## Step 5: Tell Your Agent
Add to your AGENTS.md or agent instructions so it knows to use the skill:
````markdown
## Search

For web searches, prefer the SearXNG skill for privacy:

```bash
~/.openclaw/skills/searxng/search.sh "query"
```

Fall back to the `web_search` tool only if SearXNG is unavailable.
````

## Optional: Fallback Configuration
For production, you might want a wrapper that tries SearXNG first, then falls back to Brave:
```bash
#!/bin/bash
# Search with fallback: SearXNG → Brave
QUERY="$1"
LIMIT=${2:-5}

# Try SearXNG first
RESULT=$(~/.openclaw/skills/searxng/search.sh "$QUERY" "$LIMIT" 2>/dev/null)

if [ -n "$RESULT" ] && [ "$RESULT" != "No results or SearXNG unavailable" ]; then
  echo "# Results via SearXNG (private)"
  echo "$RESULT"
else
  echo "# SearXNG unavailable, falling back to Brave..."
  # Let OpenClaw's built-in web_search handle it
  echo "FALLBACK_TO_BRAVE: $QUERY"
fi
```

## How It Works
The flow is simple: your agent calls the SearXNG skill, which queries your local SearXNG instance. SearXNG then fetches results from multiple search engines (Google, Bing, DuckDuckGo) and returns them — but the search engines only see your server's IP, not your agent's queries.
If SearXNG is unavailable, the fallback script routes to Brave Search (or your configured backup). Your agent always gets results; the privacy layer is transparent.
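That multiplexing is visible in the raw response: each JSON result includes an `engine` field naming the upstream engine that produced it. A minimal sketch of listing results by engine — the response below is canned so the snippet runs offline; in practice you'd pipe in the output of the skill's `curl` call:

```shell
# Canned SearXNG-style response (in practice this comes from
# curl "$SEARXNG_URL/search" -G --data-urlencode "q=..." --data "format=json")
RESPONSE='{"results":[{"title":"Result A","url":"https://a.example","engine":"duckduckgo"},{"title":"Result B","url":"https://b.example","engine":"mojeek"}]}'

# Print each result with the engine that served it
BY_ENGINE=$(echo "$RESPONSE" | jq -r '.results[] | "\(.engine): \(.title)"')
echo "$BY_ENGINE"
```

Seeing which engines actually serve your results is useful when deciding which ones to enable or disable in `settings.yml`.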
## Why This Matters
- **Query privacy** — Upstream engines see only your server's IP, not which agent asked or why, and no API key ties searches to your account. No profiling, no data harvesting.
- **Zero cost** — No API fees, no per-query charges. Run as many searches as you need.
- **Full control** — You choose which engines to query and how your instance is configured.
- **Low overhead** — The only extra hop is your own server; there are no third-party API round-trips or per-query quotas on your side.
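To put the cost bullet in numbers — the $0.003/query figure is from the intro, and the query volume is a made-up assumption:

```shell
# Hypothetical volume: 1,000 agent queries/day at Brave's $0.003/query
MONTHLY=$(awk -v q=1000 -v c=0.003 'BEGIN { printf "%.2f", q * c * 30 }')
echo "Brave at this volume: \$$MONTHLY/month"  # → Brave at this volume: $90.00/month
```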
## Troubleshooting

**"Connection refused"**

SearXNG isn't running. Check: `docker ps | grep searxng`

**"format=json not supported"**

Add `formats: [html, json]` under `search:` in `settings.yml` and restart.

**No results for some queries**

Some engines rate-limit. SearXNG rotates through engines automatically. Try again, or add more engines in settings.
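For the rate-limit case, the JSON response can tell you exactly which engines failed: SearXNG includes an `unresponsive_engines` array of `[engine, reason]` pairs (field name per current builds; verify against your version). A quick diagnostic sketch, again using a canned response so it runs offline:

```shell
# Canned response with an empty result set; in practice, pipe in the
# skill's curl output when a query comes back empty
RESPONSE='{"results":[],"unresponsive_engines":[["google","too many requests"],["bing","timeout"]]}'

# List each failing engine with its reason
FAILED=$(echo "$RESPONSE" | jq -r '.unresponsive_engines[] | "\(.[0]): \(.[1])"')
echo "$FAILED"
```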
## Next Steps

- **Add more engines** — Configure DuckDuckGo, Qwant, Mojeek in `settings.yml`
- **Enable caching** — Reduce repeated queries with Redis
- **Deploy on a VPS** — Run SearXNG on a VPS for multi-agent access
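The first two bullets are `settings.yml` changes. A sketch of what they could look like — the key names follow current SearXNG docs but are assumptions to verify against your version (older releases use `redis:` with a `redis://` URL instead of `valkey:`), and SearXNG uses Redis/Valkey for its limiter and plugins rather than as a general result cache:

```yaml
# settings.yml additions (key names are assumptions — verify for your SearXNG version)
use_default_settings: true

engines:
  - name: qwant
    disabled: false
  - name: mojeek
    disabled: false

# Recent SearXNG reads `valkey:`; older releases use `redis:` instead
valkey:
  url: valkey://localhost:6379/0
```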
At 48nauts, every agent uses SearXNG for search. Our queries stay local, our costs stay zero, and we control our own infrastructure.
