Can Sub-Agents Access the Internet? Debunking the "Hard-Sandboxed" Myth
A common misconception floating around Discord: sub-agents are "hard-sandboxed" and cannot make HTTP requests, fetch URLs, or access the internet in any way.
This is not true by default. Here's what's actually going on.
What Sub-Agents Actually Are
Sub-agents in OpenClaw run in a separate session, but they get the same "normal" tools as the main agent. This includes:
- `web_fetch` - fetch and parse URLs
- `web_search` - search the web
- `browser` - browser automation
- `exec` - shell commands (including curl/requests)
- Most other tools the main agent has
What sub-agents don't get by default are session/system management tools like:
- `sessions_list`
- `sessions_history`
- `sessions_send`
- `sessions_spawn`
These restrictions prevent sub-agents from spawning their own sub-agents or interfering with other sessions. But internet access? That's enabled by default.
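Because sub-agents keep the `exec` tool, the quickest way to confirm outbound access is to have one run a curl probe. A minimal sketch (the target URL is just a placeholder; the exit-code meanings come from curl's documentation):

```shell
#!/bin/sh
# Classify outbound connectivity from curl's exit code.
# Per curl's man page: 0 = success, 6 = DNS resolution failed,
# 7 = connection refused, 28 = operation timed out.
check_outbound() {
  curl -s -o /dev/null --max-time 5 "$1"
  case $? in
    0)    echo "ok" ;;
    6)    echo "dns-failure" ;;
    7|28) echo "blocked-or-timeout" ;;
    *)    echo "other-error" ;;
  esac
}

check_outbound https://example.com
```

If this prints `ok` from inside a sub-agent's `exec`, the sandbox is clearly not blocking the internet; a `dns-failure` or `blocked-or-timeout` points at the environment rather than the tool policy.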
Why Your Sub-Agent Might Be Blocked
If your sub-agent truly can't fetch URLs, it's almost always one of two things:
1. An Allow-Only Tool Policy
If you (or a template you imported) configured an allow-only list for sub-agents and forgot to include web tools:
```yaml
tools:
  subagents:
    tools:
      allow:
        - group:web # enables web_search + web_fetch
        # or explicitly:
        - web_fetch
        - web_search
```

2. Environment Network Restrictions
If your entire gateway/container blocks outbound network, sub-agents won't be able to reach URLs either.
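To tell which layer is blocked, run a check from inside the gateway/container itself, separating DNS resolution from outbound TCP. A sketch (the hostname is a placeholder; it assumes `getent` and `nc` are available, as on most Linux images):

```shell
#!/bin/sh
# Run inside the gateway/container to see which layer is blocked.

# Resolve a hostname without touching HTTP at all.
dns_check() {
  getent hosts "$1" >/dev/null 2>&1 && echo "dns: ok" || echo "dns: blocked"
}

# Probe one TCP port: -z = scan only (no data), -w 5 = 5s timeout.
tcp_check() {
  nc -z -w 5 "$1" "$2" >/dev/null 2>&1 && echo "tcp: ok" || echo "tcp: blocked"
}

dns_check example.com
tcp_check example.com 443
```

If DNS resolves but TCP is blocked, you're behind an egress filter; if DNS itself fails, the container has no resolver or no network at all. Either way, no tool-policy change will fix it.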
The Fix
Option 1: Add Web Tools to Your Allow List
```yaml
tools:
  subagents:
    tools:
      allow:
        - group:web # enables web_search + web_fetch
        # or explicitly:
        - web_fetch
        - web_search
```

Option 2: Remove the Allow-Only Restriction
If you don't need a restrictive policy, remove the allow key entirely. Sub-agents will inherit normal tool access.
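Using the same config shape as the example above, that means leaving the `tools` section for sub-agents empty (a sketch; only the keys shown earlier are taken from the original config):

```yaml
tools:
  subagents:
    tools: {}   # no allow key: sub-agents inherit the normal toolset
```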
How to Diagnose
Run this command to see what's actually happening:
```
/subagents log <session-id> tools
```

Look for errors like:
- "tool denied by policy" - your config is blocking it
- "tool not found" - the tool isn't available at all
- Network/timeout errors - environment is blocking outbound
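That triage can be scripted. A sketch that maps a logged error message to its likely cause (the two quoted strings are the ones listed above; the network/timeout patterns are assumptions):

```shell
#!/bin/sh
# Map a tool-log error message to the likely root cause.
diagnose() {
  case "$1" in
    *"denied by policy"*)      echo "config is blocking the tool" ;;
    *"tool not found"*)        echo "tool is not available at all" ;;
    *[Tt]imeout*|*[Nn]etwork*) echo "environment is blocking outbound" ;;
    *)                         echo "unknown; inspect the full log" ;;
  esac
}

diagnose "tool denied by policy"   # -> config is blocking the tool
```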
TL;DR
Sub-agents can absolutely do end-to-end work including fetching URLs, parsing sitemaps, making HTTP requests, etc. If yours can't, check your tools.subagents.tools.allow config: you probably have an allow-list that's too restrictive.
Thanks to Kaori on Discord for sparking this discussion!