Agent commands. Your AI agent uses this node in workflows. You don’t configure it directly.
Parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| command | str | Yes | - | Shell command to execute |
| stdin | any | No | - | Input data for the command |
| cwd | str | No | Current dir | Working directory |
| env | dict | No | {} | Additional environment variables |
| timeout | int | No | 30 | Maximum execution time in seconds |
| ignore_errors | bool | No | false | Continue workflow on non-zero exit |
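As a quick orientation, here is a hypothetical parameter set shown as a Python dict. The key names, types, and defaults come from the table above; the dict form and the example values are assumptions, since this page doesn't show the actual workflow file format.

```python
# Hypothetical sketch: a full shell-node parameter set as a Python dict.
# Only "command" is required; the other keys show documented defaults or
# typical overrides. Example values are made up.
shell_params = {
    "command": "wc -l",           # required: shell command to execute
    "stdin": "one\ntwo\nthree",   # optional: data piped to the command
    "cwd": "/tmp",                # optional: working directory (default: current dir)
    "env": {"LC_ALL": "C"},       # optional: extra environment variables
    "timeout": 30,                # optional: max execution time in seconds (default)
    "ignore_errors": False,       # optional: continue workflow on non-zero exit (default)
}
```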
Output
| Key | Type | Description |
|---|---|---|
| stdout | str | Command output (UTF-8, or base64 if binary) |
| stdout_is_binary | bool | true if stdout is base64-encoded |
| stderr | str | Error output (UTF-8, or base64 if binary) |
| stderr_is_binary | bool | true if stderr is base64-encoded |
| exit_code | int | Exit code (0 = success, -1 = timeout, -2 = execution failure) |
| error | str | Error message (only on timeout/failure) |
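For reference, a successful run might produce a result shaped like the Python dict below. The keys come from the table above; the values are illustrative.

```python
# Illustrative output for a successful run of `wc -l` on three lines of input.
# Keys are from the Output table; values are made up for the example.
success_output = {
    "stdout": "3\n",
    "stdout_is_binary": False,
    "stderr": "",
    "stderr_is_binary": False,
    "exit_code": 0,
    # "error" only appears on timeout or execution failure
}
```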
Using stdin for data
Correct - pass the data via stdin and let the node convert it (see the sketch after the table below).
stdin type handling
| Input type | Conversion |
|---|---|
| str | Used as-is |
| dict/list | Serialized to JSON |
| int/float | Converted to string |
| bool | Lowercase string (true/false) |
| bytes | Decoded UTF-8 (fallback: latin-1) |
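Here is a minimal sketch of the stdin pattern, with parameters shown as a Python dict (an assumption; the actual workflow syntax may differ). The dict passed to stdin is serialized to JSON per the table above, so jq can read it without any shell quoting.

```python
# Hypothetical sketch: pass structured data via `stdin` instead of
# interpolating it into the command string. The dict is serialized to JSON
# before being piped to the command (see the conversion table above).
params = {
    "command": "jq -r '.user.name'",
    "stdin": {"user": {"name": "Ada", "active": True}},  # dict -> JSON on stdin
}
# Expected result: stdout == "Ada\n", exit_code == 0
```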
Validation and error handling
Your agent handles this. pflow validates commands when the workflow is created, not at runtime. If structured data ends up in a command string, the error message tells your agent exactly what to move to stdin and why.
Technical details
When a workflow is created, pflow checks whether command templates contain dict or list variables. If it finds any, the workflow is rejected with an error telling your agent what to move to stdin. This validation happens at workflow creation time (compile-time), not during execution, so you get immediate feedback.
Why this matters: shell parsers expect text, and JSON data contains special characters ({, }, ", spaces) that have meaning to the shell. Even with careful quoting, interpolating JSON into a command string is fragile. The stdin approach is safer and more reliable.
Security
Blocked patterns
These commands are rejected immediately with an error:
- rm -rf / and variants (recursive system deletion)
- dd if=/dev/zero of=/dev/sda (device operations)
- :(){:|:&};: (fork bombs)
- chmod -R 777 / (dangerous permissions)
- sudo rm -rf / (privileged dangerous commands)
Warning patterns
These trigger warnings but execute unless PFLOW_SHELL_STRICT=true:
- sudo, su -
- shutdown, reboot, halt
- systemctl poweroff
Smart error handling
Some commands return non-zero exit codes for valid “not found” results. The shell node treats these as success:
| Pattern | Exit code | Reason |
|---|---|---|
| ls *.txt (no matches) | 1 | Empty glob is valid |
| grep pattern file | 1 | Pattern not found is valid |
| which nonexistent | 1 | Command check |
| command -v foo | 1 | Existence check |
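For example, a grep that finds nothing exits with code 1 but is still treated as a successful, empty result. A sketch follows; the file name is hypothetical and the output shape follows the Output table.

```python
# Illustrative: grep exits 1 when the pattern is not found, but the shell
# node treats that as a valid result instead of triggering error handling.
params = {"command": "grep ERROR app.log"}   # assume app.log contains no "ERROR" lines
expected = {
    "stdout": "",
    "exit_code": 1,   # non-zero, but auto-handled as a "not found" success
}
```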
Examples
Basic command
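A hypothetical minimal example, with parameters shown as a Python dict (only command is required; the actual workflow format may differ):

```python
# Hypothetical sketch: the simplest use of the shell node - run one command
# with every other parameter left at its default.
params = {
    "command": "date -u +%Y-%m-%d",
}
# On success: stdout holds the printed date, exit_code is 0.
```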
Process JSON with jq
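A sketch assuming the upstream response is fed in through stdin; the ${api.response} template variable and the field names are made up for illustration.

```python
# Hypothetical sketch: extract every item name from a JSON payload.
# Templates can read single fields, but iterating over an array needs jq.
params = {
    "command": "jq -r '.items[].name'",
    "stdin": "${api.response}",   # hypothetical template variable from an earlier node
}
```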
With environment variables
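A sketch of passing extra environment variables through env; the variable name, secret reference, and URL are hypothetical.

```python
# Hypothetical sketch: values from `env` are added to the command's
# environment, so secrets never appear in the command string itself.
params = {
    "command": 'curl -s -H "Authorization: Bearer $API_TOKEN" https://api.example.com/items',
    "env": {"API_TOKEN": "${secrets.api_token}"},  # hypothetical template variable
}
```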
Working directory
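A sketch of overriding the working directory with cwd; the path is hypothetical.

```python
# Hypothetical sketch: run the command inside a specific directory.
params = {
    "command": "ls -la",
    "cwd": "/tmp/build",   # defaults to the current directory when omitted
}
```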
Ignoring errors
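A sketch of ignore_errors: the workflow keeps going on a non-zero exit, and a later step can inspect exit_code instead; the file path is hypothetical.

```python
# Hypothetical sketch: a check that may legitimately fail. With
# ignore_errors, a non-zero exit does not stop the workflow.
params = {
    "command": "test -f /etc/myapp.conf",   # exits 1 if the file is missing
    "ignore_errors": True,
}
```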
Error handling
| Exit code | Meaning |
|---|---|
| 0 | Success |
| 1+ | Command-specific error |
| -1 | Timeout |
| -2 | Execution failure |
The node triggers the error action on non-zero exit (unless ignore_errors is true or it’s an auto-handled pattern like grep).
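As an illustration of the failure shape, a timed-out command might produce something like the dict below. The keys and the -1 code come from the tables above; the exact error wording is an assumption.

```python
# Illustrative: a command that exceeds `timeout` ends with exit_code -1 and
# an `error` message instead of normal output.
params = {"command": "sleep 60", "timeout": 5}
timeout_output = {
    "stdout": "",
    "stderr": "",
    "exit_code": -1,                                 # -1 = timeout
    "error": "Command timed out after 5 seconds",    # wording is illustrative
}
```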
Recommended tools
pflow’s template variables handle most data access - you can use ${api.response.items[0].name} to access nested fields and array elements directly without shell commands.
Use shell commands when you need to:
- Iterate over arrays - jq '.items[].name' (templates can’t do wildcards)
- Filter or transform - jq 'select(.active)', sort, uniq
- Compute values - wc -l, arithmetic (see the sketch after this list)
Also useful: grep, awk, cut, sort, head, tail, curl.
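For instance, filtering and computing can be combined in one shell step; the template variable and field names below are hypothetical.

```python
# Hypothetical sketch: count the active items in an upstream list.
# The list arrives via stdin (serialized to JSON) and jq does the work.
params = {
    "command": "jq '[.[] | select(.active)] | length'",
    "stdin": "${api.response.items}",   # hypothetical template variable holding a list
}
```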
For JSON processing, install jq (available from your system’s package manager).