The pflow settings command group manages pflow configuration stored in ~/.pflow/settings.json. This includes API keys, environment variables, and node filtering.
Your commands. These are setup commands that you run directly - your agent should not run them. API keys and security settings should always be configured by you, never by an AI agent.

Commands

Command                 Description
init                    Initialize settings with defaults
show                    Display current settings
set-env                 Set an environment variable
unset-env               Remove an environment variable
list-env                List all environment variables
allow                   Add an allow pattern
deny                    Add a deny pattern
remove                  Remove a pattern
check                   Check if a node is allowed
reset                   Reset to defaults
llm show                Show LLM model settings
llm set-default         Set default model for all features
llm set-discovery       Set model for discovery commands
llm set-filtering       Set model for smart filtering
llm unset               Remove an LLM model setting
registry output-mode    Set registry run output display mode

pflow settings init

Initialize settings file with defaults.
pflow settings init
Creates ~/.pflow/settings.json with default configuration. Prompts for confirmation if the file already exists. Default settings:
{
  "version": "1.0.0",
  "registry": {
    "nodes": {
      "allow": ["*"],
      "deny": []
    },
    "output_mode": "smart"
  },
  "llm": {
    "default_model": null,
    "discovery_model": null,
    "filtering_model": null
  },
  "env": {}
}
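
The settings file is plain JSON, so you can also inspect it directly. A minimal sketch, assuming only the documented path and keys above:
import json
from pathlib import Path

# Read the documented settings file and print a couple of documented keys.
settings_path = Path.home() / ".pflow" / "settings.json"
settings = json.loads(settings_path.read_text())
print(settings["registry"]["nodes"]["allow"])  # e.g. ["*"]
print(settings["llm"]["default_model"])        # None until you set one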

pflow settings show

Display current settings with sensitive values masked.
pflow settings show
Output:
Settings file: ~/.pflow/settings.json

Current settings:
{
  "version": "1.0.0",
  "registry": {
    "nodes": {
      "allow": ["*"],
      "deny": ["test.*"]
    }
  },
  "llm": {
    "default_model": "gpt-5.2",
    "discovery_model": null,
    "filtering_model": null
  },
  "env": {
    "ANTHROPIC_API_KEY": "sk-***",
    "log_level": "debug"
  }
}
Sensitive values (API keys, tokens, secrets) are automatically masked. Use pflow settings list-env --show-values to see full values.

pflow settings set-env

Set an environment variable for pflow workflows.
pflow settings set-env <KEY> <VALUE>
Arguments:
  • KEY - Environment variable name
  • VALUE - Environment variable value
Examples:
# Set API keys for LLM providers
pflow settings set-env OPENAI_API_KEY "sk-..."
pflow settings set-env ANTHROPIC_API_KEY "sk-ant-..."

# Set other variables
pflow settings set-env GITHUB_TOKEN "ghp_..."
Output:
✓ Set environment variable: OPENAI_API_KEY
   Value: sk-***
Security: Always set API keys yourself - never let AI agents run this command.
Alternative: If you use Simon Willison’s llm, pflow will use those keys automatically. Run llm keys set anthropic instead.

pflow settings unset-env

Remove an environment variable.
pflow settings unset-env <KEY>
Arguments:
  • KEY - Environment variable name to remove
Example:
pflow settings unset-env GITHUB_TOKEN

pflow settings list-env

List all configured environment variables.
pflow settings list-env [--show-values]
Options:
  • --show-values - Display full unmasked values (use with caution)
Examples:
# List with masked values (safe)
pflow settings list-env

# List with full values (sensitive!)
pflow settings list-env --show-values
Output (masked):
Environment variables:
  OPENAI_API_KEY: sk-***
  GITHUB_TOKEN: ghp***
  LOG_LEVEL: debug
Only use --show-values in secure environments. Never let agents access unmasked credentials.

pflow settings allow

Add an allow pattern for node filtering.
pflow settings allow <PATTERN>
Arguments:
  • PATTERN - Glob-style pattern for nodes to allow
Examples:
# Allow all file nodes
pflow settings allow "pflow.nodes.file.*"

# Allow specific MCP tools
pflow settings allow "mcp-github-*"

# Allow specific node
pflow settings allow "llm"
Pattern syntax:
  • * matches any characters
  • ? matches single character
  • [seq] matches any character in seq
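These are standard shell-style glob rules. If pflow's matching follows the usual fnmatch semantics, you can sanity-check a pattern locally like this (illustration only, not pflow's own code):
from fnmatch import fnmatch

# Illustration of the glob rules above, not pflow's implementation.
print(fnmatch("pflow.nodes.file.read", "pflow.nodes.file.*"))  # True: * matches any characters
print(fnmatch("mcp-github-issues", "mcp-github-?ssues"))       # True: ? matches a single character
print(fnmatch("llm", "ll[mn]"))                                # True: [mn] matches 'm' or 'n'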

pflow settings deny

Add a deny pattern for node filtering.
pflow settings deny <PATTERN>
Arguments:
  • PATTERN - Glob-style pattern for nodes to deny
Examples:
# Block test nodes
pflow settings deny "pflow.nodes.test.*"

# Block dangerous operations
pflow settings deny "shell"
pflow settings deny "*-delete-*"
Deny patterns take precedence over allow patterns.

pflow settings remove

Remove a pattern from allow or deny list.
pflow settings remove <PATTERN> [--allow|--deny]
Arguments:
  • PATTERN - Pattern to remove
Options:
  • --allow - Remove from allow list (default)
  • --deny - Remove from deny list
Examples:
# Remove from deny list
pflow settings remove "test.*" --deny

# Remove from allow list
pflow settings remove "file.*" --allow

pflow settings check

Check if a node would be included based on current settings.
pflow settings check <NODE_NAME>
Arguments:
  • NODE_NAME - Node name to check
Example:
pflow settings check read-file
Output (included):
✓ Node 'read-file' would be INCLUDED

  Matched allow patterns: file.*, *
Output (excluded):
✗ Node 'echo' would be EXCLUDED

  Matched deny patterns: test.*

pflow settings reset

Reset settings to defaults.
pflow settings reset
Prompts for confirmation before resetting.
This deletes ALL custom settings, including environment variables and API keys.

LLM model settings

These commands let you override which models pflow uses for its internal features. They are optional: by default, pflow auto-detects a model based on your configured API keys.

pflow settings llm show

Display LLM model settings with resolution status.
pflow settings llm show
Output:
LLM Model Settings:

  default_model:    gpt-5.2 (configured)
  discovery_model:  (using default_model → gpt-5.2)
  filtering_model:  (using default_model → gpt-5.2)

Resolution: setting → default_model → llm CLI default → auto-detect → error

To configure:
  pflow settings llm set-default <model>
  pflow settings llm set-discovery <model>
  pflow settings llm set-filtering <model>

pflow settings llm set-default

Override the auto-detected model for all pflow LLM usage.
pflow settings llm set-default <MODEL>
Arguments:
  • MODEL - Model identifier (e.g., gpt-5.2, anthropic/claude-sonnet-4-5, gemini-3-flash-preview)
When set, this model is used instead of auto-detection for:
  • LLM nodes in workflows (when no model specified in params)
  • Discovery commands (when discovery_model not set)
  • Smart filtering (when filtering_model not set)
Examples:
# Set OpenAI model
pflow settings llm set-default gpt-5.2

# Set Anthropic model
pflow settings llm set-default anthropic/claude-sonnet-4-5

# Set Google model
pflow settings llm set-default gemini-3-flash-preview
This is optional - pflow auto-detects a model based on your API keys. Use this if you want a specific model instead of the auto-detected one.

pflow settings llm set-discovery

Set the model for discovery commands (pflow registry discover, pflow workflow discover).
pflow settings llm set-discovery <MODEL>
Arguments:
  • MODEL - Model identifier
Example:
# Use a fast, cheap model for discovery
pflow settings llm set-discovery gemini-3-flash-preview

pflow settings llm set-filtering

Set the model for smart field filtering (used when smart output mode filters large API responses).
pflow settings llm set-filtering <MODEL>
Arguments:
  • MODEL - Model identifier
Example:
# Use a fast, cheap model for filtering
pflow settings llm set-filtering gemini-2.5-flash-lite
Smart filtering is a simple task - use a fast, cheap model.
Recommended (fast + cheap):
Provider    Model                     Overhead   Notes
Google      gemini-2.5-flash-lite     ~2-3s      Best budget option
OpenAI      gpt-5-mini                ~1-2s      Runner-up budget
Anthropic   claude-haiku-4-5          ~1-2s      Third place budget
Alternative (higher cost):
Provider    Model                     Overhead   Notes
Anthropic   claude-sonnet-4-5         ~1-2s      Best premium option
Google      gemini-3-flash-preview    ~2-3s      Runner-up premium
OpenAI      gpt-5.2                   ~5-6s      Slower, not recommended
All models produce equivalent quality for this task. Timings are approximate and vary with network latency.

pflow settings llm unset

Remove an LLM model setting, reverting to auto-detection.
pflow settings llm unset {default|discovery|filtering|all}
Arguments:
  • SETTING - Which setting to remove: default, discovery, filtering, or all
Examples:
# Clear default model
pflow settings llm unset default

# Clear discovery model (will use default_model or auto-detect)
pflow settings llm unset discovery

# Clear all LLM settings
pflow settings llm unset all

Model resolution order

pflow uses the same resolution order for all LLM usage (discovery, filtering, and workflow LLM nodes):
  1. Explicit setting (workflow params or feature-specific setting)
  2. default_model from settings
  3. llm library default (llm models default)
  4. Auto-detect from API keys
  5. Error with setup instructions
Auto-detected defaults by provider:
Provider     Default model
Anthropic    anthropic/claude-sonnet-4-5
Google       gemini/gemini-3-flash-preview
OpenAI       gpt-5.2
Most users just need an API key configured. pflow auto-detects the appropriate model. Use these commands only to override the auto-detected model.
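A rough sketch of that fallback chain, using hypothetical arguments to name each step (these are not pflow APIs, just an illustration of the documented order):
# Hypothetical sketch of the documented resolution order; not pflow's API.
def resolve_model(explicit=None, settings=None, llm_cli_default=None, auto_detected=None):
    """Return the first model found, following the documented order."""
    settings = settings or {}
    candidates = [
        explicit,                       # 1. workflow params / feature-specific setting
        settings.get("default_model"),  # 2. default_model from settings
        llm_cli_default,                # 3. `llm models default`
        auto_detected,                  # 4. auto-detect from configured API keys
    ]
    for candidate in candidates:
        if candidate:
            return candidate
    # 5. nothing found: fail with setup instructions
    raise RuntimeError("No model configured; set an API key or run "
                       "'pflow settings llm set-default <model>'")

print(resolve_model(settings={"default_model": "gpt-5.2"}))  # gpt-5.2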

Registry settings

Configure how pflow registry run displays output.

pflow settings registry output-mode

Show or set the output display mode for pflow registry run.
pflow settings registry output-mode [MODE]
Arguments:
  • MODE (optional) - One of smart, structure, or full. If omitted, shows current mode.
Modes:
Mode               Description
smart (default)    Shows template paths with values. Uses LLM to filter large outputs (>30 fields) to relevant fields.
structure          Shows template paths only (no values). No filtering - shows all fields. Fast, no LLM overhead.
full               Shows all fields with full values, no truncation or filtering.
Examples:
# Show current mode
pflow settings registry output-mode

# Set to smart (default - shows values with truncation)
pflow settings registry output-mode smart

# Set to structure-only (paths without values)
pflow settings registry output-mode structure

# Set to full (all values, no truncation)
pflow settings registry output-mode full
Use structure mode when working with sensitive data — it shows types and paths without actual values, so your agent can build workflows without ever seeing the data itself. Avoid full mode with AI agents — it shows all values without truncation and can consume excessive tokens.
See registry run output modes for detailed examples of each mode.

How node filtering works

Node filtering uses allow and deny patterns evaluated in this order:
  1. Test nodes - Hidden by default (enable with PFLOW_INCLUDE_TEST_NODES=true)
  2. Deny patterns - Block matching nodes (highest precedence)
  3. Allow patterns - Include matching nodes
  4. Default - Include if * in allow list
Example configuration:
{
  "registry": {
    "nodes": {
      "allow": ["*"],
      "deny": ["test.*", "shell"]
    }
  }
}
This allows all nodes except test nodes and the shell node.
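Assuming glob matching along the lines of Python's fnmatch, the evaluation order can be sketched roughly like this (illustrative only, not pflow's actual implementation):
import os
from fnmatch import fnmatch

def node_allowed(name, allow, deny, is_test_node=False):
    """Illustrative sketch of the documented filtering order."""
    # 1. Test nodes are hidden unless explicitly enabled
    if is_test_node and os.environ.get("PFLOW_INCLUDE_TEST_NODES") != "true":
        return False
    # 2. Deny patterns take precedence
    if any(fnmatch(name, pattern) for pattern in deny):
        return False
    # 3./4. Otherwise include the node if any allow pattern (such as "*") matches
    return any(fnmatch(name, pattern) for pattern in allow)

# With the example configuration above:
print(node_allowed("read-file", ["*"], ["test.*", "shell"]))  # True
print(node_allowed("shell", ["*"], ["test.*", "shell"]))      # False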

Environment variable precedence

When workflows need parameters, pflow looks in this order:
  1. CLI parameters (key=value arguments)
  2. Settings environment variables (pflow settings set-env)
  3. Workflow defaults
  4. Error if required and not found
Example:
# Store API key once
pflow settings set-env OPENAI_API_KEY "sk-..."

# Workflows automatically use it
pflow my-llm-workflow  # No need to pass --param
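The first source that provides a value wins. A minimal sketch of that lookup order (the function and argument names here are illustrative, not pflow's API):
def resolve_param(key, cli_params, settings_env, workflow_defaults):
    """Return the first value found, following the documented precedence."""
    for source in (cli_params, settings_env, workflow_defaults):
        if key in source:
            return source[key]
    raise KeyError(f"Required parameter '{key}' not provided")

# Nothing passed on the CLI, so the value stored via `pflow settings set-env` is used.
print(resolve_param(
    "OPENAI_API_KEY",
    cli_params={},
    settings_env={"OPENAI_API_KEY": "sk-from-settings"},
    workflow_defaults={},
))  # sk-from-settings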

Sensitive parameter detection

These keys are automatically masked in output:
  • password, passwd, pwd
  • token, api_token, access_token, auth_token
  • api_key, apikey, api-key
  • secret, client_secret, secret_key
  • private_key, ssh_key
Matching is case-insensitive.
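A rough sketch of this kind of detection and masking (illustration only; pflow's exact logic may differ):
SENSITIVE_MARKERS = (
    "password", "passwd", "pwd",
    "token", "api_key", "apikey", "api-key",
    "secret", "private_key", "ssh_key",
)

def mask_if_sensitive(key, value):
    """Mask the value when the key looks like a credential (case-insensitive)."""
    if any(marker in key.lower() for marker in SENSITIVE_MARKERS):
        return value[:3] + "***"
    return value

print(mask_if_sensitive("OPENAI_API_KEY", "sk-abcdef"))  # sk-***
print(mask_if_sensitive("LOG_LEVEL", "debug"))           # debug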

File locations

Path                      Purpose
~/.pflow/settings.json    Settings file