
# CLI Commands

Flow provides five commands for working with your workflows.

## flow check

Validate a workflow file without running it.

```bash
flow check <file>
```

Checks for:

- Syntax errors (typos, bad indentation)
- Undefined variables
- Unknown services
- Duplicate step names

Example:

```bash
flow check my-workflow.flow
```

Output:

```txt
my-workflow.flow is valid — no errors found.
```

If there are errors:

```txt
Error in my-workflow.flow, line 12:

    verify the email using EmailChecker

    I don't know what "EmailChecker" is. You haven't declared it
    in your services block.

    Did you mean "EmailVerifier"?

    Hint: Every service must be declared at the top of your file:
        services:
            EmailChecker is an API at "https://..."
```

## flow run

Execute a workflow.

```bash
flow run <file> [options]
```

### Options

| Option | Description |
| --- | --- |
| `--input <json>` | JSON string to use as the request object |
| `--input-file <path>` | Read input from a file (`.json`, `.csv`, `.xlsx`, `.xls`) |
| `--verbose` | Show detailed execution logs |
| `--strict-env` | Fail if any referenced env variables are missing |
| `--mock` | Use mock services instead of real connectors |
| `--output-log <path>` | Write structured JSON log to a file |

### Examples

```bash
# Run with input data
flow run hello.flow --input '{"name": "Alice"}'

# Run with verbose output
flow run hello.flow --input '{"name": "Alice"}' --verbose

# Run with mock services (no real API calls)
flow run hello.flow --mock --input '{"name": "test"}'

# Require all env variables to be set
flow run my-workflow.flow --strict-env
```

### Windows users

Windows CMD and PowerShell handle quotes differently from Unix shells. Instead of single quotes, wrap the JSON in double quotes and escape the inner quotes:

```bash
flow run hello.flow --input "{\"name\": \"Alice\"}"
```
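If you are unsure whether the escaping survived your shell, you can sanity-check the exact string with any JSON parser before handing it to Flow. A quick sketch, assuming a `python3` on your PATH (the parser choice is illustrative, not part of Flow):

```shell
# Echo the escaped string and parse it; a traceback means the shell mangled your quotes.
echo "{\"name\": \"Alice\"}" | python3 -c "import json, sys; print(json.load(sys.stdin)['name'])"
# prints: Alice
```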

### Input format

The `--input` option accepts a JSON string. The data becomes available as the `request` object in your workflow:

```bash
flow run hello.flow --input '{"username": "octocat", "limit": 10}'
```

```txt
# In your workflow:
set user to request.username    # "octocat"
set max to request.limit        # 10
```

### Input from a file

Instead of typing JSON on the command line, you can read input from a file using --input-file. This is often easier, especially for complex data or on Windows.

Supported formats: `.json`, `.csv`, `.xlsx`, `.xls`

```bash
# From a JSON file
flow run hello.flow --input-file data.json

# From a CSV spreadsheet
flow run process-orders.flow --input-file orders.csv

# From an Excel spreadsheet
flow run process-orders.flow --input-file orders.xlsx
```

How spreadsheet data maps to input:

If your CSV or Excel file has one data row, it becomes a flat request object:

| name | email | age |
| --- | --- | --- |
| Alice | alice@example.com | 30 |
```txt
# In your workflow:
set name to request.name      # "Alice"
set email to request.email    # "alice@example.com"
```

If your file has multiple rows, they become a list you can loop through:

| name | email | age |
| --- | --- | --- |
| Alice | alice@example.com | 30 |
| Bob | bob@example.com | 25 |
```txt
# In your workflow:
log "Processing {request.count} records"
for each person in request.rows:
    log "{person.name} — {person.email}"
```

> **WARNING**
>
> You cannot use `--input` and `--input-file` at the same time. Pick one.

## flow test

Run a workflow in test mode with mock services.

```bash
flow test <file> [options]
```

### Options

| Option | Description |
| --- | --- |
| `--dry-run` | Show what would happen without executing |
| `--verbose` | Show detailed execution logs |
| `--output-log <path>` | Write structured JSON log to a file |

### Examples

```bash
# Test with mock services
flow test my-workflow.flow

# Dry run — show execution plan
flow test my-workflow.flow --dry-run --verbose
```

Test mode automatically uses mock connectors, so no real API calls are made.

## flow serve

Start an HTTP server to trigger workflows via webhook.

```bash
flow serve <target> [options]
```

`<target>` can be a single `.flow` file or a directory containing `.flow` files.

### Options

| Option | Description |
| --- | --- |
| `--port <number>` | Port to listen on (default: 3000) |
| `--verbose` | Log each incoming request |
| `--mock` | Use mock services instead of real connectors |
| `--auth-token <token>` | Require a Bearer token for all requests (health check excluded) |
| `--cors` | Enable CORS for all origins (`*`) |
| `--cors-origin <origin>` | Enable CORS for a specific origin |

### Examples

```bash
# Serve a single workflow
flow serve my-workflow.flow

# Serve all workflows in a directory
flow serve ./workflows/

# Custom port with verbose logging
flow serve my-workflow.flow --port 4000 --verbose

# Mock mode for development
flow serve my-workflow.flow --mock

# Require authentication
flow serve ./workflows/ --auth-token my-secret-token

# Enable CORS for browser clients
flow serve ./workflows/ --cors

# Enable CORS for a specific origin
flow serve ./workflows/ --cors-origin "https://my-app.example.com"
```

CORS can also be configured via the `FLOW_CORS_ORIGIN` environment variable.
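For example, the same origin restriction could be set through the environment instead of the flag. A sketch (the source does not specify which takes precedence if both are given):

```bash
# Equivalent to --cors-origin, configured via the environment
FLOW_CORS_ORIGIN="https://my-app.example.com" flow serve ./workflows/
```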

### Routes

**Single file:** the workflow is available at `POST /`.

**Directory:** each `.flow` file becomes a route based on its filename:

- `email-verification.flow` → `POST /email-verification`
- `order-processing.flow` → `POST /order-processing`

### Built-in endpoints

| Endpoint | Method | Description |
| --- | --- | --- |
| `/health` | GET | Health check — returns `{ "status": "ok" }` |
| `/` | GET | Workflow metadata or list of workflows |
| `/` | POST | Execute workflow (single file) |
| `/:workflow` | POST | Execute a specific workflow (directory) |
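As a quick smoke test, you can hit the health endpoint with curl (assuming a server is already running on the default port 3000, as in the examples above):

```bash
curl http://localhost:3000/health
# → { "status": "ok" }
```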

### Triggering workflows

```bash
curl -X POST http://localhost:3000 \
  -H "Content-Type: application/json" \
  -d '{"username": "octocat"}'
```

### Windows users

On Windows, use escaped double quotes for JSON:

```bash
curl -X POST http://localhost:3000 -H "Content-Type: application/json" -d "{\"username\": \"octocat\"}"
```

The JSON body becomes the `request` object in the workflow.

## flow schedule

Run a workflow on a recurring schedule.

```bash
flow schedule <file> [options]
```

You must provide either `--every` or `--cron` to specify the schedule.

### Options

| Option | Description |
| --- | --- |
| `--every <description>` | Human-readable schedule (e.g. `"5 minutes"`, `"day at 9:00"`) |
| `--cron <expression>` | Standard cron expression (e.g. `"*/5 * * * *"`) |
| `--input <json>` | JSON string to use as the request object |
| `--input-file <path>` | Read input from a file (`.json`, `.csv`, `.xlsx`, `.xls`) |
| `--verbose` | Show detailed execution logs for each run |
| `--mock` | Use mock services instead of real connectors |
| `--output-log <dir>` | Write a timestamped JSON log file for each execution |

### Schedule formats

The `--every` option accepts these human-readable formats:

| Format | Cron equivalent | Description |
| --- | --- | --- |
| `"5 minutes"` | `*/5 * * * *` | Every 5 minutes |
| `"2 hours"` | `0 */2 * * *` | Every 2 hours |
| `"hour"` | `0 * * * *` | Every hour, on the hour |
| `"day"` | `0 0 * * *` | Daily at midnight |
| `"day at 9:00"` | `0 9 * * *` | Daily at 9:00 AM |
| `"monday at 9:00"` | `0 9 * * 1` | Every Monday at 9:00 AM |
| `"friday"` | `0 0 * * 5` | Every Friday at midnight |

Day names can be full or abbreviated (`mon`, `tue`, `wed`, `thu`, `fri`, `sat`, `sun`) and are case-insensitive.

### Examples

```bash
# Run every 5 minutes with mock services
flow schedule my-workflow.flow --every "5 minutes" --mock --verbose

# Run daily at 9 AM with input data
flow schedule my-report.flow --every "day at 9:00" --input '{"region": "us-east"}'

# Run with cron and write logs to a directory
flow schedule my-monitor.flow --cron "*/10 * * * *" --output-log ./logs/

# Weekly report every Monday at 9 AM
flow schedule my-report.flow --every "monday at 9:00" --verbose
```

Press `Ctrl+C` to stop the scheduler cleanly.

## Environment variables

Flow reads `.env` files automatically from the current directory. You can also set environment variables in your shell:

```bash
export API_KEY=your-key-here
flow run my-workflow.flow
```

Access them in your workflow with `env.VARIABLE_NAME`.
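Putting both pieces together, a minimal sketch (the `API_KEY` name is illustrative, and the workflow line assumes the same `set … to …` form shown in the examples above):

```txt
# .env — loaded automatically from the current directory
API_KEY=your-key-here
```

```txt
# In your workflow:
set key to env.API_KEY
```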

Released under the MIT License.