# CLI Commands

Flow provides five commands for working with your workflows.
## flow check

Validate a workflow file without running it.

```shell
flow check <file>
```

Checks for:

- Syntax errors (typos, bad indentation)
- Undefined variables
- Unknown services
- Duplicate step names
Example:

```shell
flow check my-workflow.flow
```

```
my-workflow.flow is valid — no errors found.
```

If there are errors:

```
Error in my-workflow.flow, line 12:

    verify the email using EmailChecker

I don't know what "EmailChecker" is. You haven't declared it
in your services block.

Did you mean "EmailVerifier"?

Hint: Every service must be declared at the top of your file:

    services:
      EmailChecker is an API at "https://..."
```

## flow run
Execute a workflow.
```shell
flow run <file> [options]
```

### Options

| Option | Description |
|---|---|
| `--input <json>` | JSON string to use as the request object |
| `--input-file <path>` | Read input from a file (`.json`, `.csv`, `.xlsx`, `.xls`) |
| `--verbose` | Show detailed execution logs |
| `--strict-env` | Fail if any referenced env variables are missing |
| `--mock` | Use mock services instead of real connectors |
| `--output-log <path>` | Write structured JSON log to a file |
### Examples

```shell
# Run with input data
flow run hello.flow --input '{"name": "Alice"}'

# Run with verbose output
flow run hello.flow --input '{"name": "Alice"}' --verbose

# Run with mock services (no real API calls)
flow run hello.flow --mock --input '{"name": "test"}'

# Require all env variables to be set
flow run my-workflow.flow --strict-env
```

### Windows users

Windows CMD and PowerShell handle quotes differently. Use escaped double quotes instead:

```shell
flow run hello.flow --input "{\"name\": \"Alice\"}"
```

### Input format
The `--input` option accepts a JSON string. The data becomes available as the `request` object in your workflow:

```shell
flow run hello.flow --input '{"username": "octocat", "limit": 10}'
```

```
# In your workflow:
set user to request.username   # "octocat"
set max to request.limit       # 10
```

### Input from a file
Instead of typing JSON on the command line, you can read input from a file using `--input-file`. This is often easier, especially for complex data or on Windows.

Supported formats: `.json`, `.csv`, `.xlsx`, `.xls`

```shell
# From a JSON file
flow run hello.flow --input-file data.json

# From a CSV spreadsheet
flow run process-orders.flow --input-file orders.csv

# From an Excel spreadsheet
flow run process-orders.flow --input-file orders.xlsx
```

How spreadsheet data maps to input:
If your CSV or Excel file has one data row, it becomes a flat request object:

| name | email | age |
|---|---|---|
| Alice | alice@example.com | 30 |

```
# In your workflow:
set name to request.name     # "Alice"
set email to request.email   # "alice@example.com"
```

If your file has multiple rows, they become a list you can loop through:
| name | email | age |
|---|---|---|
| Alice | alice@example.com | 30 |
| Bob | bob@example.com | 25 |

```
# In your workflow:
log "Processing {request.count} records"
for each person in request.rows:
  log "{person.name} — {person.email}"
```

> **Warning:** You cannot use `--input` and `--input-file` at the same time. Pick one.
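The mapping described above can be sketched in a few lines of Python. This is illustrative only, not Flow's actual loader (which may, for example, coerce numeric types); it just shows the shape of the resulting `request` object:

```python
import csv
import io

def rows_to_request(csv_text):
    """Build the request object as described above: one data row becomes a
    flat dict; several rows become {"count": n, "rows": [...]}.
    Sketch only; Flow's real loader may differ (e.g. type coercion)."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    if len(rows) == 1:
        return rows[0]
    return {"count": len(rows), "rows": rows}

single = "name,email,age\nAlice,alice@example.com,30\n"
multi = single + "Bob,bob@example.com,25\n"

print(rows_to_request(single)["name"])   # -> Alice
print(rows_to_request(multi)["count"])   # -> 2
```

Note that with a plain CSV reader every cell is a string (`"30"`, not `30`); whether Flow converts numbers is not specified here.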
## flow test

Run a workflow in test mode with mock services.

```shell
flow test <file> [options]
```

### Options

| Option | Description |
|---|---|
| `--dry-run` | Show what would happen without executing |
| `--verbose` | Show detailed execution logs |
| `--output-log <path>` | Write structured JSON log to a file |
### Examples

```shell
# Test with mock services
flow test my-workflow.flow

# Dry run — show execution plan
flow test my-workflow.flow --dry-run --verbose
```

Test mode automatically uses mock connectors, so no real API calls are made.
## flow serve

Start an HTTP server to trigger workflows via webhook.

```shell
flow serve <target> [options]
```

`<target>` can be a single `.flow` file or a directory containing `.flow` files.

### Options

| Option | Description |
|---|---|
| `--port <number>` | Port to listen on (default: 3000) |
| `--verbose` | Log each incoming request |
| `--mock` | Use mock services instead of real connectors |
| `--auth-token <token>` | Require a Bearer token for all requests (health check excluded) |
| `--cors` | Enable CORS for all origins (`*`) |
| `--cors-origin <origin>` | Enable CORS for a specific origin |
### Examples

```shell
# Serve a single workflow
flow serve my-workflow.flow

# Serve all workflows in a directory
flow serve ./workflows/

# Custom port with verbose logging
flow serve my-workflow.flow --port 4000 --verbose

# Mock mode for development
flow serve my-workflow.flow --mock

# Require authentication
flow serve ./workflows/ --auth-token my-secret-token

# Enable CORS for browser clients
flow serve ./workflows/ --cors

# Enable CORS for a specific origin
flow serve ./workflows/ --cors-origin "https://my-app.example.com"
```

CORS can also be configured via the `FLOW_CORS_ORIGIN` environment variable.
### Routes

Single file: the workflow is available at `POST /`.

Directory: each `.flow` file becomes a route based on its filename:

- `email-verification.flow` → `POST /email-verification`
- `order-processing.flow` → `POST /order-processing`

### Built-in endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/health` | GET | Health check — returns `{ "status": "ok" }` |
| `/` | GET | Workflow metadata or list of workflows |
| `/` | POST | Execute workflow (single file) |
| `/:workflow` | POST | Execute specific workflow (directory) |
### Triggering workflows

```shell
curl -X POST http://localhost:3000 \
  -H "Content-Type: application/json" \
  -d '{"username": "octocat"}'
```

### Windows users

On Windows, use escaped double quotes for JSON:

```shell
curl -X POST http://localhost:3000 -H "Content-Type: application/json" -d "{\"username\": \"octocat\"}"
```

The JSON body becomes the `request` object in the workflow.
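Any HTTP client can trigger a served workflow the same way curl does. Here is a minimal Python sketch using only the standard library; the URL and payload come from the examples above, and it assumes the endpoint responds with JSON (the response format is not specified here):

```python
import json
import urllib.request

def trigger_workflow(url, payload, token=None):
    """POST a JSON body to a `flow serve` endpoint.

    The body becomes the workflow's `request` object. If the server was
    started with --auth-token, pass the same token so it is sent as a
    Bearer header. Assumes the server replies with a JSON body.
    """
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = "Bearer " + token
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# With a server running locally:
# result = trigger_workflow("http://localhost:3000", {"username": "octocat"})
```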
## flow schedule

Run a workflow on a recurring schedule.

```shell
flow schedule <file> [options]
```

You must provide either `--every` or `--cron` to specify the schedule.

### Options

| Option | Description |
|---|---|
| `--every <description>` | Human-readable schedule (e.g. `"5 minutes"`, `"day at 9:00"`) |
| `--cron <expression>` | Standard cron expression (e.g. `"*/5 * * * *"`) |
| `--input <json>` | JSON string to use as the request object |
| `--input-file <path>` | Read input from a file (`.json`, `.csv`, `.xlsx`, `.xls`) |
| `--verbose` | Show detailed execution logs for each run |
| `--mock` | Use mock services instead of real connectors |
| `--output-log <dir>` | Write a timestamped JSON log file for each execution |
### Schedule formats

The `--every` option accepts these human-readable formats:

| Format | Cron equivalent | Description |
|---|---|---|
| `"5 minutes"` | `*/5 * * * *` | Every 5 minutes |
| `"2 hours"` | `0 */2 * * *` | Every 2 hours |
| `"hour"` | `0 * * * *` | Every hour on the hour |
| `"day"` | `0 0 * * *` | Daily at midnight |
| `"day at 9:00"` | `0 9 * * *` | Daily at 9:00 AM |
| `"monday at 9:00"` | `0 9 * * 1` | Every Monday at 9:00 AM |
| `"friday"` | `0 0 * * 5` | Every Friday at midnight |

Day names can be full or abbreviated (mon, tue, wed, thu, fri, sat, sun). Case-insensitive.
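The table maps one-to-one onto cron's five fields (minute, hour, day of month, month, day of week). As a rough illustration of that mapping only (this is not Flow's actual parser, and it handles just the forms listed, with full day names):

```python
# Cron day-of-week numbers, Sunday = 0.
DAYS = {"monday": 1, "tuesday": 2, "wednesday": 3, "thursday": 4,
        "friday": 5, "saturday": 6, "sunday": 0}

def every_to_cron(phrase):
    """Translate the human-readable schedules from the table above into
    cron expressions. Sketch of the mapping, not Flow's parser."""
    words = phrase.lower().split()
    if words[0] in ("hour", "day") or words[0] in DAYS:
        hour = minute = 0
        if "at" in words:                      # optional "at HH:MM" suffix
            hh, mm = words[words.index("at") + 1].split(":")
            hour, minute = int(hh), int(mm)
        if words[0] == "hour":
            return "0 * * * *"                 # on the hour, every hour
        if words[0] == "day":
            return f"{minute} {hour} * * *"
        return f"{minute} {hour} * * {DAYS[words[0]]}"
    n, unit = int(words[0]), words[1]          # "5 minutes" / "2 hours"
    return f"*/{n} * * * *" if unit.startswith("minute") else f"0 */{n} * * *"

print(every_to_cron("day at 9:00"))   # -> 0 9 * * *
```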
### Examples

```shell
# Run every 5 minutes with mock services
flow schedule my-workflow.flow --every "5 minutes" --mock --verbose

# Run daily at 9 AM with input data
flow schedule my-report.flow --every "day at 9:00" --input '{"region": "us-east"}'

# Run with cron and write logs to a directory
flow schedule my-monitor.flow --cron "*/10 * * * *" --output-log ./logs/

# Weekly report every Monday at 9 AM
flow schedule my-report.flow --every "monday at 9:00" --verbose
```

Press Ctrl+C to stop the scheduler cleanly.
## Environment variables

Flow reads `.env` files automatically from the current directory. You can also set environment variables in your shell:

```shell
export API_KEY=your-key-here
flow run my-workflow.flow
```

Access them in your workflow with `env.VARIABLE_NAME`.
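Equivalently, the same key can live in a `.env` file instead of a shell export (`API_KEY` is just the illustrative name from the example above):

```shell
# .env in the current directory, read automatically by flow
API_KEY=your-key-here
```

The workflow then reads it as `env.API_KEY`.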