CLI Commands Reference

Complete reference of all CLI commands for executing and managing tasks.

Task Execution

apflow run

Execute a task or batch of tasks:

apflow run <batch_id> [OPTIONS]

Options:

  • --tasks <json> - Task definition in JSON format (required)
  • --inputs <json> - Task inputs as JSON (optional)
  • --tags <tag1,tag2> - Task tags (optional)
  • --priority <priority> - Task priority: low, normal, high (default: normal)
  • --user-id <id> - User ID (optional)
  • --timeout <seconds> - Execution timeout (default: 300)
  • --retry-count <count> - Retry failed tasks (default: 0)
  • --retry-delay <seconds> - Delay between retries (default: 1)
  • --parallel-count <count> - Run multiple tasks in parallel (default: 1)
  • --demo-mode - Run in demo mode (for testing)

Examples:

Basic execution:

apflow run batch-001 --tasks '[
  {
    "id": "task1",
    "name": "Check CPU",
    "schemas": {"method": "system_info_executor"},
    "inputs": {"resource": "cpu"}
  }
]'

With inputs and tags:

apflow run batch-002 --tasks '[{...}]' \
  --inputs '{"config": "value"}' \
  --tags production,monitoring \
  --priority high

Parallel execution:

apflow run batch-003 --tasks '[{...}, {...}, {...}]' \
  --parallel-count 3

With retry:

apflow run batch-004 --tasks '[{...}]' \
  --retry-count 3 \
  --retry-delay 5
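
With an execution timeout (an illustrative combination of the flags documented above; the batch ID is a placeholder):

apflow run batch-005 --tasks '[{...}]' \
  --timeout 600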

Task Querying

apflow tasks list

List all tasks in the database:

apflow tasks list [OPTIONS]

Options:

  • --user-id <id> - Filter by user ID
  • --status <status> - Filter by status: pending, running, completed, failed, cancelled
  • --batch-id <id> - Filter by batch ID
  • --limit <count> - Limit results (default: 100)
  • --offset <count> - Skip N results (default: 0)
  • --sort <field> - Sort by field: created_at, updated_at, status
  • --reverse - Reverse sort order

Examples:

List all tasks:

apflow tasks list

List by user:

apflow tasks list --user-id user123

List failed tasks:

apflow tasks list --status failed

List with pagination:

apflow tasks list --limit 50 --offset 0

List and sort by date:

apflow tasks list --sort created_at --reverse

apflow tasks status

Get status of a specific task:

apflow tasks status <task_id> [OPTIONS]

Options:

  • --include-details - Include full task details
  • --watch - Watch for changes (exit with Ctrl+C)
  • --watch-interval <seconds> - Polling interval when watching (default: 1)

Examples:

Check status:

apflow tasks status task-001

With full details:

apflow tasks status task-001 --include-details

Watch for completion:

apflow tasks status task-001 --watch
# Press Ctrl+C to stop watching
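
Watch with a slower polling interval (illustrative; combines the --watch and --watch-interval flags documented above):

apflow tasks status task-001 --watch --watch-interval 5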

apflow tasks watch

Monitor task execution in real-time:

apflow tasks watch [OPTIONS]

Options:

  • --task-id <id> - Watch specific task
  • --all - Watch all running tasks
  • --batch-id <id> - Watch tasks in batch
  • --user-id <id> - Watch user's tasks
  • --interval <seconds> - Polling interval (default: 1)

Examples:

Watch single task:

apflow tasks watch --task-id task-001

Watch all running tasks:

apflow tasks watch --all

Watch batch:

apflow tasks watch --batch-id batch-001

Watch with slower polling:

apflow tasks watch --all --interval 5
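
Watch a user's tasks (illustrative; the user ID is a placeholder):

apflow tasks watch --user-id alice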

apflow tasks history

View task execution history:

apflow tasks history <task_id> [OPTIONS]

Options:

  • --user-id <id> - Filter by user
  • --days <n> - Show last N days (default: 7)
  • --limit <count> - Limit results (default: 100)

Examples:

View task history:

apflow tasks history task-001

View recent history:

apflow tasks history task-001 --days 30
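
Limit the number of history entries (illustrative; combines the --days and --limit flags documented above):

apflow tasks history task-001 --days 30 --limit 20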

Task Cancellation

apflow tasks cancel

Cancel a running task:

apflow tasks cancel <task_id> [OPTIONS]

Options:

  • --force - Force cancellation even if stuck
  • --reason <text> - Cancellation reason
  • --wait - Wait for cancellation to complete (default: 5 seconds)

Examples:

Cancel task:

apflow tasks cancel task-001

Force cancel:

apflow tasks cancel task-001 --force

Cancel with reason:

apflow tasks cancel task-001 --reason "Incorrect parameters"
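
Cancel and wait for the cancellation to complete (illustrative; --wait is documented above):

apflow tasks cancel task-001 --wait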

Task Management

apflow tasks create

Create a task without executing:

apflow tasks create [OPTIONS]

Options:

  • --name <name> - Task name (required)
  • --method <method> - Executor method (required)
  • --inputs <json> - Task inputs as JSON
  • --tags <tags> - Task tags
  • --description <text> - Task description
  • --batch-id <id> - Batch ID

Examples:

Create task:

apflow tasks create --name "CPU Check" --method system_info_executor \
  --inputs '{"resource": "cpu"}'

With batch and tags:

apflow tasks create --name "Memory Check" --method system_info_executor \
  --batch-id batch-001 --tags monitoring,system
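
With a description (illustrative values; --description is documented above):

apflow tasks create --name "Disk Check" --method system_info_executor \
  --inputs '{"resource": "disk"}' \
  --description "Daily disk usage check"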

apflow tasks update

Update task configuration:

apflow tasks update <task_id> [OPTIONS]

Options:

  • --name <name> - Update task name
  • --inputs <json> - Update task inputs
  • --tags <tags> - Update task tags
  • --status <status> - Update task status
  • --description <text> - Update description
  • --validate - Validate changes before applying

Examples:

Update task name:

apflow tasks update task-001 --name "New Name"

Update inputs:

apflow tasks update task-001 --inputs '{"resource": "memory"}'

Update with validation:

apflow tasks update task-001 --inputs '{"resource": "disk"}' --validate
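
Update status (illustrative; --status is documented above):

apflow tasks update task-001 --status pending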

apflow tasks delete

Delete a task:

apflow tasks delete <task_id> [OPTIONS]

Options:

  • --force - Delete without confirmation
  • --reason <text> - Deletion reason
  • --keep-logs - Keep execution logs after deletion

Examples:

Delete task:

apflow tasks delete task-001

Force delete:

apflow tasks delete task-001 --force

Delete and keep logs:

apflow tasks delete task-001 --force --keep-logs

apflow tasks copy

Copy a task:

apflow tasks copy <task_id> [OPTIONS]

Options:

  • --task-id <id> - New task ID (auto-generated if not provided)
  • --batch-id <id> - Target batch
  • --increment-name - Append " (copy)" to name
  • --clear-status - Start with pending status

Examples:

Copy task:

apflow tasks copy task-001

Copy to new batch:

apflow tasks copy task-001 --batch-id batch-002

Copy with incremented name:

apflow tasks copy task-001 --increment-name

Copy and reset status:

apflow tasks copy task-001 --clear-status
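
Copy with an explicit new ID (illustrative; the new ID is a placeholder passed to --task-id, documented above):

apflow tasks copy task-001 --task-id task-001-copy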

Flow Management

apflow flow run

Execute a flow (batch of tasks):

apflow flow run <flow_id> [OPTIONS]

Options:

  • --tasks <json> - Task definitions (required)
  • --inputs <json> - Flow inputs
  • --parallel-count <count> - Tasks to run in parallel
  • --skip-failed - Continue even if a task fails
  • --timeout <seconds> - Flow timeout

Examples:

Run flow:

apflow flow run flow-001 --tasks '[{...}, {...}]'

Run with parallelism:

apflow flow run flow-002 --tasks '[...]' --parallel-count 3

Skip failed tasks:

apflow flow run flow-003 --tasks '[...]' --skip-failed
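
Run with a flow timeout (illustrative values; --timeout is documented above):

apflow flow run flow-004 --tasks '[...]' --timeout 600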

apflow flow status

Get flow execution status:

apflow flow status <flow_id>

Examples:

Check flow status:

apflow flow status flow-001

apflow flow cancel

Cancel a flow:

apflow flow cancel <flow_id> [OPTIONS]

Options:

  • --force - Force cancellation
  • --cancel-tasks - Cancel remaining tasks in flow

Examples:

Cancel flow:

apflow flow cancel flow-001

Cancel with tasks:

apflow flow cancel flow-001 --cancel-tasks

Query and Filtering

Common Filter Patterns

Filter by status:

apflow tasks list --status running
apflow tasks list --status completed
apflow tasks list --status failed
apflow tasks list --status cancelled

Filter by user:

apflow tasks list --user-id alice
apflow tasks list --user-id bob

Filter by batch:

apflow tasks list --batch-id batch-001
apflow tasks list --batch-id batch-002

Combine filters:

apflow tasks list --batch-id batch-001 --status failed
apflow tasks list --user-id alice --status running
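
Combine filters with sorting and pagination (illustrative; all flags are documented under apflow tasks list):

apflow tasks list --user-id alice --status completed \
  --sort created_at --reverse --limit 20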

Executor Methods

Commonly available executor methods:

system_info_executor

Get system information:

apflow run batch --tasks '[{
  "id": "t1",
  "name": "CPU Info",
  "schemas": {"method": "system_info_executor"},
  "inputs": {"resource": "cpu"}
}]'

Inputs:

  • resource: cpu, memory, disk, network

http_executor

Make HTTP requests:

apflow run batch --tasks '[{
  "id": "t1",
  "name": "API Call",
  "schemas": {"method": "http_executor"},
  "inputs": {
    "url": "https://api.example.com/data",
    "method": "GET"
  }
}]'

Inputs:

  • url: Target URL
  • method: GET, POST, PUT, DELETE
  • headers: HTTP headers (optional)
  • body: Request body (optional)
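
A POST request with headers and a body (illustrative values; the URL is a placeholder and the fields are those documented above):

apflow run batch --tasks '[{
  "id": "t1",
  "name": "Create Record",
  "schemas": {"method": "http_executor"},
  "inputs": {
    "url": "https://api.example.com/records",
    "method": "POST",
    "headers": {"Content-Type": "application/json"},
    "body": {"name": "example"}
  }
}]'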

command_executor

Execute shell commands:

apflow run batch --tasks '[{
  "id": "t1",
  "name": "Run Script",
  "schemas": {"method": "command_executor"},
  "inputs": {
    "command": "ls -la",
    "timeout": 30
  }
}]'

Inputs:

  • command: Shell command to execute
  • timeout: Timeout in seconds

custom_executor

Custom business logic:

Implement custom executors by extending the executor framework. See Extending Guide for details.

Task Input Format

Task definitions are written in JSON:

{
  "id": "unique-task-id",
  "name": "Human Readable Name",
  "schemas": {
    "method": "executor_method_name"
  },
  "inputs": {
    "param1": "value1",
    "param2": "value2"
  },
  "tags": ["tag1", "tag2"],
  "priority": "high"
}

Fields:

  • id: Unique task identifier
  • name: Human-readable task name
  • schemas.method: Executor method to use
  • inputs: Method-specific parameters (object)
  • tags: Optional tags for organization
  • priority: low, normal, high (optional)
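
A full definition passed to apflow run (illustrative; uses only the fields documented above):

apflow run batch-010 --tasks '[{
  "id": "cpu-check-1",
  "name": "CPU Check",
  "schemas": {"method": "system_info_executor"},
  "inputs": {"resource": "cpu"},
  "tags": ["monitoring"],
  "priority": "high"
}]'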

Output Format

Task output depends on the executor method. Examples:

system_info_executor output:

{
  "cpu": 45.2,
  "memory": 8192,
  "disk": 102400,
  "timestamp": "2024-01-15T10:30:00Z"
}

http_executor output:

{
  "status_code": 200,
  "headers": {...},
  "body": {...},
  "elapsed_time": 0.234
}

command_executor output:

{
  "stdout": "...",
  "stderr": "",
  "return_code": 0,
  "elapsed_time": 1.234
}

Command Aliases

Shorter versions of common commands:

  • apflow tasks list → apflow tasks ls
  • apflow tasks status → apflow tasks st
  • apflow tasks cancel → apflow tasks c
  • apflow tasks watch → apflow tasks w
  • apflow flow run → apflow f run

Error Handling

Common Errors

Error: "Task not found"

# Check if task ID is correct
apflow tasks list | grep task-id

Error: "Database connection error"

# Check database configuration
echo $DATABASE_URL
# Or check DuckDB file
ls ~/.aipartnerup/data/apflow.duckdb

Error: "Invalid task format"

# Validate JSON
echo '[{"id":"t1","name":"Task","schemas":{"method":"system_info_executor"},"inputs":{}}]' | python -m json.tool

Error: "Executor method not found"

# Check available methods
apflow executor list

Debugging

Debug mode:

# Enable verbose logging (the APFLOW_-prefixed variable is preferred)
export APFLOW_LOG_LEVEL=DEBUG
apflow run batch --tasks '[...]'

# Or use generic LOG_LEVEL (fallback)
export LOG_LEVEL=DEBUG
apflow run batch --tasks '[...]'

Check task details:

# Get full task information
apflow tasks status task-001 --include-details

Monitor execution:

# Watch task execution in real-time
apflow tasks watch --task-id task-001

Summary

  • Execute tasks: apflow run with JSON task definitions
  • Query tasks: apflow tasks list, status, watch
  • Manage tasks: create, update, delete, copy
  • Cancel tasks: apflow tasks cancel with force option
  • Monitor flows: apflow flow commands
  • Debug issues: Enable debug mode and check logs

All commands support JSON input/output for integration with other tools.
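
For example, a result list can be piped into standard JSON tooling (a sketch, assuming the command emits JSON on stdout and that jq is installed):

apflow tasks list --status failed | jq '.'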