# CLI Reference
Director-AI ships a command-line interface for scoring, serving, benchmarking, and project scaffolding.
## Commands

### Scoring

```bash
# Score a single prompt/response pair
director-ai review "What is the capital of France?" "The capital is Berlin."

# Process with the agent (generate + score)
director-ai process "What is the refund policy?"

# Batch-score from a JSONL file
director-ai batch input.jsonl --output results.jsonl
```
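The batch command reads one prompt/response pair per line. A minimal sketch of building such a file, assuming a schema with `prompt` and `response` fields (the field names are an assumption, not confirmed by this page):

```python
import json

# Hypothetical batch-input schema: one JSON object per line with
# "prompt" and "response" keys. Check your Director-AI version's
# documentation for the exact field names it expects.
rows = [
    {"prompt": "What is the capital of France?",
     "response": "The capital is Berlin."},
    {"prompt": "What is the refund policy?",
     "response": "Refunds are issued within 30 days."},
]

with open("input.jsonl", "w") as f:
    for row in rows:
        f.write(json.dumps(row) + "\n")
```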
### Server

```bash
# Start the REST server
director-ai serve --port 8080 --workers 4

# Start the gRPC server
director-ai grpc --port 50051

# Health check
director-ai health --url http://localhost:8080
```
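For scripting readiness checks against a running server, you can inspect the health response yourself. The payload shape below is an illustrative assumption, not the documented API:

```python
import json

# Hypothetical health payload -- the actual fields returned by the
# Director-AI server may differ.
payload = '{"status": "ok", "model_loaded": true, "uptime_s": 12.5}'

def is_healthy(body: str) -> bool:
    """Return True when the health response reports status 'ok'."""
    return json.loads(body).get("status") == "ok"
```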
### Configuration

```bash
# Show current config
director-ai config

# Show a named profile
director-ai config --profile medical

# Generate a YAML config file
director-ai config --export config.yaml
```
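An exported `config.yaml` might look roughly like the sketch below. Every key here is an illustrative assumption; run the export command to see the real schema for your installation:

```yaml
# Hypothetical config shape -- keys and defaults are assumptions.
profile: medical
server:
  port: 8080
  workers: 4
scoring:
  threshold: 0.7
logging:
  verbose: false
```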
### Project Scaffolding

```bash
# Create a new project with config, facts, and guard script
director-ai quickstart --profile medical
cd director_guard/
python guard.py
```
### Benchmarking

```bash
# Run the latency benchmark
director-ai bench

# Run with a specific dataset
director-ai bench --dataset e2e

# Run the regression suite
python -m benchmarks.regression_suite
```
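To give a sense of what a latency benchmark reports, here is a generic sketch that times repeated calls and summarizes p50/p95 in milliseconds. This is illustrative only; the `bench` command and regression suite have their own methodology:

```python
import statistics
import time

def benchmark(fn, n=100):
    """Time n calls of fn and report p50/p95 latency in milliseconds.
    (Illustrative sketch, not Director-AI's actual benchmark harness.)"""
    samples = []
    for _ in range(n):
        t0 = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - t0) * 1000.0)
    samples.sort()
    return {
        "p50_ms": statistics.median(samples),
        "p95_ms": samples[int(0.95 * (len(samples) - 1))],
    }

# Benchmark a trivial stand-in workload.
stats = benchmark(lambda: sum(range(1000)))
```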
### Model Export

```bash
# Export to ONNX
director-ai export --format onnx --output ./models/onnx/

# Export to TensorRT
director-ai export --format tensorrt --output ./models/trt/
```
### Fine-Tuning

### Threshold Tuning

### Version
## Global Options

| Flag | Description |
|---|---|
| `--config PATH` | YAML config file |
| `--profile NAME` | Named profile (`fast`, `thorough`, `medical`, etc.) |
| `--verbose` | Enable debug logging |
| `--json` | JSON output format |