A CLI tool for setting up a self-hosted alternative to OpenAI’s Responses API. It lets you create and manage open-ended AI responses for your applications while keeping the service fully under your control.
Environment file location (default: .env in Git root or current directory)
API version (default: 0.0.1)
Ask for API configuration values (port, authentication key, timeout)
Create a .env file with your settings
Download or generate a docker-compose.yml file with the necessary services:
API server
Database
Management UI
Create a configuration file (open-responses.json) to track your settings
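As an illustration of the second step, a generated .env might look like the following. The variable names shown here are assumptions for the sake of example; the actual names are chosen by the setup command.

```
# Illustrative .env contents; actual variable names may differ.
API_PORT=8080
API_AUTH_KEY=replace-with-your-key
API_TIMEOUT=30
API_VERSION=0.0.1
```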
The CLI will automatically check for this configuration before running any other commands. If the configuration file doesn’t exist, it will prompt you to run the setup command first.
The CLI stores its configuration in open-responses.json, which can be located in:
The current directory
The parent directory
The Git repository root directory
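The lookup order above can be sketched in shell. This is a minimal illustration of the search logic, not the CLI’s actual implementation:

```shell
#!/bin/sh
# Sketch of the config lookup order: current directory, parent
# directory, then the Git repository root.
find_config() {
  git_root=$(git rev-parse --show-toplevel 2>/dev/null)
  for dir in "." ".." "$git_root"; do
    if [ -n "$dir" ] && [ -f "$dir/open-responses.json" ]; then
      printf '%s/open-responses.json\n' "$dir"
      return 0
    fi
  done
  return 1
}

# Demonstration: create a config in a scratch directory and locate it.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
echo '{}' > open-responses.json
find_config   # prints ./open-responses.json
```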
The configuration file tracks:
All user-defined settings
Environment variable values
Creation and update timestamps (both camelCase and snake_case formats are supported)
File locations and version information
When you run setup again with an existing configuration, it will let you update your settings while preserving your previous values as defaults. If timestamps are missing from an existing configuration, they’ll be added automatically when the configuration is updated.
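The exact schema of open-responses.json isn’t documented here; as an illustration, the field names below are assumptions, showing how both timestamp styles might appear:

```json
{
  "version": "0.0.1",
  "environmentFile": ".env",
  "createdAt": "2024-01-15T10:00:00Z",
  "updatedAt": "2024-03-01T12:30:00Z",
  "created_at": "2024-01-15T10:00:00Z",
  "updated_at": "2024-03-01T12:30:00Z"
}
```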
```
# List all API keys (masked):
open-responses key list

# Generate a new API key:
open-responses key generate [type]

# Update an API key:
open-responses key set <type> [value]
```
For more advanced operations, use the compose command group:
```
open-responses compose <command> [args...]
```
Available commands include:
```
# Start services with additional options:
open-responses compose up [flags]

# Stop and clean up services:
open-responses compose down [flags]

# View logs with custom options:
open-responses compose logs [flags] [SERVICE...]

# List containers:
open-responses compose ps [flags]

# Build services:
open-responses compose build [flags] [SERVICE...]

# Restart services:
open-responses compose restart [flags] [SERVICE...]

# Pull service images:
open-responses compose pull [flags] [SERVICE...]

# Execute commands in containers:
open-responses compose exec [flags] SERVICE COMMAND [ARGS...]

# Run one-off commands:
open-responses compose run [flags] SERVICE COMMAND [ARGS...]

# Validate Docker Compose configuration:
open-responses compose config [flags]

# View processes in containers:
open-responses compose top [SERVICE...]

# Monitor resource usage:
open-responses compose stats [SERVICE...]
```
Each compose command is a direct proxy to the equivalent Docker Compose command and accepts all the same flags and arguments. This provides full access to Docker Compose functionality when needed.
For detailed examples of each command, use the --help flag:
```
open-responses compose up --help
open-responses compose logs --help
```
This CLI is built with Go and compiled to native binaries for Windows, macOS, and Linux.
When installed via npm or pip, the appropriate binary for your platform is used automatically.
The service itself runs in Docker containers, providing a compatible alternative to OpenAI’s Responses API.
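To illustrate the container layout described above (API server, database, management UI), a docker-compose.yml generated by setup might be structured roughly like this. The service names, image names, ports, and database engine here are all placeholders, not the actual generated file:

```yaml
# Illustrative structure only; the file generated by setup will differ
# in image names, ports, and environment wiring.
services:
  api:
    image: example/open-responses-api:latest   # placeholder image name
    ports:
      - "8080:8080"
    env_file:
      - .env
  db:
    image: postgres:16                         # assumed database engine
    volumes:
      - db-data:/var/lib/postgresql/data
  ui:
    image: example/open-responses-ui:latest    # placeholder image name
    ports:
      - "3000:3000"
volumes:
  db-data:
```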
You can manually run code formatting and linting using these commands:
```
# Format all code
npm run format:all

# Lint all code
npm run lint:all

# Format/lint individual languages
npm run format      # JavaScript/JSON/Markdown files
npm run py:format   # Python files
npm run go:format   # Go files
npm run lint        # JavaScript files
npm run py:lint     # Python files
```