Open Responses CLI
A CLI tool for setting up a self-hosted alternative to OpenAI's Responses API. This API lets you create and manage open-ended AI responses for your applications, compatible with OpenAI's Responses API but fully under your control.

Features
- Easy setup with Docker Compose
- Compatible API endpoints with OpenAI’s Responses API
- Management UI for creating, viewing, and managing responses
- Local data storage with PostgreSQL
- Customizable authentication and timeout settings
Installation
You can install this CLI using Go, npm, or Python:

Using Go
Using npm
Using Python
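The original install commands are not reproduced above. Assuming the package is published as open-responses on each registry (the Go module path shown is a guess, so check the repository for the canonical one), the installs would look like:

```shell
# Using Go (module path is an assumption)
go install github.com/julep-ai/open-responses@latest

# Using npm (global install so the CLI is on your PATH)
npm install -g open-responses

# Using Python
pip install open-responses
```

All three routes install the same native binary; the npm and pip packages are thin wrappers around it.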
Usage
First-time Setup
Before using any other commands, run the setup command. It will:
- Ask for configuration settings with default values:
- Host (default: 127.0.0.1)
- Port (default: 8080)
- Docker tag (default: latest_responses)
- Base Docker Compose URI (default: https://u.julep.ai/responses-compose.yaml)
- Environment file location (default: .env in Git root or current directory)
- API version (default: 0.0.1)
- Ask for API configuration values (port, authentication key, timeout)
- Create a .env file with your settings
- Download or generate a docker-compose.yml file with the necessary services:
- API server
- Database
- Management UI
- Create a configuration file (open-responses.json) to track your settings
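The whole flow above is driven by a single interactive command (the prompts follow the defaults listed):

```shell
open-responses setup
```

Re-running it later updates an existing configuration rather than starting from scratch, as described below.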
Configuration File
The CLI stores its configuration in open-responses.json, which can be located in:
- The current directory
- The parent directory
- The Git repository root directory
The configuration file stores:
- All user-defined settings
- Environment variable values
- Creation and update timestamps (both camelCase and snake_case formats are supported)
- File locations and version information

If you run setup again with an existing configuration, it will let you update your settings while preserving your previous values as defaults. If timestamps are missing from an existing configuration, they'll be added automatically when the configuration is updated.
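The exact schema isn't documented here, but given what the file tracks, it plausibly looks something like the following sketch (every field name is illustrative, apart from the camelCase/snake_case timestamp convention noted above):

```json
{
  "version": "0.0.1",
  "host": "127.0.0.1",
  "port": "8080",
  "environment_file": ".env",
  "createdAt": "2025-01-01T00:00:00Z",
  "updated_at": "2025-01-02T00:00:00Z"
}
```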
API Configuration
The API service includes the following configuration options with sensible defaults:

Basic Settings
- HOST: Host address for the API (default: 127.0.0.1)
- PORT: Port for the UI service (default: 8080)
- RESPONSES_API_PORT: Port for the API service (default: 8080)
- DOCKER_TAG: Docker image tag (default: latest_responses)
- API_VERSION: API version (default: 0.0.1)
Performance & Limits
- NODE_ENV: Node.js environment (default: production)
- LOG_LEVEL: Logging level (default: info)
- REQUEST_TIMEOUT: API request timeout in ms (default: 120000, i.e. 2 minutes)
- MAX_PAYLOAD_SIZE: Maximum request payload size (default: 10mb)
- RATE_LIMIT_WINDOW: Rate limit window in ms (default: 60000, i.e. 1 minute)
- RATE_LIMIT_MAX: Maximum requests per rate limit window (default: 100)
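Collected in one place, a .env generated with the defaults above would look roughly like this (whether every option is surfaced as an environment variable is an assumption; the names and values are taken from the lists above):

```shell
HOST=127.0.0.1
PORT=8080
RESPONSES_API_PORT=8080
DOCKER_TAG=latest_responses
API_VERSION=0.0.1
NODE_ENV=production
LOG_LEVEL=info
REQUEST_TIMEOUT=120000
MAX_PAYLOAD_SIZE=10mb
RATE_LIMIT_WINDOW=60000
RATE_LIMIT_MAX=100
```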
Resource Allocation
The Docker Compose configuration also includes resource limits to ensure stable operation:
- API Service: 1 CPU, 2GB memory (min: 0.25 CPU, 512MB)
- Database: 1 CPU, 1GB memory (min: 0.1 CPU, 256MB)
- Redis: 0.5 CPU, 768MB memory (min: 0.1 CPU, 128MB)
- UI: 0.5 CPU, 512MB memory (min: 0.1 CPU, 128MB)
You can adjust these limits in the docker-compose.yml file if needed.
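In Compose V2 syntax, the API service limits above map onto a stanza like this (the service name is illustrative; the numbers mirror the list above):

```yaml
services:
  api:
    deploy:
      resources:
        limits:          # hard ceiling
          cpus: "1.0"
          memory: 2G
        reservations:    # guaranteed minimum
          cpus: "0.25"
          memory: 512M
```

The other services follow the same pattern with their respective CPU and memory figures.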
User-Friendly Commands
The CLI provides easy-to-use commands for common operations:

Starting the service
- Pulls Docker images for your specific architecture
- Starts all services in foreground mode with log streaming
- Automatically stops all services when you press Ctrl+C
- Shows the status of services after startup
- Displays access URLs for the API and admin UI
Stopping the service
Stops all services gracefully (equivalent to open-responses compose down).
Checking service status
- Running state and health status
- Uptime information
- Resource usage summary
- Access URLs
Viewing logs
- Follows logs in real-time
- Shows colorized output
- Displays last 100 lines by default
- Can target specific services
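A typical day-to-day session using the commands described above might look like this (only compose down appears verbatim earlier; the other subcommand names are assumptions based on the section titles, so run the CLI's help to confirm them):

```shell
open-responses up          # start all services, streaming logs; Ctrl+C stops them
open-responses status      # running state, uptime, resource usage, access URLs
open-responses logs api    # follow logs for one service (last 100 lines by default)
open-responses down        # stop everything (equivalent to `open-responses compose down`)
```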
Initializing a new project
- Creates directory structure (data, config, logs)
- Generates helpful documentation files
- Runs interactive configuration
- Sets up Docker Compose with best practices
Managing API keys
Updating components
- Updates Docker Compose configuration
- Pulls latest Docker images
- Backs up your configuration
Advanced Docker Compose Commands
For more advanced operations, use the compose command group, which passes commands through to Docker Compose. For details on any command, add the --help flag.
API Endpoints
Once your service is running, the following endpoints will be available:
- POST /v1/responses - Create a new response
- GET /v1/responses/{id} - Retrieve a response
- GET /v1/responses - List all responses
- DELETE /v1/responses/{id} - Delete a response

By default, the API is served at http://localhost:8080 (or your configured port).
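As a quick smoke test of the create endpoint, a request can be built with only the Python standard library. The payload shape and the Bearer auth header are assumptions (the payload mirrors OpenAI's Responses API; use whatever key scheme you configured during setup):

```python
import json
import urllib.request


def build_create_request(base_url, api_key, payload):
    """Build (but do not send) a POST /v1/responses request."""
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/v1/responses",
        data=data,
        headers={
            "Content-Type": "application/json",
            # Bearer auth is an assumption; adjust to your deployment
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


req = build_create_request(
    "http://localhost:8080",
    "my-local-key",  # hypothetical key from your .env
    {"model": "gpt-4o-mini", "input": "Say hello"},  # illustrative payload
)
print(req.full_url)      # http://localhost:8080/v1/responses
print(req.get_method())  # POST
# urllib.request.urlopen(req) would send it to a running service
```

Building the request separately from sending it makes the example runnable even without a live service, and mirrors how you might unit-test a thin client wrapper.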
Requirements
- Docker must be installed on your system
- Docker Compose must be installed (either as a standalone binary or integrated plugin)
- Docker Compose V2 (≥ 2.21.0) is recommended for best compatibility
- Docker Compose V1 is supported but with limited functionality
- No other runtime dependencies required (no Node.js or Python needed for running the service)
How it works
This CLI is built with Go and compiled to native binaries for Windows, macOS, and Linux. When installed via npm or pip, the appropriate binary for your platform is used automatically. The service itself runs in Docker containers, providing a compatible alternative to OpenAI's Responses API.

Development
Project Structure
- main.go: Core CLI functionality built with Go
- open_responses/__init__.py: Python wrapper for binary distribution
- scripts/postinstall.js: Node.js script for platform detection and setup
- bin/: Directory for compiled binaries
Building from Source
Build for your current platform, or cross-compile for all supported platforms; the compiled binaries land in the bin/ directory:
- bin/open-responses-linux
- bin/open-responses-macos
- bin/open-responses-win.exe
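With Go installed, the builds would look something like this (GOARCH choices are illustrative; the output names match the list above):

```shell
# Build for the current platform
go build -o bin/open-responses .

# Cross-compile for each supported platform
GOOS=linux   GOARCH=amd64 go build -o bin/open-responses-linux .
GOOS=darwin  GOARCH=amd64 go build -o bin/open-responses-macos .
GOOS=windows GOARCH=amd64 go build -o bin/open-responses-win.exe .
```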
Installing for Development
For Python:

Development Guidelines
This project follows strict formatting and linting guidelines to maintain code quality. We use:
- Go: standard formatting with go fmt
- Python: Ruff for linting and formatting
- JavaScript: ESLint and Prettier for linting and formatting
Setting Up Development Environment
- Install development dependencies:
- The Git hooks will automatically run formatting and linting checks before each commit.