Open Responses API Examples
Below are practical examples showing how to use the Julep Open Responses API for various use cases.
- The Open Responses API requires self-hosting. See the installation guide below.
- The API is in Alpha and subject to change. Check back frequently for updates.
- For more context, see the OpenAI Responses API documentation.
API Key Configuration
RESPONSE_API_KEY is the API key that you set in the .env file.
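For illustration, here is a minimal sketch of reading that key in Python, assuming the python-dotenv package and a .env file in your working directory:

```python
import os

from dotenv import load_dotenv  # assumes the python-dotenv package is installed

load_dotenv()  # reads key=value pairs from the .env file into the environment
RESPONSE_API_KEY = os.environ["RESPONSE_API_KEY"]
```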
Model Selection
- When using models from providers other than OpenAI, you may need to add the provider/ prefix to the model name, as shown in the sketch below.
- For supported providers, see the LiteLLM Providers documentation.
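For example, model identifiers might look like this (the specific model names are illustrative and depend on which providers your deployment has configured):

```python
# OpenAI models can usually be referenced directly by name.
openai_model = "gpt-4o-mini"

# Models from other providers typically take a provider/ prefix (LiteLLM-style).
anthropic_model = "anthropic/claude-3-5-sonnet-20241022"
gemini_model = "gemini/gemini-1.5-flash"
```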
Environment Setup
- Add the relevant provider keys to the .env file to use their respective models, as in the sketch below.
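Once the .env file is loaded, those provider keys are ordinary environment variables. The variable names below follow common provider conventions and are assumptions; adjust them to match your .env file:

```python
import os

# Provider keys picked up from the .env file (names are illustrative).
openai_key = os.environ.get("OPENAI_API_KEY")
anthropic_key = os.environ.get("ANTHROPIC_API_KEY")
```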
Setup
First, set up your environment and create a client:
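Here is a minimal sketch, assuming the OpenAI Python SDK pointed at a self-hosted Open Responses instance; the base URL and port are placeholders for your deployment:

```python
import os

from dotenv import load_dotenv  # assumes the python-dotenv package
from openai import OpenAI

load_dotenv()  # pick up RESPONSE_API_KEY (and provider keys) from the .env file

client = OpenAI(
    base_url="http://localhost:8080/",       # placeholder URL of your self-hosted Open Responses API
    api_key=os.environ["RESPONSE_API_KEY"],  # the key you set in the .env file
)
```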
Using Reasoning Features
Enhance your model’s reasoning capabilities for solving complex problems:
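A sketch of what this can look like, continuing from the setup above; the model name, effort level, and the Responses-style reasoning parameter are assumptions and may vary by model and deployment:

```python
response = client.responses.create(
    model="o3-mini",  # placeholder; use any reasoning-capable model your deployment exposes
    input="A bat and a ball cost $1.10 together. The bat costs $1.00 more than the ball. How much is the ball?",
    reasoning={"effort": "high"},  # ask for more deliberate reasoning (support varies by model)
)

print(response.output_text)
```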
Using Web Search Tool
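A sketch of a web search call; the tool type name below follows OpenAI’s Responses API and is an assumption for a self-hosted setup, so check which tools your deployment actually supports:

```python
response = client.responses.create(
    model="gpt-4o-mini",  # placeholder model
    input="What are the latest developments in open-source language models?",
    tools=[{"type": "web_search_preview"}],  # assumed tool type
)

print(response.output_text)
```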
Maintaining Conversation History
Create a continuous conversation by referencing previous responses:
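A sketch, continuing from the setup above and assuming the previous_response_id mechanism from the Responses API:

```python
first = client.responses.create(
    model="gpt-4o-mini",
    input="Suggest three names for an open-source robotics project.",
)

# Reference the earlier response so the model sees the prior turn as context.
follow_up = client.responses.create(
    model="gpt-4o-mini",
    input="Which of those names is easiest to pronounce?",
    previous_response_id=first.id,
)

print(follow_up.output_text)
```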
Retrieving Past Responses
Access previously created responses by their ID:
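A sketch using the retrieve call on the OpenAI-compatible client; the response ID here is whatever you stored when the response was created (for example, first.id from the previous example):

```python
response_id = first.id  # an ID you saved earlier

retrieved = client.responses.retrieve(response_id)
print(retrieved.id)
print(retrieved.output_text)
```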
Next Steps
You’ve got Open Responses running – here’s what to explore next:
- Open Responses API Concepts – learn about core concepts and how Open Responses structures these building blocks in your applications.
- Open Responses API Roadmap – see upcoming features.
- OpenAI’s Responses API Documentation – more insight into the original API that inspired Julep’s Responses.
- OpenAI Agents SDK – explore OpenAI’s Agents SDK, which works with Julep’s Open Responses API.
- Julep – learn more about Julep and its features.
- Julep’s GitHub Repository – contribute to the project.