Reuse your Codex CLI authentication to power Droid (Factory CLI) and any OpenAI-compatible application through a local proxy server.
English | 中文
Quick start:

```bash
git clone https://github.com/tytsxai/cli-proxy-droid-integration.git
cd cli-proxy-droid-integration
cp CLIProxyAPI/config.yaml.example CLIProxyAPI/config.yaml
# Edit config.yaml with your API credentials
./setup.sh
droid  # Then type /model to select a custom model
```

The Problem: You have a valid Codex CLI subscription/authentication, but you want to use it with other tools like Droid (Factory CLI), or to build your own applications.
The Solution: This proxy server extracts your Codex CLI credentials and exposes them as a standard OpenAI-compatible API on localhost:8317.
```
┌─────────────────┐     ┌─────────────────────┐     ┌──────────────────┐
│  Droid / Apps   │────▶│    Local Proxy      │────▶│   AI Provider    │
│                 │     │  (localhost:8317)   │     │   (via Codex)    │
└─────────────────┘     └─────────────────────┘     └──────────────────┘
```
If you have Codex CLI credentials and want to use them in Droid:
Codex CLI Auth → This Proxy → Droid (Factory CLI)
The proxy exposes a standard OpenAI-compatible API. You can use it with:
- Python applications using the `openai` library
- Node.js applications using the OpenAI SDK
- Any tool that supports custom OpenAI endpoints
- curl for direct API calls
```bash
# Example: Use with any OpenAI-compatible client
export OPENAI_API_BASE=http://localhost:8317/v1
export OPENAI_API_KEY=dummy

# Now any OpenAI client will use your Codex credentials
python your_app.py
```

Run the proxy once, use it everywhere:
```
                    ┌─── Droid
                    │
localhost:8317 ─────┼─── Your Python Script
                    │
                    ├─── VS Code Extension
                    │
                    └─── Any OpenAI Client
```
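Any one of those clients only needs the two values exported above. As an illustration (not part of this repository; the fallback values are illustrative), a minimal Python sketch that lists the models the proxy exposes:

```python
import os

from openai import OpenAI

# Build a client from the variables exported above. openai-python v1.x does
# not read OPENAI_API_BASE by itself, so pass the base URL explicitly.
client = OpenAI(
    base_url=os.environ.get("OPENAI_API_BASE", "http://localhost:8317/v1"),
    api_key=os.environ.get("OPENAI_API_KEY", "dummy"),
)

# Same information as `curl http://localhost:8317/v1/models`.
for model in client.models.list():
    print(model.id)
```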
Before you begin, ensure you have:
- Factory CLI (Droid) installed on your system
- API credentials from your third-party provider
- macOS or Linux operating system
- Basic terminal knowledge
The proxy auto-detects credentials from these locations, in order; a sketch of the lookup logic follows the table:
| Priority | Location | Key |
|---|---|---|
| 1 | Environment | CODEX_API_KEY or OPENAI_API_KEY |
| 2 | ~/.codex/config.toml | experimental_bearer_token |
| 3 | ~/.codex/auth.json | OPENAI_API_KEY |
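The bundled scripts handle this detection for you; the sketch below only illustrates that lookup order in Python, and the real implementation may differ in its details:

```python
import json
import os
import tomllib  # Python 3.11+
from pathlib import Path

def detect_codex_credentials() -> str | None:
    """Resolve a key using the same priority order as the table above."""
    # 1. Environment variables
    for var in ("CODEX_API_KEY", "OPENAI_API_KEY"):
        if os.environ.get(var):
            return os.environ[var]

    # 2. ~/.codex/config.toml -> experimental_bearer_token
    config_path = Path.home() / ".codex" / "config.toml"
    if config_path.is_file():
        token = tomllib.loads(config_path.read_text()).get("experimental_bearer_token")
        if token:
            return token

    # 3. ~/.codex/auth.json -> OPENAI_API_KEY
    auth_path = Path.home() / ".codex" / "auth.json"
    if auth_path.is_file():
        key = json.loads(auth_path.read_text()).get("OPENAI_API_KEY")
        if key:
            return key

    return None

print("found credentials" if detect_codex_credentials() else "no credentials found")
```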
Clone the repository:

```bash
cd ~/Desktop
git clone https://github.com/tytsxai/cli-proxy-droid-integration.git
cd cli-proxy-droid-integration
```

Copy the example configuration:

```bash
cp CLIProxyAPI/config.yaml.example CLIProxyAPI/config.yaml
```

Open CLIProxyAPI/config.yaml in your editor:
```bash
nano CLIProxyAPI/config.yaml
# Or use: vim, code, or any text editor
```

Find the `codex-api-key` section and update it:
```yaml
codex-api-key:
  - api-key: "YOUR_ACTUAL_API_KEY"
    prefix: "provider"
    base-url: "https://your-provider-api-endpoint.com/api"
```
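Optionally, you can sanity-check the edited file before running setup. This is a convenience sketch, not part of the project; it assumes the PyYAML package and the structure shown above:

```python
import yaml  # pip install pyyaml

with open("CLIProxyAPI/config.yaml") as f:
    config = yaml.safe_load(f) or {}

# Warn about placeholder values left over from the example config.
for entry in config.get("codex-api-key", []):
    key = entry.get("api-key", "")
    if not key or key == "YOUR_ACTUAL_API_KEY":
        print(f"prefix {entry.get('prefix')!r}: api-key still looks like a placeholder")
    else:
        print(f"prefix {entry.get('prefix')!r}: api-key set, base-url {entry.get('base-url')}")
```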
Make the setup script executable and run it:

```bash
chmod +x setup.sh
./setup.sh
```

This script will:
- Sync your credentials
- Configure Factory CLI custom models
- Start the local proxy server on port 8317 (see the readiness-check sketch after this list)
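If you script around setup.sh (for example, launching Droid or your own app right after it), you may want to wait until port 8317 actually accepts connections. A minimal, standard-library-only sketch:

```python
import socket
import time

def wait_for_proxy(host: str = "localhost", port: int = 8317, timeout: float = 30.0) -> bool:
    """Poll until the proxy accepts TCP connections or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(0.5)
    return False

if __name__ == "__main__":
    print("proxy is up" if wait_for_proxy() else "proxy not reachable on port 8317")
```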
Next, verify the integration:

```bash
./verify-integration.sh
```

You should see all checks passing with green checkmarks.
Launch Droid:

```bash
droid
```

Once inside Droid:
- Type `/model` to see available models
- Select your custom model (e.g., `gpt-5.2`)
- Start chatting!
Day-to-day commands:

```bash
# Start the proxy (if not running)
./setup.sh

# Check status
./verify-integration.sh

# View proxy logs
tail -f CLIProxyAPI/proxy.log

# Stop the proxy
pkill -f cli-proxy-api
```

Test the API directly with curl:

```bash
# Test the API
curl http://localhost:8317/v1/models

# Chat completion
curl http://localhost:8317/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5.2", "messages": [{"role": "user", "content": "Hello"}]}'
```

Or call the proxy from Python with the OpenAI SDK:

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8317/v1",
    api_key="dummy"
)

response = client.chat.completions.create(
    model="gpt-5.2",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
```

If something does not work, check the common causes and fixes below.

Cause: The script cannot find your API credentials.
Solution:
```bash
# Check if credentials exist
ls -la ~/.codex/
cat ~/.codex/config.toml
```

Cause: Another process is using the port.
Solution:
```bash
# Find what's using the port
lsof -i :8317

# Kill the process
kill <PID>

# Restart
./setup.sh
```

Cause: Factory CLI configuration may be incorrect.
Solution:
```bash
# Check configuration
cat ~/.factory/config.json

# Re-run setup
./setup.sh
```

Cause: Network issues or incorrect API endpoint.
Solution:
```bash
# Check proxy logs
tail -50 CLIProxyAPI/proxy.log

# Test API endpoint
curl http://localhost:8317/v1/models
```

If you encounter issues not covered above, you can ask any AI assistant for help:
- Copy this entire README file
- Paste it to ChatGPT, Claude, or any AI assistant
- Describe your problem
The AI will understand the project structure and help you troubleshoot.
Q: Do I need to keep the proxy running?
A: Yes. The proxy must be running for Droid or other apps to connect. Run ./setup.sh to start it.
Q: Can I change the port?
A: Yes. Edit CLIProxyAPI/config.yaml and change port: 8317 to your preferred port.
Q: Is my API key secure?
A: Yes. The key is stored locally in ~/.cli-proxy-api/ with restricted permissions (600). It never leaves your machine.
Q: Can I use this with Claude/Anthropic API?
A: This proxy is designed for OpenAI-compatible APIs. For Anthropic, you'd need a different setup.
```
.
├── setup.sh                 # One-click setup script
├── sync-credentials.sh      # Credential synchronization
├── verify-integration.sh    # Configuration verification
├── CLIProxyAPI/
│   ├── cli-proxy-api        # Proxy server binary
│   └── config.yaml.example  # Configuration template
├── README.md                # English documentation
├── README_zh-CN.md          # Chinese documentation
├── CONTRIBUTING.md          # Contribution guidelines
└── LICENSE                  # MIT License
```
| File | Purpose |
|---|---|
| ~/.factory/config.json | Droid custom model settings |
| ~/.cli-proxy-api/codex-yunyi.json | Proxy authentication token |
| CLIProxyAPI/config.yaml | Proxy server configuration |
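verify-integration.sh is the supported way to check your setup; if you also want to confirm these files from your own code, a minimal existence check over the same paths (run from the repository root) could look like this:

```python
from pathlib import Path

# Paths from the table above; CLIProxyAPI/config.yaml is relative to the repo root.
key_files = [
    Path.home() / ".factory" / "config.json",
    Path.home() / ".cli-proxy-api" / "codex-yunyi.json",
    Path("CLIProxyAPI/config.yaml"),
]

for path in key_files:
    print(f"{path}: {'found' if path.is_file() else 'missing'}")
```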
This project uses the following open-source tools:
- CLIProxyAPI - The core proxy server component
- Factory CLI (Droid) - The CLI tool this integrates with
Special thanks to all contributors and the open-source community.
Contributions are welcome! Please read CONTRIBUTING.md for guidelines.
This project is licensed under the MIT License - see the LICENSE file for details.