10 changes: 10 additions & 0 deletions Cargo.lock


1 change: 1 addition & 0 deletions Cargo.toml
@@ -15,3 +15,4 @@
async-stream = "0.3"
tracing = "0.1.41"
tracing-subscriber = { version = "0.3.19", features = ["fmt", "env-filter"] }
config = { version = "0.14.0", features = ["yaml"] }
minijinja = "2.11.0"
158 changes: 132 additions & 26 deletions README.md
@@ -1,64 +1,170 @@
# CCORP - Anthropic to OpenAI/OpenRouter Adapter

## Use Claude Code with any OpenRouter model.

CCORP (Claude Code OpenRouter Proxy) is a high-performance Rust application that acts as an adapter between the Anthropic API format and the OpenAI/OpenRouter API format. It provides a seamless bridge for applications expecting Anthropic's API to work with OpenRouter's extensive model collection.

## Features

- **API Translation**: Converts Anthropic API requests to OpenAI/OpenRouter format and vice versa
- **Streaming Support**: Full support for both streaming and non-streaming API calls
- **Model Mapping**: Flexible configuration to map Claude models to any OpenRouter-supported model
- **Web UI**: Built-in web interface for easy model switching at runtime
- **Request Logging**: Optional logging of all requests and responses for debugging

## Installation

![assets/images.jpg](assets/images.jpg)

### Via Cargo

#### Prerequisites
- Rust and Cargo: [https://www.rust-lang.org/tools/install](https://www.rust-lang.org/tools/install)

```bash
cargo install --git https://github.com/terhechte/CCORP --bin ccor
```

### Via Releases

Download the latest binary from the [releases page](https://github.com/terhechte/CCORP/releases).

## Configuration

### Step 1: Environment Setup

Create a `.env` file in the root directory with your OpenRouter API key:

```env
OPENROUTER_API_KEY=your_openrouter_api_key_here
```

### Step 2: Model Configuration

Create a `config.json` file to configure the port and model mappings:

```json
{
  "port": 3000,
  "models": {
    "haiku": "mistralai/mistral-7b-instruct",
    "sonnet": "meta-llama/llama-3.2-90b-vision-instruct",
    "opus": "openai/gpt-4o"
  }
}
```

You can map Claude models to any model available on OpenRouter.
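The routing is substring-based (see `src/anthropic_to_openai.rs`): any requested model name containing `haiku`, `sonnet`, or `opus` is swapped for the corresponding configured OpenRouter model, and any other name passes through unchanged. A minimal standalone sketch of that logic (the `ModelMap` struct here is illustrative; the real code reads the fields from the loaded config):

```rust
/// Illustrative stand-in for the haiku/sonnet/opus mappings from config.json.
struct ModelMap {
    haiku: String,
    sonnet: String,
    opus: String,
}

/// Substring-based routing: match the Claude tier in the requested model
/// name and substitute the configured OpenRouter model for it.
fn map_model(anthropic_model: &str, map: &ModelMap) -> String {
    if anthropic_model.contains("haiku") {
        map.haiku.clone()
    } else if anthropic_model.contains("sonnet") {
        map.sonnet.clone()
    } else if anthropic_model.contains("opus") {
        map.opus.clone()
    } else {
        // Unknown names are forwarded as-is.
        anthropic_model.to_string()
    }
}

fn main() {
    let map = ModelMap {
        haiku: "mistralai/mistral-7b-instruct".into(),
        sonnet: "meta-llama/llama-3.2-90b-vision-instruct".into(),
        opus: "openai/gpt-4o".into(),
    };
    println!("{}", map_model("claude-3-5-haiku-20241022", &map));
}
```

Because matching is by substring, any dated Claude model ID (for example `claude-3-5-haiku-20241022`) resolves to the same configured tier.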

## Running the Application

### Basic Usage

```bash
cargo run
```

The server will start on `0.0.0.0:3000` (or the port specified in `config.json`).

### With Request Logging

To enable logging of all requests and responses:

```bash
cargo run -- --logging logs
```

This creates timestamped JSON files in the `logs` directory for each request/response pair.
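The exact filename scheme is an implementation detail not documented here; as a rough sketch of the idea (one timestamped JSON file per request/response pair), assuming a hypothetical epoch-milliseconds naming convention:

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Hypothetical log-file naming: `<dir>/<epoch-millis>-<kind>.json`.
/// CCORP's real scheme may differ; this only illustrates how each
/// request/response pair could get its own timestamped JSON file.
fn log_file_name(dir: &str, kind: &str) -> String {
    let ts = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system clock before UNIX epoch")
        .as_millis();
    format!("{dir}/{ts}-{kind}.json")
}

fn main() {
    println!("{}", log_file_name("logs", "request"));
}
```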

## Using with Claude Code CLI

CCORP is designed to work seamlessly with Anthropic's Claude Code CLI:

1. Start CCORP (it will run on port 3000 by default)
2. Set environment variables:

```bash
export ANTHROPIC_BASE_URL=http://localhost:3000
export ANTHROPIC_AUTH_TOKEN="your_openrouter_api_key"
```

3. Run Claude Code as normal:

```bash
claude
```

## Web UI for Model Management

CCORP includes a web interface for dynamically switching models without restarting the server.

Visit `http://localhost:3000/switch-model` in your browser to:
- View all available OpenRouter models
- Change model mappings for Haiku, Sonnet, and Opus

Changes are saved to `config.json` and take effect immediately.

## API Usage Examples

### Non-Streaming Request

```bash
curl -X POST http://localhost:3000/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: your_openrouter_api_key" \
  -d '{
    "model": "claude-3-5-haiku-20241022",
    "messages": [{"role": "user", "content": "Hello, world!"}]
  }'
```

### Streaming Request

```bash
curl -X POST http://localhost:3000/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: your_openrouter_api_key" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "messages": [{"role": "user", "content": "Tell me a story"}],
    "stream": true
  }'
```

## Architecture

CCORP is built with:
- **Rust**: For high performance and memory safety
- **Axum**: Modern async web framework
- **Tokio**: Async runtime
- **Minijinja**: Template engine for the web UI

The request flow:
1. Receive Anthropic-formatted request
2. Map Claude model to configured OpenRouter model
3. Transform request to OpenAI format
4. Forward to OpenRouter
5. Transform response back to Anthropic format
6. Stream or return to client
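Steps 2 and 3 above can be sketched with plain structs. (The project's real `AnthropicRequest`/`OpenAIRequest` types carry more fields; this simplified illustration only shows the key shape difference: Anthropic keeps the system prompt in a top-level `system` field, while OpenAI expects it as the first entry of the messages array with role `system`.)

```rust
#[derive(Debug)]
struct Msg {
    role: String,
    content: String,
}

// Simplified stand-ins for the real request types.
struct AnthropicRequest {
    model: String,
    system: Option<String>,
    messages: Vec<Msg>,
}

struct OpenAIRequest {
    model: String,
    messages: Vec<Msg>,
}

/// Step 3 of the flow: fold the top-level `system` field into the messages
/// array, and attach the already-mapped OpenRouter model name (step 2).
fn to_openai(req: AnthropicRequest, mapped_model: String) -> OpenAIRequest {
    let mut messages = Vec::new();
    if let Some(system) = req.system {
        messages.push(Msg { role: "system".into(), content: system });
    }
    messages.extend(req.messages);
    OpenAIRequest { model: mapped_model, messages }
}

fn main() {
    let req = AnthropicRequest {
        model: "claude-3-5-sonnet-20241022".into(),
        system: Some("You are terse.".into()),
        messages: vec![Msg { role: "user".into(), content: "Hi".into() }],
    };
    let out = to_openai(req, "openai/gpt-4o".into());
    println!("{} messages for {}", out.messages.len(), out.model);
}
```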

## Development

### Building

```bash
cargo build --release
```

### Running Tests

```bash
cargo test
```

### Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

CCORP is licensed under the MIT License. See [LICENSE](LICENSE) for details.
Binary file added assets/images.jpg
8 changes: 8 additions & 0 deletions config.json
@@ -0,0 +1,8 @@
{
  "port": 3332,
  "models": {
    "haiku": "deepseek/claude-haiku",
    "sonnet": "anthropic/claude-sonnet-4",
    "opus": "anthropic/claude-opus-4"
  }
}
12 changes: 6 additions & 6 deletions src/anthropic_to_openai.rs
@@ -1,20 +1,20 @@
use crate::config::Config;
use crate::models::*;
use serde_json::json;

pub fn map_model(anthropic_model: &str, settings: &Config) -> String {
    if anthropic_model.contains("haiku") {
        settings.model_haiku.clone()
    } else if anthropic_model.contains("sonnet") {
        settings.model_sonnet.clone()
    } else if anthropic_model.contains("opus") {
        settings.model_opus.clone()
    } else {
        anthropic_model.to_string()
    }
}

pub fn format_anthropic_to_openai(req: AnthropicRequest, settings: &Config) -> OpenAIRequest {
    let mut openapi_messages = Vec::new();

    if let Some(system) = req.system {
84 changes: 84 additions & 0 deletions src/config.rs
@@ -0,0 +1,84 @@
use dotenvy::dotenv;
use serde::Deserialize;
use serde::Serialize;
use std::env;
use std::fs;

/// JSON configuration structure mirroring `config.json`
#[derive(Deserialize, Serialize)]
struct JsonConfig {
    port: u16,
    models: ModelConfig,
}

#[derive(Deserialize, Serialize)]
struct ModelConfig {
    haiku: String,
    sonnet: String,
    opus: String,
}

/// Runtime configuration loaded from `config.json` and environment variables.
#[derive(Clone, Debug)]
pub struct Config {
    /// The port to listen on
    pub port: u16,
    /// Base URL for the OpenRouter API (e.g., https://openrouter.ai/api/v1)
    pub base_url: String,
    /// API key for authenticating with OpenRouter
    pub api_key: String,
    /// Override model name for Claude 3.5 Haiku
    pub model_haiku: String,
    /// Override model name for Claude Sonnet 4
    pub model_sonnet: String,
    /// Override model name for Claude Opus 4
    pub model_opus: String,
}

impl Config {
    /// Load configuration from `config.json` and `.env` file.
    pub fn from_env() -> Self {
        // Load environment variables from .env file
        dotenv().ok();

        // Load API key from environment (must be present)
        let api_key = env::var("OPENROUTER_API_KEY")
            .expect("Environment variable OPENROUTER_API_KEY must be set");

        // Load and parse config.json
        let config: JsonConfig = serde_json::from_str(
            &fs::read_to_string("config.json").expect("Could not read config.json file"),
        )
        .expect("Could not parse config.json file");

        Config {
            port: config.port,
            base_url: default_openrouter_base_url(),
            api_key,
            model_haiku: config.models.haiku,
            model_sonnet: config.models.sonnet,
            model_opus: config.models.opus,
        }
    }

    /// Write configuration to `config.json` (excluding secrets like api_key).
    pub fn write(&self) {
        let config_out = JsonConfig {
            port: self.port,
            models: ModelConfig {
                haiku: self.model_haiku.clone(),
                sonnet: self.model_sonnet.clone(),
                opus: self.model_opus.clone(),
            },
        };

        let json_string =
            serde_json::to_string_pretty(&config_out).expect("Failed to serialize configuration");

        fs::write("config.json", json_string).expect("Failed to write config.json");
    }
}

fn default_openrouter_base_url() -> String {
    "https://openrouter.ai/api/v1".to_string()
}