Claude as your OpenAI-compatible backend

Conduit is an API server that lets you use Claude as a drop-in model provider for any OpenAI-compatible tool. One binary, one command, and Claude works everywhere.

# Start the server
$ conduit
# conduit listening on http://127.0.0.1:3456

# Point any OpenAI-compatible client at it
$ export OPENAI_BASE_URL=http://127.0.0.1:3456/v1
$ python my_agent.py
# Claude is now your model provider
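The flow above can be sketched in a few lines of standard-library Python. The host, port, and the `claude-sonnet` model name are assumptions taken from the example; adjust them to match your running instance.

```python
import json
import urllib.request

# Hypothetical local Conduit endpoint from the quick start above.
BASE_URL = "http://127.0.0.1:3456/v1"

def build_chat_request(model, messages):
    """Build a standard OpenAI Chat Completions request payload."""
    return {"model": model, "messages": messages}

def chat(model, messages):
    """POST a chat completion to the Conduit server and return the parsed reply."""
    payload = build_chat_request(model, messages)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage (requires a running conduit instance):
# reply = chat("claude-sonnet", [{"role": "user", "content": "Hello"}])
# print(reply["choices"][0]["message"]["content"])
```

Because the payload and endpoint path are plain Chat Completions, the same call works unchanged against any other OpenAI-compatible server.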

What you get

Drop-in compatible

Serves the OpenAI Chat Completions API. Any tool that works with OpenAI works with Conduit.
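Drop-in compatibility means responses carry the familiar Chat Completions shape, so existing parsing code keeps working. The values in this sample are illustrative only:

```python
import json

# A representative Chat Completions response body (shape only; values illustrative).
sample = json.loads("""{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "choices": [{"index": 0,
               "message": {"role": "assistant", "content": "Hello!"},
               "finish_reason": "stop"}],
  "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}
}""")

def first_reply(response):
    """Extract the assistant text from a Chat Completions response dict."""
    return response["choices"][0]["message"]["content"]

print(first_reply(sample))  # Hello!
```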

Persistent sessions

Maintains model sessions so repeated calls resume with cached context instead of replaying the full conversation. No wasted tokens.

Streaming support

Full Server-Sent Events streaming, identical to the OpenAI API format.
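Since the stream follows the OpenAI format, a client only needs the usual SSE parsing: each event line is `data: {json}` carrying a `chat.completion.chunk`, and the stream ends with `data: [DONE]`. A minimal parser, with illustrative sample lines standing in for a live stream:

```python
import json

def iter_deltas(sse_lines):
    """Yield content deltas from Chat Completions SSE event lines.

    Each event looks like 'data: {json}'; the stream terminates with
    'data: [DONE]', as in the OpenAI streaming format.
    """
    for line in sse_lines:
        if not line.startswith("data: "):
            continue
        data = line[len("data: "):]
        if data == "[DONE]":
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Illustrative stream (shapes mirror chat.completion.chunk events):
stream = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    'data: [DONE]',
]
print("".join(iter_deltas(stream)))  # Hello
```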

Multi-turn aware

Handles agentic tool loops and manages session isolation automatically.
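The agentic loop in question is the standard Chat Completions tool-calling protocol: the model requests a tool call, the client executes it, appends a `role: "tool"` message, and asks again until a final answer arrives. A sketch, with a stubbed model standing in for a live Conduit request:

```python
import json

def run_tool_loop(call_model, tools, messages, max_steps=5):
    """Generic loop over the Chat Completions tool-calling protocol.

    `call_model` is any callable taking a message list and returning a
    Chat Completions-style response dict (e.g. a request to Conduit).
    """
    for _ in range(max_steps):
        msg = call_model(messages)["choices"][0]["message"]
        messages.append(msg)
        if not msg.get("tool_calls"):
            return msg["content"]          # model produced a final answer
        for call in msg["tool_calls"]:
            fn = tools[call["function"]["name"]]
            args = json.loads(call["function"]["arguments"])
            messages.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": str(fn(**args)),
            })
    raise RuntimeError("tool loop did not converge")

# Stubbed model for illustration: first requests the tool, then answers.
responses = iter([
    {"choices": [{"message": {"role": "assistant", "content": None,
        "tool_calls": [{"id": "c1", "type": "function",
            "function": {"name": "add", "arguments": '{"a": 2, "b": 3}'}}]}}]},
    {"choices": [{"message": {"role": "assistant", "content": "2 + 3 = 5"}}]},
])
answer = run_tool_loop(lambda m: next(responses),
                       {"add": lambda a, b: a + b},
                       [{"role": "user", "content": "What is 2 + 3?"}])
print(answer)  # 2 + 3 = 5
```

Swapping the stub for a real request function is all it takes to drive the same loop against Conduit.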

Single binary

No dependencies, no containers, no configuration files. One binary, one command.

All Claude models

Opus, Sonnet, and Haiku. Automatic model alias resolution built in.

Simple pricing

One-time payment, no subscription. Use on up to 3 machines.

$36
one-time payment
  • One-time payment, no subscription
  • Up to 3 machines per license
  • All Claude models supported
  • Full streaming and multi-turn support
Buy Conduit

This is beta software. We are actively developing Conduit and are committed to supporting early adopters. Your purchase grants a license for the current version.