waytocc

One key.
Every model.

The modern gateway for AI APIs. Drop-in OpenAI compatibility for Claude, GPT, Gemini, and more — deployed in minutes, with full control over routing, cost, and observability.

Get started · View on GitHub

Open source · MIT · Self-host or use the managed edge

terminal
curl https://api.waytocc.com/v1/chat/completions \
  -H "Authorization: Bearer $WAYTOCC_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-6",
    "messages": [
      { "role": "user", "content": "Hello" }
    ]
  }'

Features

Built for teams shipping AI to production.

waytocc is the gateway you'd build yourself, if you had the time. Provider-agnostic, observable by default, operationally boring.

Universal API

One OpenAI-compatible endpoint speaks to Claude, GPT, Gemini, Mistral, and any local model. Stop maintaining per-vendor SDKs.

Smart routing

Route by cost, latency, or capability. Fall back across providers when one degrades. Define policies in plain config.
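For illustration, a routing policy like the one described might look something like this. The field names here are hypothetical, not waytocc's documented schema:

```yaml
# Illustrative only — not waytocc's actual config format.
routes:
  - match: { model: "claude-sonnet-4-6" }
    strategy: lowest-latency        # or lowest-cost
    fallback: ["gpt-4o", "gemini-pro"]   # tried in order if the primary degrades
```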

Built-in observability

Per-key usage, request traces, token spend, error rates — without bolting on a separate tool.

Quotas & rate limits

Bound spend per team, per key, per model. Hard caps prevent surprise bills; soft caps trigger alerts.
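A sketch of what such limits could look like in config. All field names are illustrative assumptions, not waytocc's documented schema:

```yaml
# Illustrative only — not waytocc's actual config format.
quotas:
  - key: team-research
    window: monthly
    soft_cap_usd: 400    # alert fires here
    hard_cap_usd: 500    # requests rejected past this
```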

Response caching

Hash-based cache for deterministic prompts. Cut cost and latency on retries, replays, and golden tests.
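One way such a key could be derived (an illustrative sketch, not waytocc's actual scheme): hash the canonical request body, so identical deterministic prompts map to the same cache entry.

```typescript
import { createHash } from "node:crypto"

// Sketch of a hash-based cache key: serialize the request's
// identity (model + messages) and hash it. Two identical
// deterministic requests produce the same key; any change to
// the prompt or model produces a different one.
export function cacheKey(
  model: string,
  messages: { role: string; content: string }[],
): string {
  const canonical = JSON.stringify({ model, messages })
  return createHash("sha256").update(canonical).digest("hex")
}
```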

Self-hosted, MIT

Run on your own infrastructure or use the managed edge. Same binary, same API. Zero vendor lock-in.

How it works

From zero to multi-provider in three steps.

  1. Point your SDK

    Set the base URL of your existing OpenAI client to waytocc. No code changes beyond one config line.

  2. Pick a model

    Use any provider's model ID — Claude, GPT-4, Gemini Pro, your local Llama. waytocc handles the translation.

  3. Ship to production

    Get logs, usage, and cost dashboards out of the box. Adjust routing without redeploying clients.

Drop-in compatible

One client. Every provider.

waytocc speaks the OpenAI wire format. Your existing tooling — SDKs, eval harnesses, prompt frameworks — keeps working unchanged. Switch providers by changing one string.

  • OpenAI Node, Python, and Go SDKs
  • Streaming, function calling, vision
  • Anthropic & Google native passthrough
app.ts
import OpenAI from "openai"

const client = new OpenAI({
  baseURL: "https://api.waytocc.com/v1",
  apiKey: process.env.WAYTOCC_KEY,
})

// Same client. Any provider.
const claude = await client.chat.completions.create({
  model: "claude-sonnet-4-6",
  messages: [{ role: "user", content: "Hi" }],
})

const gpt = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hi" }],
})
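Streaming works through the same loop. A small helper that drains a stream of OpenAI-style delta chunks into a string — the chunk shape follows the OpenAI wire format, but `collect` itself is an illustrative helper, not part of any SDK:

```typescript
// Minimal OpenAI-style streaming chunk shape.
type Chunk = { choices: { delta?: { content?: string } }[] }

// Concatenate the content deltas from a streaming response.
export async function collect(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = ""
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? ""
  }
  return text
}

// With the real client, pass `stream: true` and hand the result off:
//   const stream = await client.chat.completions.create({
//     model: "claude-sonnet-4-6",
//     messages: [{ role: "user", content: "Hi" }],
//     stream: true,
//   })
//   const reply = await collect(stream)
```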

Stop wiring providers. Start shipping.

A free key gets you the entire model catalog. No card required to try, no per-provider agreements to sign.

Create a key · 5-minute quickstart
© 2026 waytocc. All rights reserved.