API Reference

Welcome to the Mira AI Proxy API. Our gateway is fully compatible with the OpenAI and Anthropic specifications. Use a single endpoint to access a load-balanced fleet of Mira workspaces.

Authentication

All API requests must include your proxy API key. You can specify it using either the standard Authorization header or the x-api-key header.

Authorization: Bearer YOUR_PROXY_KEY
-- OR --
x-api-key: YOUR_PROXY_KEY
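The two header forms above are interchangeable. A minimal Python sketch using only the standard library (the key is a placeholder, not a real credential):

```python
import urllib.request

API_KEY = "sk-your_key_here"  # placeholder key; get yours from the dashboard

# Form 1: standard OpenAI-style bearer token.
req_bearer = urllib.request.Request(
    "http://localhost:4000/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

# Form 2: Anthropic-style x-api-key header.
req_xkey = urllib.request.Request(
    "http://localhost:4000/v1/models",
    headers={"x-api-key": API_KEY},
)
```

Either request, when sent, authenticates identically against the proxy.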

Base URL

The Proxy listens on all interfaces. By default, use the following base path for all v1 requests:

http://localhost:4000/v1

Chat Completions

OpenAI-compatible endpoint for generating chat responses. Supports streaming and tool calling.

POST /v1/chat/completions

Standard payload format:

{
  "model": "provider/model-id", // e.g. "anthropic/claude-3-5-sonnet"
  "messages": [
    { "role": "user", "content": "Hello!" }
  ],
  "stream": true
}

Messages

Endpoint compatible with Anthropic's native Messages format. Mira Proxy automatically converts these requests to Mira requests while preserving the message structure.

POST /v1/messages
{
  "model": "anthropic/claude-3-5-sonnet",
  "messages": [...],
  "max_tokens": 1024
}
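A minimal sketch of building the Messages body shown above. Note that max_tokens is required by the Anthropic Messages format; the prompt text here is illustrative.

```python
import json

# Anthropic-native payload; max_tokens is required by the Messages format.
payload = {
    "model": "anthropic/claude-3-5-sonnet",
    "messages": [{"role": "user", "content": "Summarize this page."}],
    "max_tokens": 1024,
}
body = json.dumps(payload).encode()
# POST this body to http://localhost:4000/v1/messages with your proxy key
# in either the Authorization or x-api-key header (see Authentication).
```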

Models List

Retrieves a list of all available models across your active node fleet.

GET /v1/models
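A sketch of the listing call. Since the proxy is OpenAI-compatible, the response should follow the OpenAI list shape ({"object": "list", "data": [...]}); the key below is a placeholder.

```python
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:4000/v1/models",
    headers={"Authorization": "Bearer sk-your_key_here"},  # placeholder key
)

# Uncomment against a running proxy to print available model IDs:
# with urllib.request.urlopen(req) as resp:
#     for model in json.load(resp)["data"]:
#         print(model["id"])
```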

Web Fetch

Secure HTTP proxy for fetching website content as markdown or raw HTML, optimized for LLM reading.

POST /v1/fetch
{ "url": "https://example.com" }

Media & Speech Generation

Access Mira's image, video, and text-to-speech generation models through a unified proxy. Costs are automatically deducted based on upstream charges.

POST /v1/image
{ "prompt": "A futuristic cyberpunk city" }
POST /v1/video
{ "prompt": "A flying dragon over mountains" }
POST /v1/speech
{ "text": "Hello, this is a test AI voice speaking." }

File Uploads

Securely upload files to core storage. This endpoint is free (0 deductions).

POST /v1/upload
{
  "data": "iVBORw0KGgo...",
  "mime_type": "image/png"
}
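The data field is base64-encoded file bytes. A sketch of preparing an upload body; the raw bytes here are a stand-in (a PNG signature), where in practice you would read your file from disk:

```python
import base64
import json

# Stand-in bytes (the PNG file signature); replace with your file's bytes,
# e.g. raw = open("image.png", "rb").read()
raw = b"\x89PNG\r\n\x1a\n"

payload = {
    "data": base64.b64encode(raw).decode("ascii"),
    "mime_type": "image/png",
}
body = json.dumps(payload).encode()
# POST this body to http://localhost:4000/v1/upload with your proxy key;
# uploads are free (0 deductions).
```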

Use with OpenCode

OpenCode is an open-source AI coding agent that supports custom OpenAI-compatible providers. Connect it to Mira AI Proxy to use 200+ models billed from your workspace credits.

Note: This requires a Mira API key. Get yours from your dashboard.

Step 1 — Get your API key

  1. Login to your Mira portal
  2. Go to the dashboard and copy your proxy API key.

Step 2 — Set the environment variable

Add the export line to your shell profile (e.g. ~/.zshrc), then reload it:

export MIRA_API_KEY="sk-your_key_here"
source ~/.zshrc

Step 3 — Configure OpenCode

Add to ~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "mira": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Mira AI Proxy",
      "options": {
        "baseURL": "http://localhost:4000/v1",
        "apiKey": "{env:MIRA_API_KEY}"
      },
      "models": {
        "anthropic/claude-sonnet-4.5": { "name": "Claude Sonnet 4.5" },
        "anthropic/claude-haiku-4.5": { "name": "Claude Haiku 4.5" },
        "openai/gpt-5.1": { "name": "GPT-5.1" },
        "google/gemini-3-flash": { "name": "Gemini 3 Flash" }
      }
    }
  }
}

Step 4 — Select a model

/models

To set a default: { "model": "mira/anthropic/claude-sonnet-4.5" }

Troubleshooting

Models don't appear — Restart OpenCode after editing config.

Auth errors — Test with a quick curl:

curl http://localhost:4000/v1/models -H "Authorization: Bearer $MIRA_API_KEY"

Extras

Additional endpoints for advanced features like credit checks or node health.

Path                     Method   Description
/v1/extras/token-count   POST     Calculate the token count for a text blob without making a full request.
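A sketch of a token-count request. Note: the request body for /v1/extras/token-count is not documented above, so the "text" field name below is an assumption for illustration only.

```python
import json

# ASSUMPTION: "text" is a hypothetical field name; the actual body format
# for /v1/extras/token-count is not specified in this reference.
payload = {"text": "How many tokens is this sentence?"}
body = json.dumps(payload).encode()
# POST this body to http://localhost:4000/v1/extras/token-count
# with your proxy key attached.
```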