The Specification
That Drives Everything.
AI-Protocol separates "what to do" from "how to do it." Provider manifests declare endpoints, auth, parameter mappings, streaming decoders, and error handling — all in YAML, all validated by JSON Schema.
What's Inside
Core Specification
Defines standard parameters (temperature, max_tokens), streaming events (PartialContentDelta, ToolCallStarted), error classes (13 types), and retry policies.
30+ Provider Manifests
Each YAML file declares a provider's endpoint, auth, parameter mappings, SSE decoder config, error classification, rate limit headers, and capabilities.
Model Registry
Model instances with provider references, context windows, capability flags, and per-token pricing. GPT, Claude, Gemini, DeepSeek, Qwen, and more.
JSON Schema Validation
JSON Schema 2020-12 definitions validate every manifest. CI pipelines ensure configuration correctness. Zero runtime surprises.
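As a rough illustration of the kind of check CI runs (the real pipeline uses a full JSON Schema 2020-12 validator against the published schema; this stand-in checks only a few required fields):

```python
# Minimal stand-in for manifest validation.
# A real CI pipeline would run a JSON Schema 2020-12 validator instead;
# the field names mirror the manifest example in this document.
REQUIRED_TOP_LEVEL = ["id", "protocol_version", "endpoint", "auth"]

def check_manifest(manifest: dict) -> list[str]:
    """Return a list of validation errors (empty list = valid)."""
    errors = [f"missing required field: {key}"
              for key in REQUIRED_TOP_LEVEL if key not in manifest]
    if "endpoint" in manifest and "base_url" not in manifest["endpoint"]:
        errors.append("endpoint.base_url is required")
    return errors

manifest = {"id": "anthropic", "protocol_version": "1.5",
            "endpoint": {"base_url": "https://api.anthropic.com/v1"}}
print(check_manifest(manifest))  # ['missing required field: auth']
```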
A Provider Manifest
Each provider is described by a YAML manifest. It declares everything a runtime needs to communicate with the provider — endpoint, authentication, parameter mapping, streaming decoder, error handling, and capabilities.
Runtimes read these manifests and "compile" user requests into provider-specific HTTP calls. No `if provider == "openai"` branches anywhere.
- Endpoint & Auth — Base URL, protocol, bearer tokens, API key headers
- Parameter Mapping — Standard names to provider-specific JSON fields
- Streaming Decoder — SSE/NDJSON format, JSONPath event extraction rules
- Error Classification — HTTP status codes to 13 standard error types
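The "compile" step can be sketched roughly like this (a hedged, stdlib-only illustration; `compile_request` is a hypothetical helper, not part of the spec, and the field names follow the Anthropic manifest shown in this document):

```python
import json, os

def compile_request(manifest: dict, params: dict) -> dict:
    """Translate standard parameters into a provider-specific HTTP call."""
    ep = manifest["endpoint"]
    url = ep["base_url"] + ep["chat_path"]
    headers = {"Content-Type": "application/json"}
    if manifest["auth"]["type"] == "bearer":
        token = os.environ.get(manifest["auth"]["token_env"], "")
        headers["Authorization"] = f"Bearer {token}"
    # Rename standard parameter names to the provider's JSON fields.
    body = {manifest["parameter_mappings"][k]: v
            for k, v in params.items() if k in manifest["parameter_mappings"]}
    return {"url": url, "headers": headers, "body": json.dumps(body)}

manifest = {
    "endpoint": {"base_url": "https://api.anthropic.com/v1",
                 "chat_path": "/messages"},
    "auth": {"type": "bearer", "token_env": "ANTHROPIC_API_KEY"},
    "parameter_mappings": {"temperature": "temperature",
                           "max_tokens": "max_tokens"},
}
req = compile_request(manifest, {"temperature": 0.7, "max_tokens": 1024})
print(req["url"])  # https://api.anthropic.com/v1/messages
```

The runtime never hard-codes provider names: swapping in a different manifest yields a different URL, auth header, and body shape with no code changes.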
id: anthropic
protocol_version: "1.5"
endpoint:
  base_url: "https://api.anthropic.com/v1"
  chat_path: "/messages"
auth:
  type: bearer
  token_env: "ANTHROPIC_API_KEY"
parameter_mappings:
  temperature: "temperature"
  max_tokens: "max_tokens"
  stream: "stream"
streaming:
  decoder:
    format: "anthropic_sse"
    event_map:
      - match: "$.type == 'content_block_delta'"
        emit: "PartialContentDelta"
error_classification:
  by_http_status:
    "429": "rate_limited"
    "401": "authentication"
capabilities:
  streaming: true
  tools: true
  vision: true

Where Protocol Fits
AI-Protocol is the foundation layer. Runtimes consume it. Applications consume runtimes.
Supported Providers
Each provider has a complete YAML manifest with endpoint, auth, parameter mappings, streaming decoder, error handling, and capability flags.
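To make the streaming decoder concrete, here is a rough sketch of how a runtime might apply an event_map to an SSE stream. It is a simplification: the match predicate is reduced to an equality check on the payload's `type` field, where real manifests use fuller JSONPath rules.

```python
import json

# Mirrors the manifest's streaming.decoder.event_map, with the JSONPath
# predicate pre-reduced to a simple field comparison for this sketch.
EVENT_MAP = [
    {"match_type": "content_block_delta", "emit": "PartialContentDelta"},
]

def decode_sse(lines):
    """Yield (standard_event, payload) for SSE data lines that match."""
    for line in lines:
        if not line.startswith("data:"):
            continue  # ignore comments, blank keep-alives, event: lines
        payload = json.loads(line[len("data:"):].strip())
        for rule in EVENT_MAP:
            if payload.get("type") == rule["match_type"]:
                yield (rule["emit"], payload)

stream = [
    'data: {"type": "content_block_delta", "delta": {"text": "Hi"}}',
    'data: {"type": "message_stop"}',
]
events = list(decode_sse(stream))
print(events[0][0])  # PartialContentDelta
```

Events that no rule matches (like `message_stop` above) are simply dropped by this sketch; a real decoder would map them to their own standard events.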
Explore the Protocol
Read the specification, browse provider manifests, or contribute a new provider.