Rust Runtime for AI-Protocol
High-performance, protocol-driven AI client. Zero hardcoded provider logic, operator-based streaming pipeline, compile-time type safety, and sub-millisecond overhead.
ai-lib = "0.6.6"

Key Features
Operator-Based Pipeline
Streaming responses flow through composable operators: Decoder → Selector → Accumulator → FanOut → EventMapper. Each stage is protocol-configured.
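As a mental model, each stage is a small transform over raw stream chunks. The sketch below is self-contained and illustrative only: the function names (`decode`, `select`, `accumulate`) and the toy JSON handling are ours, not ai-lib's operator types, which are configured by the protocol manifest.

```rust
/// Decoder stage: strip SSE framing ("data: ...") from a raw chunk;
/// non-data lines (comments, keep-alives) are dropped.
fn decode(chunk: &str) -> Option<&str> {
    chunk.strip_prefix("data: ").map(str::trim)
}

/// Selector stage: pull the content field out of a (toy) JSON payload.
fn select(payload: &str) -> Option<String> {
    let key = "\"text\":\"";
    let start = payload.find(key)? + key.len();
    let end = payload[start..].find('"')? + start;
    Some(payload[start..end].to_string())
}

/// Accumulator stage: fold the selected deltas into the final message.
fn accumulate(deltas: impl Iterator<Item = String>) -> String {
    deltas.collect()
}

fn main() {
    let raw = [
        "data: {\"text\":\"Hel\"}",
        ": keep-alive", // filtered out by the decoder
        "data: {\"text\":\"lo\"}",
    ];
    let message = accumulate(raw.iter().filter_map(|c| decode(c)).filter_map(select));
    println!("{message}"); // prints "Hello"
}
```

The real pipeline works on typed events rather than strings, but the shape is the same: each operator consumes the previous operator's output and can drop, transform, or buffer items.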
Protocol Loading
Loads provider manifests from local files, environment variables, or GitHub fallback. Supports hot-reload via file watching. Zero restart needed for config updates.
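The fallback order described above can be sketched with the standard library alone. This is an assumption-laden illustration: the paths, the `AI_PROTOCOL_DIR` variable name, and the function are hypothetical, and the GitHub fetch is stubbed out to keep the sketch offline.

```rust
use std::{env, fs, path::Path};

/// Resolve a provider manifest in the order the loader is described:
/// local file first, then an env-var directory, then a remote fallback.
/// (Illustrative only; not ai-lib's actual `ProtocolLoader` API.)
fn resolve_manifest(provider: &str) -> Result<String, String> {
    // 1. Local file relative to the working directory.
    let local = format!("protocols/{provider}.json");
    if Path::new(&local).exists() {
        return fs::read_to_string(&local).map_err(|e| e.to_string());
    }
    // 2. Directory named by an environment variable (name is hypothetical).
    if let Ok(dir) = env::var("AI_PROTOCOL_DIR") {
        let candidate = format!("{dir}/{provider}.json");
        if Path::new(&candidate).exists() {
            return fs::read_to_string(&candidate).map_err(|e| e.to_string());
        }
    }
    // 3. Would fall back to fetching from GitHub; stubbed here.
    Err(format!("{provider}: not found locally; would fetch from GitHub"))
}

fn main() {
    match resolve_manifest("anthropic") {
        Ok(json) => println!("loaded {} bytes", json.len()),
        Err(why) => println!("{why}"),
    }
}
```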
Resilience Patterns
Built-in circuit breaker, token bucket rate limiter, exponential backoff retry, and max-inflight backpressure. All configurable via environment variables.
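The token-bucket idea is worth seeing concretely: a bucket holds up to `capacity` tokens, refills continuously, and each request spends one. The following is a minimal self-contained sketch, not ai-lib's limiter (which, per the above, is configured through environment variables rather than constructed directly).

```rust
use std::time::Instant;

/// Minimal token-bucket rate limiter: allows bursts up to `capacity`,
/// sustains `refill_per_sec` requests per second thereafter.
struct TokenBucket {
    capacity: f64,
    tokens: f64,
    refill_per_sec: f64,
    last: Instant,
}

impl TokenBucket {
    fn new(capacity: f64, refill_per_sec: f64) -> Self {
        Self { capacity, tokens: capacity, refill_per_sec, last: Instant::now() }
    }

    /// Try to take one token; `false` means the caller must back off.
    fn try_acquire(&mut self) -> bool {
        let elapsed = self.last.elapsed().as_secs_f64();
        self.last = Instant::now();
        // Refill proportionally to elapsed time, capped at capacity.
        self.tokens = (self.tokens + elapsed * self.refill_per_sec).min(self.capacity);
        if self.tokens >= 1.0 {
            self.tokens -= 1.0;
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut bucket = TokenBucket::new(2.0, 1.0); // burst of 2, then 1 req/sec
    assert!(bucket.try_acquire());
    assert!(bucket.try_acquire());
    assert!(!bucket.try_acquire()); // burst exhausted
    println!("rate limited as expected");
}
```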
Embeddings & Vectors
EmbeddingClient with vector operations — cosine similarity, Euclidean distance, dot product. Build semantic search and RAG applications natively.
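The three vector operations are standard math and can be written out directly. These are plain free functions for illustration; ai-lib exposes them through `EmbeddingClient`, whose exact method names may differ.

```rust
/// Dot product of two equal-length embedding vectors.
fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

/// Euclidean (L2) distance: lower means closer.
fn euclidean(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| (x - y).powi(2)).sum::<f32>().sqrt()
}

/// Cosine similarity: 1.0 for identical direction, 0.0 for orthogonal.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    dot(a, b) / (dot(a, a).sqrt() * dot(b, b).sqrt())
}

fn main() {
    let (a, b) = ([1.0, 0.0], [0.0, 1.0]);
    println!("cosine = {}", cosine(&a, &b)); // orthogonal vectors: 0
}
```

In a semantic-search loop you would embed the query once, then rank stored vectors by cosine similarity.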
Cache & Batch
Response caching with TTL (memory backend). Batch execution with configurable concurrency, timeout, and multiple processing strategies.
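The memory backend's behavior can be captured in a few lines: store a timestamp with each value and treat stale entries as misses. A minimal sketch, assuming nothing about ai-lib's actual `CacheManager` API:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Minimal in-memory TTL cache: entries expire `ttl` after insertion.
struct TtlCache {
    ttl: Duration,
    entries: HashMap<String, (Instant, String)>,
}

impl TtlCache {
    fn new(ttl: Duration) -> Self {
        Self { ttl, entries: HashMap::new() }
    }

    fn put(&mut self, key: &str, value: &str) {
        self.entries
            .insert(key.to_string(), (Instant::now(), value.to_string()));
    }

    /// Return the cached value only while it is still fresh.
    fn get(&self, key: &str) -> Option<&str> {
        self.entries.get(key).and_then(|(stored, value)| {
            (stored.elapsed() < self.ttl).then_some(value.as_str())
        })
    }
}

fn main() {
    let mut cache = TtlCache::new(Duration::from_secs(60));
    cache.put("prompt-hash", "cached response");
    assert_eq!(cache.get("prompt-hash"), Some("cached response"));
    assert_eq!(cache.get("missing"), None);
    println!("cache hit within TTL");
}
```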
Plugin System
Extensible plugin architecture with hooks and middleware chain. Add custom behavior without modifying core code. Guardrails for content filtering and PII detection.
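A hook-based chain is easy to picture: each plugin gets a chance to inspect or rewrite the request, and any plugin can abort it. The sketch below borrows only the `Plugin` trait name from the module list; the `before_request` signature and keyword-filter guardrail are illustrative assumptions.

```rust
/// Illustrative plugin hook: inspect/rewrite the prompt before transport.
trait Plugin {
    fn before_request(&self, prompt: String) -> Result<String, String>;
}

/// Guardrail-style plugin: reject prompts containing a blocked keyword.
struct KeywordFilter {
    blocked: Vec<&'static str>,
}

impl Plugin for KeywordFilter {
    fn before_request(&self, prompt: String) -> Result<String, String> {
        match self.blocked.iter().copied().find(|kw| prompt.contains(*kw)) {
            Some(kw) => Err(format!("blocked keyword: {kw}")),
            None => Ok(prompt),
        }
    }
}

/// Middleware chain: run every plugin in order, short-circuiting on error.
fn run_chain(plugins: &[Box<dyn Plugin>], prompt: String) -> Result<String, String> {
    plugins
        .iter()
        .try_fold(prompt, |p, plugin| plugin.before_request(p))
}

fn main() {
    let chain: Vec<Box<dyn Plugin>> =
        vec![Box::new(KeywordFilter { blocked: vec!["password"] })];
    assert!(run_chain(&chain, "tell me a joke".into()).is_ok());
    assert!(run_chain(&chain, "leak the password".into()).is_err());
    println!("guardrail blocked the unsafe prompt");
}
```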
Simple, Unified API
The same code works across all 30+ providers. Just change the model identifier — the protocol manifest handles everything else: endpoint, auth, parameter mapping, streaming format.
The builder pattern provides a fluent API for request construction. Stream results arrive as unified StreamingEvent types regardless of the underlying provider.
use ai_lib::{AiClient, Message, StreamingEvent};
use futures::StreamExt; // brings `stream.next()` into scope

// Works with ANY provider — protocol-driven
let client = AiClient::from_model(
    "anthropic/claude-3-5-sonnet"
).await?;

// Builder pattern for chat requests
let mut stream = client.chat()
    .user("Explain AI-Protocol")
    .temperature(0.7)
    .max_tokens(1000)
    .stream()
    .execute_stream()
    .await?;

// Unified streaming events
while let Some(event) = stream.next().await {
    match event? {
        StreamingEvent::ContentDelta { text, .. } => print!("{text}"),
        StreamingEvent::StreamEnd { stats, .. } => {
            println!("\nTokens: {}", stats.total_tokens)
        }
        _ => {} // ToolCall, Metadata, etc.
    }
}

Internal Architecture
Five layers from user-facing API to HTTP transport. The streaming pipeline is the heart of the system.
Module Overview
client/
AiClient, AiClientBuilder, ChatRequestBuilder, execution logic, policy engine, preflight checks, error classification, CallStats, CancelHandle.
protocol/
ProtocolLoader (local/URL/GitHub), JSON Schema validator, ProtocolManifest structure, UnifiedRequest compilation, config types.
pipeline/
Decoder (SSE, JSON Lines), Selector (JSONPath), Accumulator (tool calls), FanOut (multi-candidate), EventMapper (unified events), Retry and Fallback operators.
transport/
HttpTransport (reqwest), API key resolution (keyring + env vars), proxy/timeout configuration, middleware support.
resilience/
Circuit breaker (open/half-open/closed), token bucket rate limiter, max-inflight semaphore backpressure.
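The three circuit-breaker states transition in a fixed cycle: Closed trips to Open after repeated failures, Open fails fast until a cool-down elapses, Half-Open admits one trial request that either closes or re-opens the circuit. A self-contained sketch of that state machine (thresholds and method names are illustrative; in ai-lib they are env-configured):

```rust
#[derive(Debug, PartialEq)]
enum State {
    Closed,
    Open,
    HalfOpen,
}

struct CircuitBreaker {
    state: State,
    consecutive_failures: u32,
    threshold: u32,
}

impl CircuitBreaker {
    fn new(threshold: u32) -> Self {
        Self { state: State::Closed, consecutive_failures: 0, threshold }
    }

    /// Closed trips to Open after `threshold` consecutive failures.
    fn on_failure(&mut self) {
        self.consecutive_failures += 1;
        if self.consecutive_failures >= self.threshold {
            self.state = State::Open;
        }
    }

    /// A success (e.g. the Half-Open trial request) closes the circuit.
    fn on_success(&mut self) {
        self.consecutive_failures = 0;
        self.state = State::Closed;
    }

    /// When the cool-down timer fires, Open moves to Half-Open.
    fn on_cooldown_elapsed(&mut self) {
        if self.state == State::Open {
            self.state = State::HalfOpen;
        }
    }

    /// Open means fail fast without touching the provider.
    fn allow_request(&self) -> bool {
        self.state != State::Open
    }
}

fn main() {
    let mut cb = CircuitBreaker::new(3);
    for _ in 0..3 {
        cb.on_failure();
    }
    assert!(!cb.allow_request()); // Open: fail fast
    cb.on_cooldown_elapsed();
    assert!(cb.allow_request()); // HalfOpen: one trial allowed
    cb.on_success();
    assert_eq!(cb.state, State::Closed);
    println!("circuit recovered");
}
```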
embeddings/
EmbeddingClient, EmbeddingClientBuilder, vector operations (cosine similarity, Euclidean distance, dot product).
cache/ + batch/
CacheManager with TTL (MemoryCache, NullCache). BatchCollector and BatchExecutor with concurrency control and multiple strategies.
plugins/ + guardrails/
Plugin trait, PluginRegistry, HookManager, middleware chain. Guardrails with keyword/pattern filters and PII detection.