Observability
Both runtimes provide observability features for production deployments.
Rust: Structured Logging
ai-lib-rust uses the tracing ecosystem:
```rust
use tracing_subscriber;

// Enable logging
tracing_subscriber::fmt::init();

// All AI-Lib operations emit structured log events
let client = AiClient::from_model("openai/gpt-4o").await?;
```

Log levels:

- `INFO`: Request/response summaries
- `DEBUG`: Protocol loading, pipeline stages
- `TRACE`: Individual frames, JSONPath matches
Rust: Call Statistics
Every request returns usage statistics:
```rust
let (response, stats) = client.chat()
    .user("Hello")
    .execute_with_stats()
    .await?;

println!("Model: {}", stats.model);
println!("Provider: {}", stats.provider);
println!("Prompt tokens: {}", stats.prompt_tokens);
println!("Completion tokens: {}", stats.completion_tokens);
println!("Total tokens: {}", stats.total_tokens);
println!("Latency: {}ms", stats.latency_ms);
```

Python: Metrics (Prometheus)
```python
from ai_lib_python.telemetry import MetricsCollector

metrics = MetricsCollector()

client = await AiClient.builder() \
    .model("openai/gpt-4o") \
    .metrics(metrics) \
    .build()

# After some requests...
prometheus_text = metrics.export_prometheus()
```

Tracked metrics:
- `ai_lib_requests_total`: Request count by model/provider
- `ai_lib_request_duration_seconds`: Latency histogram
- `ai_lib_tokens_total`: Token usage by type
- `ai_lib_errors_total`: Error count by type
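As a minimal sketch of how the exported text can be consumed: assuming `export_prometheus()` returns the standard Prometheus text-exposition format, a small parser can turn it into a dict for test assertions. The `parse_prometheus_text` helper and the sample string below are illustrative, not part of the ai-lib API.

```python
def parse_prometheus_text(text: str) -> dict:
    """Map 'metric{labels}' -> float value, skipping comment lines."""
    samples = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip HELP/TYPE comments and blank lines
        name, _, value = line.rpartition(" ")
        samples[name] = float(value)
    return samples

# Hypothetical sample in the shape the metrics above would take
sample = """\
# HELP ai_lib_requests_total Request count by model/provider
# TYPE ai_lib_requests_total counter
ai_lib_requests_total{model="gpt-4o",provider="openai"} 3
ai_lib_tokens_total{type="prompt"} 1024
"""

parsed = parse_prometheus_text(sample)
print(parsed['ai_lib_requests_total{model="gpt-4o",provider="openai"}'])  # 3.0
```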
Python: Distributed Tracing (OpenTelemetry)
```python
from ai_lib_python.telemetry import Tracer

tracer = Tracer(
    service_name="my-app",
    endpoint="http://jaeger:4317",
)

client = await AiClient.builder() \
    .model("openai/gpt-4o") \
    .tracer(tracer) \
    .build()
```

Traces include spans for:
- Protocol loading
- Request compilation
- HTTP transport
- Pipeline processing
- Event mapping
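To illustrate the kind of nested span hierarchy those stages produce, here is a toy, stdlib-only tracer. It is not the ai-lib `Tracer` API; the span names simply mirror the stages listed above.

```python
import time
from contextlib import contextmanager

class ToyTracer:
    def __init__(self):
        self.spans = []   # (name, depth, duration_ms), in finish order
        self._depth = 0

    @contextmanager
    def span(self, name):
        self._depth += 1
        start = time.perf_counter()
        try:
            yield
        finally:
            self._depth -= 1
            elapsed_ms = (time.perf_counter() - start) * 1000
            self.spans.append((name, self._depth, elapsed_ms))

tracer = ToyTracer()
with tracer.span("request"):                      # root span
    with tracer.span("protocol loading"): pass
    with tracer.span("request compilation"): pass
    with tracer.span("http transport"):
        with tracer.span("pipeline processing"): pass
        with tracer.span("event mapping"): pass

for name, depth, _ in tracer.spans:
    print("  " * depth + name)
```

A real OpenTelemetry exporter would ship these spans to the configured endpoint (Jaeger in the example above) instead of printing them.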
Python: Health Monitoring
```python
from ai_lib_python.telemetry import HealthChecker

health = HealthChecker()
status = await health.check()

print(f"Healthy: {status.is_healthy}")
print(f"Details: {status.details}")
```

Python: User Feedback
Collect feedback on AI responses:
```python
from ai_lib_python.telemetry import FeedbackCollector

feedback = FeedbackCollector()

# After getting a response
feedback.record(
    request_id=stats.request_id,
    rating=5,
    comment="Helpful response",
)
```

Resilience Observability
Monitor circuit breaker and rate limiter state:

```rust
// Rust
let state = client.circuit_state(); // Closed, Open, HalfOpen
let inflight = client.current_inflight();
```

```python
# Python
signals = client.signals_snapshot()
print(f"Circuit: {signals.circuit_state}")
print(f"Inflight: {signals.current_inflight}")
```
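Building on the snapshot above, a simple watchdog can poll these signals and flag trouble. This is a sketch under assumptions: `signals_snapshot()` returns an object with `circuit_state` and `current_inflight` fields as shown, and `StubClient`, `Signals`, and the `max_inflight` threshold are stand-ins for illustration.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    circuit_state: str      # e.g. "closed", "open", "half_open"
    current_inflight: int

class StubClient:
    """Stand-in for a real AiClient; yields canned snapshots."""
    def __init__(self, states):
        self._states = iter(states)
    def signals_snapshot(self):
        return next(self._states)

def check_resilience(client, max_inflight=32):
    """Return a list of alert strings for the current snapshot."""
    alerts = []
    s = client.signals_snapshot()
    if s.circuit_state == "open":
        alerts.append("circuit breaker open: requests are being short-circuited")
    if s.current_inflight > max_inflight:
        alerts.append(f"inflight requests high: {s.current_inflight}")
    return alerts

client = StubClient([Signals("closed", 4), Signals("open", 40)])
print(check_resilience(client))  # healthy snapshot -> []
print(check_resilience(client))  # open circuit + high inflight -> two alerts
```

In production you would run this check on a timer and route the alerts to your paging or metrics system.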