Drop-in LLM observability proxy. Get traces, costs, latencies, and caching with a one-line base-URL change.
Vendor-neutral standard for traces, metrics, and logs.
Observability: The dashboarding layer for metrics, logs, and traces.
Observability: Open-source LLM observability and prompt management. Self-hostable.
Observability: CLI for prompt evals and red-teaming.