Welcome to Candela! This guide will help you get up and running with LLM observability in minutes.
Candela is an open-source observability platform that gives you full visibility into your LLM traffic. It consists of four components:
candela-local
Developer proxy + runtime manager — run local & cloud models with observability from your machine.
candela-server
Full backend — API server, LLM proxy, dashboard UI, deployed on Cloud Run.
candela-sidecar
Lightweight production proxy (< 5MB) for containers — traces, cost, budget enforcement.
candela-desktop
Flutter desktop app for managing providers and viewing traces.
Use candela-local for instant LLM observability on your machine — local models via Ollama, cloud models via ADC, all traced to SQLite.
go install github.com/candelahq/candela/cmd/candela-local@latest

Deploy candela-server on Cloud Run with BigQuery storage, Firebase auth, and a dashboard UI. Developers connect via candela-local in Team Mode.
Drop candela-sidecar next to your application containers for zero-touch LLM tracing in production.
Integrate Candela with your ADK agents for end-to-end distributed tracing with W3C Trace Context propagation.
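W3C Trace Context propagation works by passing a `traceparent` header between services so that spans on both sides join the same distributed trace. As background, here is a minimal sketch of constructing such a header; the helper name is ours, but the format follows the W3C Trace Context specification:

```python
import os

def make_traceparent() -> str:
    # W3C Trace Context `traceparent` format: version-traceid-parentid-flags.
    # trace-id is 16 random bytes and parent-id is 8 random bytes, hex-encoded;
    # version 00 and flags 01 (sampled) are the common defaults.
    trace_id = os.urandom(16).hex()
    span_id = os.urandom(8).hex()
    return f"00-{trace_id}-{span_id}-01"

# Attach this as the `traceparent` HTTP header on outbound agent calls;
# the receiving side continues the trace under the same trace-id.
header = make_traceparent()
print(header)
```

In practice a tracing SDK (e.g. OpenTelemetry) generates and propagates this header for you; the sketch only shows what is on the wire.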