Getting Started

Welcome to Candela! This guide will help you get up and running with LLM observability in minutes.

Candela is an open-source observability platform that gives you full visibility into your LLM traffic. It consists of four components:

candela-local

Developer proxy + runtime manager — run local & cloud models with observability from your machine.

candela-server

Full backend — API server, LLM proxy, dashboard UI, deployed on Cloud Run.

candela-sidecar

Lightweight production proxy (< 5MB) for containers — traces, cost, budget enforcement.

candela-desktop

Flutter desktop app for managing providers and viewing traces.

Use candela-local for instant LLM observability on your machine — local models via Ollama, cloud models via ADC, all traced to SQLite.

```shell
go install github.com/candelahq/candela/cmd/candela-local@latest
```

candela-local docs →