# OpenTelemetry Observability
Compliance Scanner exports traces and logs via the OpenTelemetry Protocol (OTLP) for integration with observability platforms such as SigNoz, Grafana (Tempo + Loki), and Jaeger, among others.
## Enabling
Set the OTEL_EXPORTER_OTLP_ENDPOINT environment variable to enable OTLP export:
```
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
```

When this variable is not set, telemetry export is disabled and only console logging is active.
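For a quick local test, the variable can be exported in the shell before starting the service. A minimal sketch (the `compliance-agent` binary name here is an assumption; substitute your actual binary):

```shell
# Enable OTLP export for this shell session
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
export OTEL_SERVICE_NAME=compliance-agent

# Then start the service, e.g. (hypothetical binary name):
# ./compliance-agent

# Inspect what the process will pick up at startup
env | grep '^OTEL_' | sort
```

Unsetting `OTEL_EXPORTER_OTLP_ENDPOINT` (or simply not exporting it) falls back to console-only logging.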
## What Is Exported
### Traces
Distributed traces are emitted for:

- HTTP request handling (via `tower-http`'s `TraceLayer`)
- Database operations
- Scan pipeline phases
- External API calls (LiteLLM, Keycloak, Git providers)
### Logs
All `tracing::info!`, `tracing::warn!`, and `tracing::error!` log events are exported as OTel log records, including structured fields.
## Configuration
| Variable | Description | Default |
|---|---|---|
| `OTEL_EXPORTER_OTLP_ENDPOINT` | Collector gRPC endpoint | (disabled) |
| `OTEL_SERVICE_NAME` | Service name in traces | `compliance-agent` or `compliance-dashboard` |
| `RUST_LOG` | Log level filter | `info` |
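`RUST_LOG` accepts the `tracing` crate's env-filter directive syntax, so dependencies can stay quiet while the application's own logs are raised. A hedged example (the `compliance_agent` crate name in the filter is an assumption):

```shell
# Hypothetical local settings; the crate name in RUST_LOG is an assumption.
OTEL_EXPORTER_OTLP_ENDPOINT=http://otel-collector:4317
OTEL_SERVICE_NAME=compliance-agent
# Keep dependencies at info, raise the application's own logs to debug
RUST_LOG=info,compliance_agent=debug
```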
## Docker Compose Setup
The included `docker-compose.yml` provides an OTel Collector service:
```yaml
otel-collector:
  image: otel/opentelemetry-collector-contrib:latest
  ports:
    - "4317:4317"   # gRPC
    - "4318:4318"   # HTTP
  volumes:
    - ./otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml
```

The agent and dashboard are pre-configured to send telemetry to the collector:
```yaml
agent:
  environment:
    OTEL_EXPORTER_OTLP_ENDPOINT: http://otel-collector:4317
    OTEL_SERVICE_NAME: compliance-agent

dashboard:
  environment:
    OTEL_EXPORTER_OTLP_ENDPOINT: http://otel-collector:4317
    OTEL_SERVICE_NAME: compliance-dashboard
```

## Collector Configuration
Edit `otel-collector-config.yaml` to configure your backend. The default configuration exports to the `debug` (stdout) exporter only.
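For reference, a minimal configuration matching that stdout-only default might look like the following sketch; the receiver, processor, and pipeline names are the conventional ones, but your shipped file may differ:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch:

exporters:
  debug:
    verbosity: detailed

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]
```

Swapping a backend in means adding its exporter and listing it in the relevant pipelines, as in the examples below.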
### SigNoz
```yaml
exporters:
  otlp/signoz:
    endpoint: "signoz-otel-collector:4317"
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/signoz]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/signoz]
```

### Grafana Tempo (Traces) + Loki (Logs)
```yaml
exporters:
  otlp/tempo:
    endpoint: "tempo:4317"
    tls:
      insecure: true
  loki:
    endpoint: "http://loki:3100/loki/api/v1/push"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/tempo]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [loki]
```

### Jaeger
```yaml
exporters:
  otlp/jaeger:
    endpoint: "jaeger:4317"
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/jaeger]
```

## Verifying
After starting with telemetry enabled, look for this log line on startup:

```
OpenTelemetry OTLP export enabled endpoint=http://otel-collector:4317 service=compliance-agent
```

If the endpoint is unreachable, the application still starts normally; telemetry export fails silently without affecting functionality.
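If that line does not appear, or traces never reach your backend, two quick checks are to inspect the collector's own logs and to confirm the gRPC port is reachable. A sketch assuming the Docker Compose setup above (service and port names come from that file):

```shell
# See whether the collector started cleanly and is receiving data
docker compose logs otel-collector | tail -n 20

# Confirm the OTLP gRPC port is open from the host
nc -z localhost 4317 && echo "collector reachable"
```

With the default `debug` exporter, received spans and log records are printed directly in the collector's logs, which makes this a convenient end-to-end smoke test.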