3 prompts
Set up a complete observability stack: structured logging, Prometheus metrics, and distributed tracing with OpenTelemetry, all correlated by trace IDs.
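A minimal sketch of the trace-ID correlation piece, using only the standard library: a logging filter stamps every record with a trace ID (standing in for the span context OpenTelemetry would supply), and a JSON formatter emits structured log lines. All names here are illustrative, not part of any particular stack.

```python
import json
import logging
import uuid

class TraceIdFilter(logging.Filter):
    """Attach a trace ID to every log record (stand-in for OTel span context)."""
    def __init__(self, trace_id: str):
        super().__init__()
        self.trace_id = trace_id

    def filter(self, record: logging.LogRecord) -> bool:
        record.trace_id = self.trace_id
        return True

class JsonFormatter(logging.Formatter):
    """Render records as one JSON object per line, trace ID included."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "message": record.getMessage(),
            "trace_id": getattr(record, "trace_id", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.addFilter(TraceIdFilter(uuid.uuid4().hex))

logger.info("payment processed")  # emits a JSON line carrying trace_id
```

Because the same trace ID appears in logs, metric exemplars, and spans, a single ID pasted into your log search pulls up everything one request touched.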
Track LLM API costs across multiple providers. Token counting, cost calculation, budget alerts, and per-feature usage attribution.
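The cost-tracking idea above can be sketched in a few lines: a per-provider pricing table, per-feature attribution, and a budget check on each recorded call. The provider names and per-token rates below are placeholders, not real prices.

```python
from dataclasses import dataclass, field

# USD per 1K tokens as (input_rate, output_rate). Illustrative values only.
PRICING = {
    "provider_a": (0.0030, 0.0060),
    "provider_b": (0.0008, 0.0024),
}

@dataclass
class CostTracker:
    budget_usd: float
    spent: dict = field(default_factory=dict)  # feature name -> USD spent

    def record(self, provider: str, feature: str,
               input_tokens: int, output_tokens: int) -> float:
        """Compute the call's cost, attribute it to a feature, check budget."""
        in_rate, out_rate = PRICING[provider]
        cost = input_tokens / 1000 * in_rate + output_tokens / 1000 * out_rate
        self.spent[feature] = self.spent.get(feature, 0.0) + cost
        if self.total() > self.budget_usd:
            print(f"ALERT: budget of {self.budget_usd:.2f} USD exceeded")
        return cost

    def total(self) -> float:
        return sum(self.spent.values())

tracker = CostTracker(budget_usd=50.0)
tracker.record("provider_a", "search", input_tokens=1000, output_tokens=500)
```

In a real service the `spent` totals would feed a metrics exporter and the alert would go to a pager rather than stdout, but the accounting logic is the same.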
Craft Prometheus alert rules with this structured code prompt. Adjustable variables let you fine-tune the output.
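For reference, a Prometheus alert rule of the kind such a prompt would produce. The group name, metric (`http_requests_total`), threshold, and durations are the adjustable variables; treat them all as placeholders to tune for your service.

```yaml
groups:
  - name: service-alerts        # placeholder group name
    rules:
      - alert: HighErrorRate
        # Fire when >5% of requests over 5 minutes return a 5xx status.
        expr: |
          sum(rate(http_requests_total{status=~"5.."}[5m]))
            / sum(rate(http_requests_total[5m])) > 0.05
        for: 10m                # must hold for 10 minutes before firing
        labels:
          severity: page
        annotations:
          summary: "More than 5% of requests are failing"
```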