4 prompts
Set up a complete observability stack. Structured logging, Prometheus metrics, distributed tracing with OpenTelemetry — correlated with trace IDs.
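A minimal sketch of the log side of that correlation, using only the standard library: each log record is emitted as one JSON line carrying a trace ID, so logs can later be joined against traces. In a real stack the ID would come from the active OpenTelemetry span context; here a fake per-request ID stands in for it.

```python
import json
import logging
import uuid

class JsonFormatter(logging.Formatter):
    """Render each record as a single JSON line, carrying the trace ID."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "message": record.getMessage(),
            # Falls back to None when a record has no trace context attached.
            "trace_id": getattr(record, "trace_id", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Assumption: in production this would be the OpenTelemetry trace ID of the
# current span; here we generate one per request so logs, metrics exemplars,
# and traces can all be joined on the same value.
trace_id = uuid.uuid4().hex
logger.info("order placed", extra={"trace_id": trace_id})
```

Because every line is self-describing JSON with a `trace_id` field, a log backend can filter all logs for one trace with a single field query.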
Track LLM API costs across multiple providers. Token counting, cost calculation, budget alerts, and per-feature usage attribution.
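One way that tracking could be sketched: a price table keyed by (provider, model), a per-feature accumulator, and a budget check on every recorded call. The prices below are illustrative placeholders, not current provider pricing.

```python
from collections import defaultdict

# Illustrative per-1M-token prices in USD -- real prices vary by
# provider, model, and date; treat these as placeholders.
PRICES = {
    ("openai", "gpt-4o"): {"input": 2.50, "output": 10.00},
    ("anthropic", "claude-sonnet"): {"input": 3.00, "output": 15.00},
}

class CostTracker:
    """Accumulate LLM spend per feature and alert when a budget is crossed."""
    def __init__(self, budget_usd, alert_cb=print):
        self.budget = budget_usd
        self.alert = alert_cb
        self.by_feature = defaultdict(float)

    def record(self, provider, model, feature, in_tokens, out_tokens):
        p = PRICES[(provider, model)]
        # Prices are per million tokens, so scale down accordingly.
        cost = (in_tokens * p["input"] + out_tokens * p["output"]) / 1_000_000
        self.by_feature[feature] += cost
        total = sum(self.by_feature.values())
        if total > self.budget:
            self.alert(f"budget exceeded: ${total:.4f} of ${self.budget:.2f}")
        return cost
```

The per-feature dictionary is what makes usage attribution possible: summing it answers "what did feature X cost this month" without touching provider invoices.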
An advanced-level code prompt for creating a monitoring dashboard. Fill in the variables to customize the output for your specific needs.
Craft Prometheus alert rules with this structured code prompt. Adjustable variables let you fine-tune the output.
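A filled-in example of the kind of rule this prompt produces might look like the following. The metric name, threshold, and durations are placeholders to be replaced with values from your own recording rules.

```yaml
groups:
  - name: api-latency
    rules:
      - alert: HighRequestLatency
        # Hypothetical metric and threshold; tune to your service's SLOs.
        expr: histogram_quantile(0.95, sum(rate(http_request_duration_seconds_bucket[5m])) by (le)) > 0.5
        for: 10m
        labels:
          severity: warning
        annotations:
          summary: "p95 request latency above 500ms for 10 minutes"
```

The `for: 10m` clause keeps the alert pending until the condition has held continuously, which suppresses alerts for short spikes.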