
Overview Dashboard#

The Overview dashboard displays aggregate metrics for your project, helping you track trends in usage, costs, and performance over time.


Accessing Overview#

  1. Select a project in the Navigator
  2. Click Overview in the project menu

Dashboard Charts#

Requests#

Track the volume of LLM calls over time:

  • Total requests — Number of API calls made
  • Success rate — Percentage of successful requests
  • Error breakdown — Types of errors encountered

Use this to identify:

  • Usage spikes (debugging runaway loops)
  • Error patterns (rate limiting, API issues)
  • Traffic trends (growth over time)
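The success rate and error breakdown above boil down to counting request outcomes. A minimal sketch, with illustrative status values (the actual outcome names depend on your provider's error types):

```python
# Classify request outcomes and compute the success rate,
# mirroring the dashboard's error-breakdown view.
from collections import Counter

# Illustrative outcomes; real data would come from your request logs.
outcomes = ["ok", "ok", "rate_limited", "ok", "timeout", "ok", "ok"]

breakdown = Counter(outcomes)                      # error types by count
success_rate = breakdown["ok"] / len(outcomes) * 100

print(breakdown.most_common())
print(f"success rate: {success_rate:.1f}%")
```

A spike in one error type (for example `rate_limited`) is the pattern the dashboard surfaces at a glance.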

Costs#

Monitor your LLM spending:

| Metric | Description |
| --- | --- |
| Total Cost | Estimated spend for the period |
| Cost by Model | Breakdown by model used |
| Cost Trend | Daily/weekly cost changes |

> **Cost Estimation**
>
> Costs are estimated based on published model pricing. Actual costs may vary based on your provider agreement.
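The estimate is simply token counts multiplied by published per-million-token prices. A minimal sketch, using hypothetical prices (substitute your provider's actual rates):

```python
def estimate_cost(input_tokens, output_tokens,
                  price_in_per_m, price_out_per_m):
    """Estimate spend as tokens times published per-million-token price."""
    return (input_tokens / 1_000_000) * price_in_per_m \
         + (output_tokens / 1_000_000) * price_out_per_m

# Hypothetical pricing: $3 per 1M input tokens, $15 per 1M output tokens.
cost = estimate_cost(1_000_000, 500_000, 3.0, 15.0)
print(f"${cost:.2f}")  # $10.50
```

Because this uses list prices rather than your negotiated rates, treat the dashboard figure as an approximation.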


Token Usage#

Understand your token consumption:

| Metric | Description |
| --- | --- |
| Input Tokens | Tokens sent in prompts |
| Output Tokens | Tokens in responses |
| Average per Request | Mean tokens per call |

This helps you:

  • Optimize prompts (reduce input tokens)
  • Control response length (manage output tokens)
  • Predict costs (tokens directly impact pricing)

Latency#

Monitor response times:

| Metric | Description |
| --- | --- |
| Average Latency | Mean response time |
| P50 / P95 / P99 | Latency percentiles |
| Latency Distribution | Histogram of response times |

Use this to:

  • Identify slow requests
  • Set realistic timeouts
  • Compare model performance
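Percentiles are more useful than the average here because a few slow outliers dominate the mean. A minimal nearest-rank sketch over illustrative latency samples:

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: the value at or below which
    p percent of the samples fall."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Illustrative response times in milliseconds; note the single slow outlier.
latencies_ms = [120, 95, 110, 480, 105, 130, 2200, 115, 100, 125]

for p in (50, 95, 99):
    print(f"P{p}: {percentile(latencies_ms, p)} ms")
```

Here the mean (~358 ms) is triple the P50, which is why timeouts are better set from P95/P99 than from the average.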

Time Filters#

Adjust the time range using the filter in the upper-left corner:

| Option | Range |
| --- | --- |
| Last Hour | Rolling 60 minutes |
| Last 24 Hours | Rolling day |
| Last 7 Days | Rolling week |
| Last 30 Days | Rolling month |
| Custom Range | Pick start and end dates |

Charts automatically update when you change the time filter.


Filtering by Labels#

If you tag requests with the `X-Loop-Custom-Label` header, you can filter the dashboard to show metrics for specific labels:

  1. Click the Labels dropdown
  2. Select one or more labels
  3. Charts update to show only matching traces

This is useful for comparing:

  • Different experiments or A/B tests
  • Features within the same project
  • User segments or environments
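Labels are attached on the request side. A minimal sketch of building a labeled request; the URL and payload are placeholders, and only the `X-Loop-Custom-Label` header name comes from this page:

```python
import json
import urllib.request

def build_request(url, payload, label):
    """Build an HTTP request tagged with a dashboard label."""
    headers = {
        "Content-Type": "application/json",
        # Label used for dashboard filtering, e.g. an experiment arm
        # or an environment name.
        "X-Loop-Custom-Label": label,
    }
    return urllib.request.Request(
        url, data=json.dumps(payload).encode(), headers=headers
    )

req = build_request(
    "https://example.invalid/v1/chat",   # placeholder endpoint
    {"prompt": "hi"},
    "experiment-b",
)
# urllib normalizes header names to capitalized form internally.
print(req.get_header("X-loop-custom-label"))
```

Using a consistent label per experiment arm or environment is what makes the label-filtered comparisons above possible.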

Exporting Data#

Export dashboard data for reporting or further analysis:

  1. Click Export in the dashboard header
  2. Choose format:
    • CSV — For spreadsheets
    • JSON — For programmatic processing
  3. Select time range and metrics to include
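A JSON export can then be processed programmatically. A minimal sketch; the field names (`date`, `requests`, `total_cost`) are illustrative, so check your actual export for the exact schema:

```python
import json

# Illustrative export payload; real exports come from the Export dialog.
export = '''[
  {"date": "2024-06-01", "requests": 1200, "total_cost": 4.18},
  {"date": "2024-06-02", "requests": 950,  "total_cost": 3.02}
]'''

rows = json.loads(export)
total_requests = sum(r["requests"] for r in rows)
total_cost = sum(r["total_cost"] for r in rows)

print(f"{total_requests} requests, ${total_cost:.2f}")
```

The same rows open directly in a spreadsheet if you choose CSV instead.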

Next Steps#

  • Drill Into Details — view the individual traces behind the metrics: Traces

  • Share with Team — set up a remote environment for team-wide visibility: Add Environment