Analyze Models

The analytics view shows you metrics and stats about how your application is performing.

Analyze your model configurations with Gantry to:

  1. View metrics such as latency and spend, as well as derived metrics (called Projections in Gantry) like completion length, sentiment, toxicity, fluency, language, and coverage (see the logging sketch after this list).
  2. Easily filter and dig into interesting results, then add them to a dataset so they aren't missed by future test coverage.
  3. Visually compare two queries of your data.
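
As a rough illustration of where these numbers come from, here is a minimal sketch of logging a model call together with its latency, spend, and completion length. The `call_model` and `log_to_gantry` helpers are placeholders, not the Gantry SDK's actual API; check the SDK reference for the exact logging call and signature in your version.

```python
import time


def call_model(prompt: str) -> dict:
    """Placeholder for your own inference call (e.g. an LLM API request)."""
    return {"text": "example completion", "cost_usd": 0.0021}


def log_to_gantry(record: dict) -> None:
    """Placeholder for the Gantry SDK's logging call; see the SDK docs for the real API."""
    print(record)


def handle_request(prompt: str, config_version: int) -> str:
    start = time.perf_counter()
    response = call_model(prompt)
    latency_s = time.perf_counter() - start

    # Fields like these are what the analytics view aggregates: latency and spend
    # show up as metrics, while values such as completion length feed Projections.
    log_to_gantry({
        "inputs": {"prompt": prompt},
        "outputs": {"completion": response["text"]},
        "tags": {"config_version": config_version},
        "metrics": {
            "latency_s": latency_s,
            "spend_usd": response["cost_usd"],
            "completion_length": len(response["text"]),
        },
    })
    return response["text"]
```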

Analytics Dashboard

Comparing configuration versions 1 and 2

If you find an interesting case where your model is not performing as expected, you can add it directly to a dataset to be monitored in future evaluations. Click "View data" or scroll down to see the examples logged as your users interact with the model. Select the rows of interest, click "Add to dataset", and choose an existing dataset (or create a new one). From then on, every time you change your application and run an evaluation, that evaluation will include the production data that caused your model problems in the past.
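
For a purely illustrative picture of what that buys you, the sketch below treats an evaluation as replaying every example in the dataset against the current configuration and averaging a score, so past production failures are always part of the check. The `load_dataset`, `call_model`, and `score` helpers are hypothetical stand-ins; in practice Gantry runs the evaluation for you when you trigger it.

```python
def load_dataset(name: str) -> list[dict]:
    """Hypothetical stand-in for fetching the curated dataset of hard production cases."""
    return [{"prompt": "Summarize this ticket", "expected": "a short, faithful summary"}]


def call_model(prompt: str, config_version: int) -> str:
    """Hypothetical stand-in for running the new application configuration."""
    return "a short, faithful summary"


def score(output: str, expected: str) -> float:
    """Hypothetical stand-in for whatever metric the evaluation uses."""
    return 1.0 if output == expected else 0.0


def evaluate(dataset_name: str, config_version: int) -> float:
    examples = load_dataset(dataset_name)
    scores = [
        score(call_model(ex["prompt"], config_version), ex["expected"])
        for ex in examples
    ]
    # Average over the production cases that previously caused the model problems.
    return sum(scores) / len(scores)
```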