Measure the performance of your application#

Gantry allows you to compute several quantities that help measure the performance of your machine learning application, including model performance metrics, distribution shift metrics, and projections. Gantry also allows you to define your own projections and deploy them to run on Gantry.

Compute model performance metrics#

If you have ingested feedback or labels about your model’s performance, you can visualize model performance metrics such as accuracy and mean squared error, available out of the box in both the dashboard and the SDK.

To view these metrics in the dashboard, look in the Performance section of the Overview page.

Visualizing performance in the dashboard#

To visualize metrics in the dashboard, go to the timeline view. Click the + button and select (or type) the metric you want to compute and the field(s) you want to compute it on. There are many metrics available, including accuracy_score, mean_squared_error, f1_score, precision_score, recall_score, etc.
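These metric names follow scikit-learn’s naming conventions. For intuition about what two of the most common ones compute on your predictions and feedback, here is a minimal pure-Python sketch (an illustration of the math only, not the Gantry SDK):

```python
def accuracy_score(y_true, y_pred):
    """Fraction of predictions that exactly match the feedback labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def mean_squared_error(y_true, y_pred):
    """Average squared difference between feedback values and predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Boolean predictions vs. boolean feedback -> accuracy
print(accuracy_score([True, False, False, True], [True, False, True, True]))  # 0.75

# Float predictions vs. float feedback -> mean squared error
print(mean_squared_error([3.0, 0.0, 2.0], [2.5, 0.0, 2.0]))  # ~0.0833
```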

Create Metric

Gantry automatically creates favorited metrics (visible at the bottom of the new chart creator) based on the data types of your predictions and feedback. Favorited metrics enable one-click creation of charts users are likely to want.

| Type of prediction | Type of feedback | Favorited metric |
| --- | --- | --- |
| Float | Float | mean_squared_error |
| Boolean | Boolean | accuracy_score |
| String | String | accuracy_score |
| Int | Int | accuracy_score |
| Float | Boolean | roc_auc_score |
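The last pairing combines float predictions with boolean feedback, where roc_auc_score treats the float as a ranking score. A minimal pure-Python sketch of what it computes (scikit-learn’s metric of the same name does this with more edge-case handling; this is an illustration, not the Gantry SDK):

```python
def roc_auc_score(y_true, y_score):
    """Probability that a randomly chosen positive example is scored
    above a randomly chosen negative one (ties count as half)."""
    pos = [s for y, s in zip(y_true, y_score) if y]
    neg = [s for y, s in zip(y_true, y_score) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Float predictions (scores) vs. boolean feedback
print(roc_auc_score([True, True, False, False], [0.9, 0.4, 0.6, 0.1]))  # 0.75
```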

To add or remove an existing metric from your favorites, click the 3-dots button on the right-hand side of a chart and select “Add to favorites” or “Remove from favorites.”

Favoriting A Metric

Compute distribution shift metrics#

Distribution shift metrics are currently supported in the SDK and are coming to the dashboard soon.