Model Predictions in Nucleus

By uploading model predictions to Nucleus, you can compare your predictions to ground truth annotations and discover problems with your models or dataset.

You can also upload predictions for unannotated data to enable curation and querying workflows. For instance, this can help you identify the most effective subset of unlabeled data to label next.
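One common curation pattern is to rank unlabeled items by model confidence and surface the least confident ones as labeling candidates. The sketch below is a plain-Python illustration with made-up item IDs and scores, not a Nucleus API:

```python
# Hypothetical per-item top-class confidences from a model run on unlabeled data.
unlabeled_confidences = {
    "image_1": 0.97,
    "image_2": 0.41,
    "image_3": 0.88,
    "image_4": 0.35,
}

def least_confident(confidences, k):
    """Return the k item ids the model is least sure about -- good labeling candidates."""
    return sorted(confidences, key=confidences.get)[:k]

to_label = least_confident(unlabeled_confidences, k=2)  # -> ["image_4", "image_2"]
```

In Nucleus itself you would express this kind of selection with queries over your uploaded predictions rather than by hand.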

Uploading Model Predictions

Prediction objects house the same information as Annotations, and can additionally contain a model confidence score and a class PDF (a probability distribution over every class in the taxonomy).
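As a rough picture of what a box-type Prediction carries beyond its Annotation counterpart, here is an illustrative dataclass. The field names mirror common detection conventions but are an assumption for illustration, not the Nucleus SDK's actual classes:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class BoxPredictionSketch:
    # Fields a ground truth box Annotation would also carry:
    label: str
    x: float
    y: float
    width: float
    height: float
    reference_id: str  # which dataset item this prediction belongs to
    # Model outputs that an Annotation would not have:
    confidence: float = 1.0                              # score for the predicted label
    class_pdf: Dict[str, float] = field(default_factory=dict)  # distribution over the taxonomy

    def __post_init__(self):
        # A class PDF, if supplied, should sum to 1 across the taxonomy.
        if self.class_pdf:
            assert abs(sum(self.class_pdf.values()) - 1.0) < 1e-6, "class PDF must sum to 1"

pred = BoxPredictionSketch(
    label="car", x=10, y=20, width=50, height=30,
    reference_id="image_1", confidence=0.8,
    class_pdf={"car": 0.8, "truck": 0.15, "pedestrian": 0.05},
)
```

Note that the predicted label is simply the highest-probability entry of the class PDF when one is provided.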

Within Nucleus, models work as follows:

  1. Create a Model. You can do this just once and reuse the model on multiple datasets.
  2. Construct Prediction objects.
  3. Upload them to your Dataset under your Model.
  4. Trigger calculation of evaluation metrics (if your Dataset has ground truth Annotations).
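The evaluation metrics in step 4 are computed by Nucleus once triggered. As an intuition for the kind of matching they involve, the sketch below scores a predicted box against a ground truth box with intersection-over-union (IoU); this is a plain-Python illustration, not the Nucleus implementation:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))  # horizontal overlap
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))  # vertical overlap
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

ground_truth = (0, 0, 100, 100)
prediction = (10, 10, 100, 100)
score = iou(ground_truth, prediction)  # 90x90 overlap over a 11900 union, ~0.68
```

A prediction is typically counted as a true positive when its IoU with some ground truth box of the same class clears a threshold (0.5 is a common choice).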

You'll then be able to debug your models against your ground truth qualitatively with queries and visualizations, or quantitatively with metrics, plots, and other insights. You can also compare multiple models that have been run on the same dataset.


What's Next

Get started by creating a Model!