Model Predictions in Nucleus
By uploading model predictions to Nucleus, you can compare your predictions to ground truth annotations and discover problems with your models or dataset.
You can also upload predictions for unannotated data to enable curation and querying workflows. For instance, this can help you identify the most valuable subset of unlabeled data to label next.
Uploading Model Predictions
Prediction objects house the same information as Annotations, and can additionally contain model confidence and a probability distribution (PDF) across each class in the taxonomy.
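To make the structure concrete, here is a plain-Python sketch of what a box-style prediction carries beyond an annotation: a confidence score and a per-class PDF. This is an illustration of the concept, not the Nucleus SDK itself; the class and field names are hypothetical.

```python
# Illustrative only: a Prediction holds everything an Annotation does,
# plus model confidence and a per-class probability distribution (PDF).
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class BoxPredictionSketch:
    label: str                  # predicted class, as in an Annotation
    x: float                    # box geometry, as in an Annotation
    y: float
    width: float
    height: float
    reference_id: str           # dataset item the prediction belongs to
    confidence: float           # model confidence in `label`
    class_pdf: Dict[str, float] = field(default_factory=dict)  # full distribution

pred = BoxPredictionSketch(
    label="car", x=10, y=20, width=50, height=30,
    reference_id="image_1", confidence=0.9,
    class_pdf={"car": 0.9, "truck": 0.08, "bus": 0.02},
)

# The PDF spans the whole taxonomy and sums to 1; its argmax agrees
# with the predicted label, whose probability is the confidence.
assert abs(sum(pred.class_pdf.values()) - 1.0) < 1e-9
assert max(pred.class_pdf, key=pred.class_pdf.get) == pred.label
```

Carrying the full distribution rather than just the top label lets downstream tooling surface low-margin predictions, which are often the most useful items to review.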
Within Nucleus, models work as follows:
- Create a Model. You can do this just once and reuse the model on multiple datasets.
- Construct Prediction objects.
- Upload them to your Dataset under your Model.
- Trigger calculation of evaluation metrics (if your Dataset has ground truth Annotations).
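The four steps above can be sketched with the Nucleus Python client. The call names below follow the public SDK, but treat them as a sketch and check the current API reference; the API key, dataset ID, and reference IDs are placeholders.

```python
# Sketch of the model-prediction workflow with the Nucleus Python client
# (pip install scale-nucleus). Requires a real API key and dataset, so the
# steps are wrapped in a function rather than run at import time.

def upload_and_evaluate(api_key: str, dataset_id: str):
    import nucleus

    client = nucleus.NucleusClient(api_key)
    dataset = client.get_dataset(dataset_id)

    # 1. Create a Model (once; reusable across multiple datasets).
    model = client.create_model(name="My Detector", reference_id="my-detector-v1")

    # 2. Construct Prediction objects. `class_pdf` is the optional
    #    per-class probability distribution.
    prediction = nucleus.BoxPrediction(
        label="car", x=10, y=20, width=50, height=30,
        reference_id="image_1",  # the dataset item this prediction is for
        confidence=0.9,
        class_pdf={"car": 0.9, "truck": 0.1},
    )

    # 3. Upload the predictions to the Dataset under the Model.
    dataset.upload_predictions(model, [prediction])

    # 4. Trigger evaluation metrics against ground truth Annotations.
    dataset.calculate_evaluation_metrics(model)
    return model
```

Creating the model separately from the upload is what allows the same model to be compared across datasets, and across other models run on the same dataset.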
You'll then be able to debug your models against your ground truth qualitatively with queries and visualizations, or quantitatively with metrics, plots, and other insights. You can also compare multiple models that have been run on the same dataset.