diff --git a/docs/evaluation/how_to_guides/annotation_queues.mdx b/docs/evaluation/how_to_guides/annotation_queues.mdx
index 429615e2..f42f70ba 100644
--- a/docs/evaluation/how_to_guides/annotation_queues.mdx
+++ b/docs/evaluation/how_to_guides/annotation_queues.mdx
@@ -73,7 +73,7 @@ To assign runs to an annotation queue, either:
 3. [Set up an automation rule](../../../observability/how_to_guides/monitoring/rules) that automatically assigns runs which pass a certain filter and sampling condition to an annotation queue.
 
-4. Select one or multiple experiments from the dataset page that you want a human to review and click "Annotate". From the resulting popup, you may either create a new queue or add to an existing one:
+4. Select one or multiple experiments from the dataset page and click **Annotate**. From the resulting popup, you may either create a new queue or add the runs to an existing one:
 
 ![](./static/annotate_experiment.png)
 
 :::tip