add docs for annotating an experiment (#599)
samnoyes authored Dec 18, 2024
2 parents ebfe7b8 + a600492 commit c58d7e8
Showing 2 changed files with 3 additions and 0 deletions.
3 changes: 3 additions & 0 deletions docs/evaluation/how_to_guides/annotation_queues.mdx
@@ -73,6 +73,9 @@ To assign runs to an annotation queue, either:

3. [Set up an automation rule](../../../observability/how_to_guides/monitoring/rules) that automatically assigns runs which pass a certain filter and sampling condition to an annotation queue.

4. Select one or more experiments from the dataset page and click **Annotate**. In the resulting popup, either create a new queue or add the runs to an existing one:
![](./static/annotate_experiment.png)
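The selection step above is done in the UI, but the underlying idea of picking which runs to queue for review can be sketched in plain Python. The run records, field names, and helper below are purely hypothetical for illustration; they are not the LangSmith API:

```python
# Hypothetical sketch: choose which runs to send to an annotation queue
# based on a user feedback score. The run dicts and field names here are
# illustrative only -- in practice the LangSmith UI handles this for you.

def select_runs_for_annotation(runs, feedback_key, max_score):
    """Return IDs of runs whose feedback score is at or below a threshold."""
    selected = []
    for run in runs:
        score = run.get("feedback", {}).get(feedback_key)
        if score is not None and score <= max_score:
            selected.append(run["id"])
    return selected

runs = [
    {"id": "run-1", "feedback": {"thumbs_up": 1}},  # thumbs up
    {"id": "run-2", "feedback": {"thumbs_up": 0}},  # thumbs down
    {"id": "run-3", "feedback": {}},                # no feedback yet
]

# Queue only runs that received a thumbs-down (score 0) for human review.
print(select_runs_for_annotation(runs, "thumbs_up", 0))  # ['run-2']
```

This mirrors the filter-and-sample logic an automation rule applies: runs without feedback are skipped rather than queued, so reviewers only see runs a user actually scored.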

:::tip

It is often useful to assign runs that receive a particular user feedback score (e.g., thumbs up or thumbs down) in your application to an annotation queue. This lets you identify and address the issues causing user dissatisfaction.
docs/evaluation/how_to_guides/static/annotate_experiment.png: binary image added (not displayable in the diff view).
