
add docs for annotation queue rubrics #568

Merged 1 commit on Dec 4, 2024
19 changes: 18 additions & 1 deletion docs/evaluation/how_to_guides/annotation_queues.mdx
@@ -14,11 +14,28 @@ While you can always [annotate runs inline](./annotate_traces_inline), annotatio
To create an annotation queue, navigate to the **Annotation queues** section through the homepage or left-hand navigation bar.
Then click **+ New annotation queue** in the top right corner.

![](./static/annotation_queue_form.png)
![](./static/create_annotation_queue_new.png)

### Basic Details

Fill in the form with the **name** and **description** of the queue.
You can also assign a **default dataset** to the queue, which streamlines sending the inputs and outputs of selected runs to datasets in your LangSmith workspace.

### Annotation Rubric

First, type some high-level instructions for your annotators, which will be shown in the sidebar on every run.
> **Reviewer (Contributor):** the word 'type' messed me up. Maybe "Begin by drafting some high-level"...
>
> **Author (Contributor):** oops, already merged; I agree, will fix this in a follow-up PR.

Next, click **+ Desired Feedback** to add feedback keys to your annotation queue. Annotators will be presented with these feedback keys on each run.
Add a description for each, as well as a short description of each category if the feedback is categorical.

![annotation queue rubric](./static/create_annotation_rubric.png)

Reviewers will see this:

![rubric for annotators](./static/rubric_for_annotators.png)
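Conceptually, a rubric like the one above is just structured data: a list of feedback keys, each with a description and, for categorical feedback, a short description per category. The sketch below is purely illustrative; the field names are assumptions for explanation, not LangSmith's actual schema.

```python
from dataclasses import dataclass, field

# Illustrative model of an annotation rubric as plain data.
# Field names are assumptions, not LangSmith's internal schema.
@dataclass
class FeedbackKey:
    name: str
    description: str
    # For categorical feedback, each category gets its own short description.
    categories: dict[str, str] = field(default_factory=dict)

rubric = [
    FeedbackKey(
        name="correctness",
        description="Is the answer factually correct?",
        categories={
            "correct": "No factual errors",
            "incorrect": "Contains at least one factual error",
        },
    ),
    FeedbackKey(
        name="tone",
        description="Rate the tone of the response from 0 (poor) to 1 (great)",
    ),
]

# Keys with categories defined are presented as categorical feedback.
categorical = [k.name for k in rubric if k.categories]
```

Thinking of the rubric this way can help when deciding up front which feedback should be categorical (a fixed set of labeled options) versus free-form or numeric.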

### Collaborator Settings

There are a few settings related to multiple annotators:

- **Number of reviewers per run**: This determines the number of reviewers that must mark a run as "Done" for it to be removed from the queue. If you check "All workspace members review each run," then a run will remain in the queue until all workspace members have marked it "Done".
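The reviewer-count rule above can be sketched as a small function: a run stays in the queue until the required number of reviewers have marked it "Done". This is an illustrative model of the described behavior, not LangSmith's implementation; all names are hypothetical.

```python
# Hypothetical sketch of the "Number of reviewers per run" rule:
# a run leaves the queue only once enough reviewers have marked it "Done".
def runs_remaining(done_marks: dict[str, set[str]], reviewers_required: int) -> list[str]:
    """Return run ids still awaiting review.

    done_marks maps each run id to the set of reviewers who marked it "Done".
    """
    return [
        run_id
        for run_id, done_by in done_marks.items()
        if len(done_by) < reviewers_required
    ]

done_marks = {
    "run-1": {"alice", "bob"},  # two reviewers marked this run "Done"
    "run-2": {"alice"},         # only one reviewer so far
}

remaining = runs_remaining(done_marks, reviewers_required=2)
```

With "All workspace members review each run" checked, `reviewers_required` would effectively equal the number of workspace members.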