Re-add demo to show Count tokens (#296)
* Add `GenerativeModel.count_tokens` demonstration

* Format notebook with nbfmt

* Update information about tokens

* Update python_quickstart.ipynb

* Update python_quickstart.ipynb

* nbfmt

---------

Co-authored-by: Mark McDonald <[email protected]>
mayureshagashe2105 and markmcd authored Mar 14, 2024
1 parent 3cf5a79 commit c908a50
Showing 1 changed file with 58 additions and 0 deletions.
58 changes: 58 additions & 0 deletions site/en/tutorials/python_quickstart.ipynb
@@ -1034,6 +1034,64 @@
" display(to_markdown(f'**{message.role}**: {message.parts[0].text}'))"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "AEgVOYu0pAr4"
},
"source": [
"## Count tokens\n",
"\n",
"Large language models have a context window, and the context length is often measured in terms of the **number of tokens**. With the Gemini API, you can determine the number of tokens for any `glm.Content` object. In the simplest case, you can pass a query string to the `GenerativeModel.count_tokens` method as follows:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "eLjBmPCLpElk"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"total_tokens: 7"
]
}
],
"source": [
"model.count_tokens(\"What is the meaning of life?\")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "oM2_U8pmpHQA"
},
"source": [
"Similarly, you can pass the `ChatSession` history to `GenerativeModel.count_tokens` to count the tokens used so far in a multi-turn conversation:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "i0MUU4BZpG4_"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"total_tokens: 501"
]
}
],
"source": [
"model.count_tokens(chat.history)"
]
},
{
"cell_type": "markdown",
"metadata": {
Expand Down
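For readers skimming the diff, the idea the new cells demonstrate can be made concrete with a toy sketch. The whitespace splitter below is a hypothetical stand-in for a real tokenizer, not the Gemini API: real tokenizers split text into subword units, which is why the notebook's query reports 7 tokens rather than a 6-word count.

```python
def rough_token_count(text: str) -> int:
    """Toy stand-in for a tokenizer: one token per whitespace-separated word.

    Real tokenizers (including Gemini's) split into subword units, so their
    counts are usually higher than a plain word count.
    """
    return len(text.split())


query = "What is the meaning of life?"
print(rough_token_count(query))  # 6 words, whereas the Gemini API reports 7 tokens
```

The gap between the two numbers is the point: a context window is budgeted in tokenizer-specific tokens, so only the model's own counter (here, `GenerativeModel.count_tokens`) gives a reliable measure.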
