Commit 5b66d64

Update documentation

schorndorfer committed Nov 14, 2023
1 parent dad052e commit 5b66d64
Showing 11 changed files with 430 additions and 570 deletions.
Binary file added _images/testing.png
17 changes: 17 additions & 0 deletions _sources/analyze-text.ipynb
@@ -658,6 +658,23 @@
"```\n",
":::"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"```{figure} ./images/testing.png\n",
"---\n",
"width: 600px\n",
"name: testing\n",
"---\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": []
}
],
"metadata": {
312 changes: 156 additions & 156 deletions _sources/augmented-generation.ipynb

Large diffs are not rendered by default.

9 changes: 6 additions & 3 deletions analyze-text.html

Large diffs are not rendered by default.

538 changes: 165 additions & 373 deletions augmented-generation.html

Large diffs are not rendered by default.

10 changes: 7 additions & 3 deletions function-calling.html
@@ -545,7 +545,7 @@ <h2>Flight Status Query Project<a class="headerlink" href="#flight-status-query-
</div>
</div>
<div class="cell_output docutils container">
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>ChatCompletion(id=&#39;chatcmpl-8KdsKyVKlTTj7r1Th1avjlaUgNckb&#39;, choices=[Choice(finish_reason=&#39;function_call&#39;, index=0, message=ChatCompletionMessage(content=None, role=&#39;assistant&#39;, function_call=FunctionCall(arguments=&#39;{\n &quot;airline_code&quot;: &quot;UA&quot;,\n &quot;flight_number&quot;: 792,\n &quot;day&quot;: 12,\n &quot;month&quot;: 11,\n &quot;year&quot;: 2023\n}&#39;, name=&#39;get_flight_status&#39;), tool_calls=None))], created=1699930532, model=&#39;gpt-3.5-turbo-0613&#39;, object=&#39;chat.completion&#39;, system_fingerprint=None, usage=CompletionUsage(completion_tokens=48, prompt_tokens=119, total_tokens=167))
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>ChatCompletion(id=&#39;chatcmpl-8Kf6jYXZfCWc15UwbtGoLZnDI4cYM&#39;, choices=[Choice(finish_reason=&#39;function_call&#39;, index=0, message=ChatCompletionMessage(content=None, role=&#39;assistant&#39;, function_call=FunctionCall(arguments=&#39;{\n &quot;airline_code&quot;: &quot;UA&quot;,\n &quot;flight_number&quot;: 792,\n &quot;day&quot;: 12,\n &quot;month&quot;: 11,\n &quot;year&quot;: 2023\n}&#39;, name=&#39;get_flight_status&#39;), tool_calls=None))], created=1699935269, model=&#39;gpt-3.5-turbo-0613&#39;, object=&#39;chat.completion&#39;, system_fingerprint=None, usage=CompletionUsage(completion_tokens=48, prompt_tokens=119, total_tokens=167))
</pre></div>
</div>
</div>
@@ -628,7 +628,7 @@ <h2>Flight Status Query Project<a class="headerlink" href="#flight-status-query-
</div>
</div>
<div class="cell_output docutils container">
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>ChatCompletion(id=&#39;chatcmpl-8KdsMOi5piP8OLLyA4HZViaUSmb68&#39;, choices=[Choice(finish_reason=&#39;stop&#39;, index=0, message=ChatCompletionMessage(content=&#39;The flight status of UA 792 for November 12, 2023 is on time. It is scheduled to depart at 06:00 CST and arrive at 09:06 EST.&#39;, role=&#39;assistant&#39;, function_call=None, tool_calls=None))], created=1699930534, model=&#39;gpt-3.5-turbo-0613&#39;, object=&#39;chat.completion&#39;, system_fingerprint=None, usage=CompletionUsage(completion_tokens=40, prompt_tokens=150, total_tokens=190))
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>ChatCompletion(id=&#39;chatcmpl-8Kf6kN0wHNp9MTitsGjU30a1cfIov&#39;, choices=[Choice(finish_reason=&#39;stop&#39;, index=0, message=ChatCompletionMessage(content=&#39;The flight status of UA 792 for Nov 12, 2023 is as follows: \n\n- On time\n- Departing at 06:00 CST (Central Standard Time)\n- Arriving at 09:06 EST (Eastern Standard Time)&#39;, role=&#39;assistant&#39;, function_call=None, tool_calls=None))], created=1699935270, model=&#39;gpt-3.5-turbo-0613&#39;, object=&#39;chat.completion&#39;, system_fingerprint=None, usage=CompletionUsage(completion_tokens=53, prompt_tokens=150, total_tokens=203))
</pre></div>
</div>
</div>
@@ -643,7 +643,11 @@ <h2>Flight Status Query Project<a class="headerlink" href="#flight-status-query-
</div>
</div>
<div class="cell_output docutils container">
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>The flight status of UA 792 for November 12, 2023 is on time. It is scheduled to depart at 06:00 CST and arrive at 09:06 EST.
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>The flight status of UA 792 for Nov 12, 2023 is as follows:

- On time
- Departing at 06:00 CST (Central Standard Time)
- Arriving at 09:06 EST (Eastern Standard Time)
</pre></div>
</div>
</div>
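The function-calling outputs in this hunk return the model's arguments as a JSON string in `function_call.arguments`. A minimal sketch of parsing that payload before dispatching to the real `get_flight_status` function (standard library only; the validation helper is an illustrative addition, not part of the repo):

```python
import json

# Arguments string as returned in function_call.arguments
# (copied from the ChatCompletion output above).
arguments = '{\n "airline_code": "UA",\n "flight_number": 792,\n "day": 12,\n "month": 11,\n "year": 2023\n}'

def parse_flight_args(raw: str) -> dict:
    """Parse and sanity-check the JSON arguments the model supplies."""
    args = json.loads(raw)
    required = {"airline_code", "flight_number", "day", "month", "year"}
    missing = required - args.keys()
    if missing:
        raise ValueError(f"missing arguments: {sorted(missing)}")
    return args

args = parse_flight_args(arguments)
print(args["airline_code"], args["flight_number"])  # UA 792
```

Validating the decoded arguments before calling the real function is worthwhile because the model's JSON is generated text, not a schema-checked payload.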
Binary file modified objects.inv
Binary file not shown.
102 changes: 73 additions & 29 deletions reports/augmented-generation.err.log
@@ -1,16 +1,3 @@
Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/nbclient/client.py", line 778, in _async_poll_for_reply
msg = await ensure_async(self.kc.shell_channel.get_msg(timeout=new_timeout))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/jupyter_core/utils/__init__.py", line 189, in ensure_async
result = await obj
^^^^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/jupyter_client/channels.py", line 315, in get_msg
raise Empty
_queue.Empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/jupyter_cache/executors/utils.py", line 58, in single_nb_execution
executenb(
@@ -25,20 +12,77 @@ Traceback (most recent call last):
^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/nbclient/client.py", line 705, in async_execute
await self.async_execute_cell(
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/nbclient/client.py", line 1001, in async_execute_cell
exec_reply = await self.task_poll_for_reply
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/nbclient/client.py", line 802, in _async_poll_for_reply
error_on_timeout_execute_reply = await self._async_handle_timeout(timeout, cell)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/nbclient/client.py", line 852, in _async_handle_timeout
raise CellTimeoutError.error_from_timeout_and_cell(
nbclient.exceptions.CellTimeoutError: A cell timed out while it was being executed, after 30 seconds.
The message was: Cell execution timed out.
Here is a preview of the cell contents:
-------------------
['# Ask GPT-3 about the Python version', 'prompt = "How do I find an element by class name in the latest version of python selenium?"', '', '# Generate response using GPT-3', 'client = openai.OpenAI()']
...
[')', '', '# Display the generated text', 'generated_text = response.choices[0].message.content', 'print(f"Answer: {generated_text}")']
-------------------
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/nbclient/client.py", line 1058, in async_execute_cell
await self._check_raise_for_error(cell, cell_index, exec_reply)
File "/opt/homebrew/Caskroom/miniconda/base/envs/llm-env/lib/python3.11/site-packages/nbclient/client.py", line 914, in _check_raise_for_error
raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
------------------
# 1.) Load
url = "https://www.selenium.dev/documentation/webdriver/troubleshooting/upgrade_to_selenium_4/"
response = requests.get(url)
webpage_content = response.text

# 2.) Transform - Split the content into smaller chunks
text_splitter = CharacterTextSplitter(chunk_size=1500, separator="\n")
chunks = text_splitter.split_text(webpage_content)

# 3.) Embed
embeddings = OpenAIEmbeddings(openai_api_key=os.getenv('OPENAI_API_KEY'))
metadata = [{"source": url} for _ in range(len(chunks))] # Metadata for each chunk

# 4.) Store
store = FAISS.from_texts(chunks, embeddings, metadatas=metadata)
store.index = index

# 5.) Retrieve

# Build the question answering chain
chain = VectorDBQAWithSourcesChain.from_llm(
llm=OpenAI(openai_api_key=os.getenv('OPENAI_API_KEY'),
temperature=0, max_tokens=1500,
model_name='text-davinci-003'),
vectorstore=store
)

# Ask GPT a question
# question = "How do I find an element by class name in the latest version of python selenium? Show an example."
# result = chain({"question": question})

# Print the answer.
# print(f"Answer: {result['answer']}")

------------------

----- stderr -----
Created a chunk of size 4540, which is longer than the specified 1500
----- stderr -----
Created a chunk of size 1584, which is longer than the specified 1500
----- stderr -----
Created a chunk of size 52413, which is longer than the specified 1500
----- stderr -----
Created a chunk of size 1920, which is longer than the specified 1500
----- stderr -----
Created a chunk of size 3412, which is longer than the specified 1500
----- stderr -----
Created a chunk of size 3707, which is longer than the specified 1500
------------------

---------------------------------------------------------------------------
NameError Traceback (most recent call last)
Cell In[6], line 16
 14 # 4.) Store
 15 store = FAISS.from_texts(chunks, embeddings, metadatas=metadata)
---> 16 store.index = index
 18 # 5.) Retrieve
 19
 20 # Build the question answering chain
 21 chain = VectorDBQAWithSourcesChain.from_llm(
 22 llm=OpenAI(openai_api_key=os.getenv('OPENAI_API_KEY'),
 23 temperature=0, max_tokens=1500,
 24 model_name='text-davinci-003'),
 25 vectorstore=store
 26 )

NameError: name 'index' is not defined

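The "Created a chunk of size …" warnings in this log arise because a separator-based splitter can only cut at occurrences of the separator: a stretch of text containing no `"\n"` stays in one piece no matter what `chunk_size` says. A pure-Python sketch of that behavior (a simplification of LangChain's `CharacterTextSplitter`; the function name and greedy-merge details here are illustrative assumptions):

```python
def split_text(text: str, chunk_size: int = 1500, separator: str = "\n") -> list[str]:
    """Greedy separator-based splitter: merge pieces up to chunk_size.

    A piece that itself contains no separator cannot be cut further,
    so it may exceed chunk_size -- the cause of the warnings above.
    """
    pieces = text.split(separator)
    chunks, current = [], ""
    for piece in pieces:
        candidate = piece if not current else current + separator + piece
        if len(candidate) <= chunk_size:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = piece  # may itself be longer than chunk_size
    if current:
        chunks.append(current)
    for chunk in chunks:
        if len(chunk) > chunk_size:
            print(f"Created a chunk of size {len(chunk)}, "
                  f"which is longer than the specified {chunk_size}")
    return chunks

# A 2000-char run with no newline cannot be split below 1500.
chunks = split_text("a" * 2000 + "\n" + "b" * 100, chunk_size=1500)
```

The NameError at the end of the log is a separate bug: `store.index = index` assigns a variable that was never defined in the cell. Since `FAISS.from_texts` already builds its own index, that line can likely just be removed.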
2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.

10 changes: 5 additions & 5 deletions setting-up-openai.html
@@ -536,7 +536,7 @@ <h2>API<a class="headerlink" href="#api" title="Permalink to this heading">#</a>
</div>
</div>
<div class="cell_output docutils container">
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>ChatCompletion(id=&#39;chatcmpl-8KdsRXdMXZOtv5qsK7YSUa5EDhVIc&#39;, choices=[Choice(finish_reason=&#39;length&#39;, index=0, message=ChatCompletionMessage(content=&#39;As a lion tamer with extensive knowledge in zoology and large cat psychology, being in the same room as a lion is a profound mix of exhilaration, respect, and careful calculation. My academic background and practical experience help me understand lion behavior, which is crucial for safe interactions. The following aspects play into the experience:\n\n1. **Understanding Lion Behavior**: Being familiar with lion psychology and behavioral cues is essential. I am vigilant in watching for signs of agitation or distress, as well as indicators of curiosity or play. Recognizing these signals allows for better communication with the animal and helps prevent misunderstandings that could be dangerous.\n\n2. **Established Relationship**: My relationship with any lion is built over time through consistent interaction. Trust is a key component of this relationship, and it is earned through repeated positive experiences. This established relationship can make being in the same room with a lion less tense than it would be otherwise.\n\n3. **Safety Precautions**: Despite my expertise and established relationship with the lions, I never forget that they are powerful wild animals with instincts that can be unpredictable. I always have safety measures in place, which could include having trained colleagues nearby, keeping emergency equipment on hand, and ensuring there are barriers that can be quickly utilized if necessary.\n\n4. **&#39;, role=&#39;assistant&#39;, function_call=None, tool_calls=None))], created=1699930539, model=&#39;gpt-4-1106-preview&#39;, object=&#39;chat.completion&#39;, system_fingerprint=&#39;fp_a24b4d720c&#39;, usage=CompletionUsage(completion_tokens=256, prompt_tokens=53, total_tokens=309))
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>ChatCompletion(id=&#39;chatcmpl-8Kf6pg5DtYQhCaTMz4t40dWB2N6yQ&#39;, choices=[Choice(finish_reason=&#39;length&#39;, index=0, message=ChatCompletionMessage(content=&#39;As a lion tamer with extensive knowledge in zoology and large cat psychology, being in the same room as a lion is a profound mix of exhilaration, respect, and careful calculation. My academic background and practical experience help me understand lion behavior, which is crucial for safe interactions. The following aspects play into the experience:\n\n1. **Understanding Lion Behavior**: Being familiar with lion psychology, I can recognize subtle cues and signals in the lion’s body language, vocalizations, and overall demeanor. This allows me to predict and interpret their actions to a certain degree, helping to minimize risk for both the lion and myself.\n\n2. **Respect for Power and Wild Instinct**: Despite the lion’s potential habituation to human presence, I never forget that a lion is an apex predator with wild instincts. Their sheer physical power, with muscular bodies capable of taking down large prey, commands immense respect. Awareness of their capabilities is paramount.\n\n3. **Commanding Presence**: As a lion tamer, I must assert a calm but assertive presence to maintain authority without provoking aggressive responses. It’s necessary to exude confidence but not arrogance, as animals like lions are very adept at picking up on fear or uncertainty.\n\n4. **Safety Precautions**: I am always mindful&#39;, role=&#39;assistant&#39;, function_call=None, tool_calls=None))], created=1699935275, model=&#39;gpt-4-1106-preview&#39;, object=&#39;chat.completion&#39;, system_fingerprint=&#39;fp_a24b4d720c&#39;, usage=CompletionUsage(completion_tokens=256, prompt_tokens=53, total_tokens=309))
</pre></div>
</div>
</div>
@@ -559,13 +559,13 @@ <h2>API<a class="headerlink" href="#api" title="Permalink to this heading">#</a>
</div>
<p><span class="pasted-text">As a lion tamer with extensive knowledge in zoology and large cat psychology, being in the same room as a lion is a profound mix of exhilaration, respect, and careful calculation. My academic background and practical experience help me understand lion behavior, which is crucial for safe interactions. The following aspects play into the experience:

1. **Understanding Lion Behavior**: Being familiar with lion psychology and behavioral cues is essential. I am vigilant in watching for signs of agitation or distress, as well as indicators of curiosity or play. Recognizing these signals allows for better communication with the animal and helps prevent misunderstandings that could be dangerous.
1. **Understanding Lion Behavior**: Being familiar with lion psychology, I can recognize subtle cues and signals in the lion’s body language, vocalizations, and overall demeanor. This allows me to predict and interpret their actions to a certain degree, helping to minimize risk for both the lion and myself.

2. **Established Relationship**: My relationship with any lion is built over time through consistent interaction. Trust is a key component of this relationship, and it is earned through repeated positive experiences. This established relationship can make being in the same room with a lion less tense than it would be otherwise.
2. **Respect for Power and Wild Instinct**: Despite the lion’s potential habituation to human presence, I never forget that a lion is an apex predator with wild instincts. Their sheer physical power, with muscular bodies capable of taking down large prey, commands immense respect. Awareness of their capabilities is paramount.

3. **Safety Precautions**: Despite my expertise and established relationship with the lions, I never forget that they are powerful wild animals with instincts that can be unpredictable. I always have safety measures in place, which could include having trained colleagues nearby, keeping emergency equipment on hand, and ensuring there are barriers that can be quickly utilized if necessary.
3. **Commanding Presence**: As a lion tamer, I must assert a calm but assertive presence to maintain authority without provoking aggressive responses. It’s necessary to exude confidence but not arrogance, as animals like lions are very adept at picking up on fear or uncertainty.

4. **</span></p>
4. **Safety Precautions**: I am always mindful</span></p>
<div class="admonition-usage-https-platform-openai-com-usage admonition">
<p class="admonition-title"><a class="reference external" href="https://platform.openai.com/usage">Usage</a></p>
<figure class="align-default" id="openai-billing">
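Both lion-tamer completions in this hunk stop mid-sentence at 256 completion tokens with `finish_reason='length'`. A hedged sketch of detecting that truncation before displaying a response (the dataclasses are minimal stand-ins for the ChatCompletion objects printed above, not the real SDK types):

```python
from dataclasses import dataclass

@dataclass
class Message:
    content: str

@dataclass
class Choice:
    finish_reason: str
    message: Message

def render(choice: Choice) -> str:
    """Return the text, flagging completions cut off by max_tokens."""
    text = choice.message.content
    if choice.finish_reason == "length":
        # 'length' means the model ran out of tokens mid-answer,
        # as in the output above that stops at "4. **".
        text += "\n[truncated: raise max_tokens or continue the conversation]"
    return text

print(render(Choice("length", Message("...4. **"))))
```

Checking `finish_reason` is cheaper than heuristics on the text itself, and `'stop'` versus `'length'` distinguishes a complete answer from one that merely ran out of budget.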
