Commit 02a22fa

Deployed 2ebcaf4 with MkDocs version: 1.5.3

ossirytk committed May 1, 2024
1 parent bb90296 commit 02a22fa
Showing 7 changed files with 31 additions and 16 deletions.
4 changes: 4 additions & 0 deletions configs/index.html
@@ -175,6 +175,10 @@ <h3 id="basic-configs">Basic Configs</h3>
<td>spaCy/HuggingFace model name (needs to be installed)</td>
</tr>
<tr>
<td>CUSTOM_CSS</td>
<td>URL of the custom CSS file to be used by the application (see the example below this diff).</td>
</tr>
<tr>
<td>VECTOR_K</td>
<td>Fetch the k closest embeddings for MMR</td>
</tr>
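The diff doesn't show where CUSTOM_CSS gets set. Assuming the project's usual .env-style configuration file, the new option might be set like this (the URL, port, and file name are hypothetical, matching a locally running Flask static server):

<pre><code># Hypothetical entry: point CUSTOM_CSS at a stylesheet served by the local Flask static server
CUSTOM_CSS=http://127.0.0.1:5000/static/custom.css
</code></pre>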
2 changes: 1 addition & 1 deletion index.html
@@ -161,5 +161,5 @@ <h1 id="llama-cpp-chat-memory">llama-cpp-chat-memory</h1>

<!--
MkDocs version : 1.5.3
Build Date UTC : 2024-04-28 11:41:09.777907+00:00
Build Date UTC : 2024-05-01 08:21:35.760401+00:00
-->
6 changes: 6 additions & 0 deletions running_the_chatbot/index.html
@@ -113,10 +113,16 @@ <h3 id="running-the-chatbot">Running the chatbot</h3>
<p>If you call chainlit directly, the character name and avatar picture won't update.</p>
<p>Note: Currently something seems to be cached by chainlit. Until I find a way to clear the cache,
you need to call run_chat twice for changes to take effect.</p>
<p>Some browsers don't allow loading CSS files from local directories. For testing purposes there is a Flask script that runs a simple HTTP server serving stylesheets from the "static/" directory. You will need to run the Flask server in another terminal instance.</p>
<pre><code>cd src\llama_cpp_chat_memory
python -m run_chat
</code></pre>
<p>The chatbot should open in your browser.</p>
<p>Running Flask:</p>
<pre><code>hatch shell chat
cd .\src\llama_cpp_chat_memory\
flask --app flask_web_server run
</code></pre>
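The flask_web_server script itself isn't part of this diff. A minimal sketch of such a server, assuming all it needs to do is serve files from the "static/" directory and allow cross-origin requests so the chat UI can load them (names and details are assumptions, not the actual script):

<pre><code>## Minimal sketch of a static-file server like the one described above.
## Assumes Flask is installed; the real flask_web_server script may differ.
from flask import Flask, send_from_directory

# Disable Flask's built-in static route so we control the response headers ourselves
app = Flask(__name__, static_folder=None)


@app.route("/static/&lt;path:filename&gt;")
def serve_static(filename):
    # Serve stylesheets (e.g. custom.css) from the local "static/" directory
    response = send_from_directory("static", filename)
    # Allow the chat UI, served from a different origin, to fetch the file
    response.headers["Access-Control-Allow-Origin"] = "*"
    return response


if __name__ == "__main__":
    app.run(port=5000)
</code></pre>

Serving the files over HTTP sidesteps the browsers that refuse to load stylesheets from local file paths.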

</div>
</div><footer>
7 changes: 6 additions & 1 deletion running_the_env/index.html
@@ -107,12 +107,17 @@
<div class="section" itemprop="articleBody">

<h3 id="running-the-env">Running the env</h3>
<p>You'll need to run all the commands inside the virtual env.</p>
<p>You'll need to run all the commands inside the virtual env. Some browsers don't allow loading CSS files from local directories. For testing purposes there is a Flask script that runs a simple HTTP server serving stylesheets from the "static/" directory. You will need to run the Flask server in another terminal instance.</p>
<pre><code>hatch shell chat
(optional for cuda support)$env:FORCE_CMAKE=1
(optional for cuda support)$env:CMAKE_ARGS=&quot;-DLLAMA_CUBLAS=on&quot;
(optional for cuda support)pip install llama-cpp-python==VERSION --force-reinstall --upgrade --no-cache-dir --no-deps
cd src\llama_cpp_chat_memory
</code></pre>
<p>Running Flask:</p>
<pre><code>hatch shell chat
cd .\src\llama_cpp_chat_memory\
flask --app flask_web_server run
</code></pre>
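Once the Flask server is running, it may be worth sanity-checking that a stylesheet is actually reachable before pointing CUSTOM_CSS at it (the file name is hypothetical; 5000 is Flask's default port):

<pre><code>curl http://127.0.0.1:5000/static/custom.css
</code></pre>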

</div>
2 changes: 1 addition & 1 deletion search/search_index.json

Large diffs are not rendered by default.

26 changes: 13 additions & 13 deletions sitemap.xml
@@ -2,67 +2,67 @@
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/UNLICENSE/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/card_format/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/configs/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/creating_embeddings/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/examples/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/getting_started/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/named_entity_recognition/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/preparing_the_env/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/prompt_support/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/running_the_chatbot/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/running_the_env/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
<url>
<loc>https://ossirytk.github.io/llama-cpp-chat-memory/webscraping/</loc>
<lastmod>2024-04-28</lastmod>
<lastmod>2024-05-01</lastmod>
<changefreq>daily</changefreq>
</url>
</urlset>
Binary file modified sitemap.xml.gz
Binary file not shown.
