Commit: Merge branch 'master' into erick/docs-platforms-providers
Showing 44 changed files with 279 additions and 140 deletions.
@@ -0,0 +1,21 @@

# KoNLPy

>[KoNLPy](https://konlpy.org/) is a Python package for natural language processing (NLP)
> of the Korean language.

## Installation and Setup

You need to install the `konlpy` Python package.

```bash
pip install konlpy
```

## Text splitter

See a [usage example](/docs/how_to/split_by_token/#konlpy).

```python
from langchain_text_splitters import KonlpyTextSplitter
```
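
A minimal sketch of splitting Korean text, assuming KoNLPy and its Java dependencies are installed; the sample sentence and chunk sizes are illustrative assumptions:

```python
from langchain_text_splitters import KonlpyTextSplitter

# KonlpyTextSplitter splits Korean text into sentences with KoNLPy
# before grouping them into chunks, like any other LangChain text splitter.
text_splitter = KonlpyTextSplitter(chunk_size=200, chunk_overlap=0)

korean_text = "안녕하세요. 코엔엘파이는 한국어 자연어 처리를 위한 파이썬 패키지입니다."  # example text (assumption)
docs = text_splitter.create_documents([korean_text])
print(docs)
```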
@@ -0,0 +1,32 @@

# Kùzu

>[Kùzu](https://kuzudb.com/) is a company based in Waterloo, Ontario, Canada.
> It provides a highly scalable, extremely fast, easy-to-use [embeddable graph database](https://github.com/kuzudb/kuzu).

## Installation and Setup

You need to install the `kuzu` Python package.

```bash
pip install kuzu
```

## Graph database

See a [usage example](/docs/integrations/graphs/kuzu_db).

```python
from langchain_community.graphs import KuzuGraph
```
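
A minimal sketch of wrapping a local embedded Kùzu database in `KuzuGraph`; the database path and schema here are illustrative assumptions:

```python
import kuzu

from langchain_community.graphs import KuzuGraph

# Create (or open) an embedded Kùzu database and define a tiny schema.
db = kuzu.Database("example_db")  # local path, assumption
conn = kuzu.Connection(db)
conn.execute("CREATE NODE TABLE Movie (name STRING, PRIMARY KEY(name))")

# Wrap the database so LangChain can inspect its schema and run Cypher queries.
graph = KuzuGraph(db)
print(graph.get_schema)
```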

## Chain

See a [usage example](/docs/integrations/graphs/kuzu_db/#creating-kuzuqachain).

```python
from langchain.chains import KuzuQAChain
```
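
Continuing the sketch above, `KuzuQAChain` pairs an LLM with the graph; the OpenAI chat model is just one assumed choice of LLM:

```python
from langchain.chains import KuzuQAChain
from langchain_openai import ChatOpenAI  # any chat model should work; assumed here

# The chain translates natural-language questions into Cypher,
# runs them against the Kùzu graph, and phrases the answer.
chain = KuzuQAChain.from_llm(
    llm=ChatOpenAI(temperature=0),
    graph=graph,  # the KuzuGraph instance from the previous sketch
    verbose=True,
)
chain.invoke("Which movies are in the database?")
```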
@@ -0,0 +1,32 @@

# LlamaIndex

>[LlamaIndex](https://www.llamaindex.ai/) is the leading data framework for building LLM applications.

## Installation and Setup

You need to install the `llama-index` Python package.

```bash
pip install llama-index
```

See the [installation instructions](https://docs.llamaindex.ai/en/stable/getting_started/installation/).

## Retrievers

### LlamaIndexRetriever

>It is used for question answering with sources over a LlamaIndex data structure.

```python
from langchain_community.retrievers.llama_index import LlamaIndexRetriever
```
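
A rough sketch of wiring the retriever up, assuming an index built with `llama-index`; the `./data` path and the `index=` keyword are assumptions, and the exact behavior depends on the installed `llama-index` version:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex  # assumes llama-index is installed

from langchain_community.retrievers.llama_index import LlamaIndexRetriever

# Build a LlamaIndex index over local files (./data is an assumed path).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Wrap the index as a LangChain retriever and query it.
retriever = LlamaIndexRetriever(index=index)
docs = retriever.invoke("What does the report conclude?")
```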

### LlamaIndexGraphRetriever

>It is used for question answering with sources over a LlamaIndex graph data structure.

```python
from langchain_community.retrievers.llama_index import LlamaIndexGraphRetriever
```
@@ -0,0 +1,24 @@

# LlamaEdge

>[LlamaEdge](https://llamaedge.com/docs/intro/) is the easiest and fastest way to run customized
> and fine-tuned LLMs locally or on the edge.
>
>* Lightweight inference apps: `LlamaEdge` is measured in MBs instead of GBs.
>* Native and GPU-accelerated performance.
>* Supports many GPUs and hardware accelerators.
>* Supports many optimized inference libraries.
>* Wide selection of AI / LLM models.

## Installation and Setup

See the [installation instructions](https://llamaedge.com/docs/user-guide/quick-start-command).

## Chat models

See a [usage example](/docs/integrations/chat/llama_edge).

```python
from langchain_community.chat_models.llama_edge import LlamaEdgeChatService
```
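
A minimal sketch of chatting through a running LlamaEdge API service; the service URL is a placeholder assumption, point it at your own deployment:

```python
from langchain_core.messages import HumanMessage, SystemMessage

from langchain_community.chat_models.llama_edge import LlamaEdgeChatService

# Point the chat model at a running LlamaEdge API service (URL is an assumption).
chat = LlamaEdgeChatService(service_url="http://localhost:8080")

messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = chat.invoke(messages)
print(response.content)
```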
@@ -0,0 +1,31 @@

# llamafile

>[llamafile](https://github.com/Mozilla-Ocho/llamafile) lets you distribute and run LLMs
> with a single file.
>
>`llamafile` makes open LLMs much more accessible to both developers and end users.
> It does this by combining [llama.cpp](https://github.com/ggerganov/llama.cpp) with
> [Cosmopolitan Libc](https://github.com/jart/cosmopolitan) into one framework that collapses
> all the complexity of LLMs down to a single-file executable (called a "llamafile")
> that runs locally on most computers, with no installation.

## Installation and Setup

See the [installation instructions](https://github.com/Mozilla-Ocho/llamafile?tab=readme-ov-file#quickstart).

## LLMs

See a [usage example](/docs/integrations/llms/llamafile).

```python
from langchain_community.llms.llamafile import Llamafile
```
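
A minimal sketch, assuming a llamafile has already been downloaded and is running in server mode (by default it listens on `http://localhost:8080`):

```python
from langchain_community.llms.llamafile import Llamafile

# Connects to the llamafile server's completion endpoint.
# The default base_url is http://localhost:8080; override it if your server differs.
llm = Llamafile()

print(llm.invoke("Tell me a joke about databases."))
```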

## Embedding models

See a [usage example](/docs/integrations/text_embedding/llamafile).

```python
from langchain_community.embeddings import LlamafileEmbeddings
```
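
A minimal sketch, assuming the llamafile server was started with embeddings enabled (for example with the `--embedding` flag):

```python
from langchain_community.embeddings import LlamafileEmbeddings

# Talks to the llamafile server's embedding endpoint (default http://localhost:8080).
embedder = LlamafileEmbeddings()

vector = embedder.embed_query("LangChain provider documentation")
print(len(vector))
```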