feat(cli): node llama cpp (#59)

* feat(cli): node-llama-cpp
* docs(troubleshooting): node-llama-cpp
* fix(node-llama-cpp): cli
* feat: beta version of node-llama-cpp
* ci: beta version
* fix: Markdown list
* fix: auto model tag
* fix: model settings
* docs: troubleshooting
* fix: node-llama-cpp beta deps
* feat(node-llama-cpp): update to the latest version & APIs, including function calling and JSON schema
* feat(api): new API that integrates with node-llama-cpp@beta
* ci: require approve
* fix: bump node-llama-cpp@beta to latest
* fix: better errors
Showing 33 changed files with 983 additions and 868 deletions.
# Troubleshooting

Some common problems and their solutions.

## I can't connect to the server

If the server disconnects without any error, the problem is most likely with the llama.cpp binaries.

The solution is to recompile the binaries:

```bash
catai cpp
```

## How do I change the download location?

You can configure the download location by setting the `CATAI_DIR` environment variable.

More environment variable options are documented in the [configuration reference](https://withcatai.github.io/catai/interfaces/_internal_.Config.html#CATAI_DIR).
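For example, you can export the variable before running any `catai` command. This is a minimal sketch; the path `~/models/catai` is just an illustration, and `CATAI_DIR` can point to any writable directory:

```shell
# Hypothetical example path; CATAI_DIR can be any writable directory.
export CATAI_DIR="$HOME/models/catai"

# Make sure the directory exists before models are downloaded into it.
mkdir -p "$CATAI_DIR"

# Subsequent catai commands in this shell will read CATAI_DIR from the environment.
echo "Models will be stored in: $CATAI_DIR"
```

To make the setting permanent, add the `export` line to your shell profile (e.g. `~/.bashrc` or `~/.zshrc`).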
## CUDA Support

If you have a GPU that supports CUDA but the server doesn't recognize it, try installing the CUDA toolkit and rebuilding the binaries.

Rebuild the binaries with CUDA support:

```bash
catai cpp --cuda
```

If an error occurs, check the CUDA troubleshooting guide [here](https://withcatai.github.io/node-llama-cpp/guide/CUDA#fix-the-failed-to-detect-a-default-cuda-architecture-build-error).
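Before rebuilding, it can help to confirm that the CUDA toolkit is actually visible on your system. This is a general check, not a CatAI-specific one; it assumes the toolkit installs the `nvcc` compiler onto your `PATH`:

```shell
# Check whether the CUDA compiler is on PATH before attempting a CUDA build.
if command -v nvcc >/dev/null 2>&1; then
  nvcc --version
else
  echo "nvcc not found: install the CUDA toolkit first"
fi
```

If `nvcc` is missing, rebuilding with `--cuda` is unlikely to succeed until the toolkit is installed.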
## Unsupported processor / Exit without error

If you have an unsupported processor, you can try rebuilding the binaries:

```bash
catai cpp
```