
Q: anyone got success using local koboldcpp instead of llama-cpp-python? #116

Open
okias opened this issue Aug 26, 2024 · 0 comments
okias commented Aug 26, 2024

As the subject says: I need OpenCL support, and since llama-cpp dropped CLBlast, I tried to use koboldcpp instead, but it doesn't seem to produce anything useful so far. If it works for anyone, I would be grateful for the config you use.
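Not a full answer, but one approach worth trying: koboldcpp exposes an OpenAI-compatible HTTP API (by default on port 5001, alongside its native `/api/v1/generate` route), so instead of loading a model through llama-cpp-python you can point the tool at a running koboldcpp server. A minimal sketch, assuming the default port and the `/v1/completions` route (the `build_completion_request` helper is hypothetical, not part of either project):

```python
import json
import urllib.request

# Assumed default: koboldcpp serving its OpenAI-compatible API locally.
KOBOLDCPP_URL = "http://localhost:5001/v1/completions"

def build_completion_request(prompt: str, max_tokens: int = 128,
                             temperature: float = 0.7) -> urllib.request.Request:
    """Build a ready-to-send POST request for a text completion."""
    payload = json.dumps({
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }).encode("utf-8")
    return urllib.request.Request(
        KOBOLDCPP_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("Hello, world")
# Against a running koboldcpp instance you would then do:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["text"])
```

If koboldcpp answers this request but the application still produces garbage, the problem is more likely in prompt templating than in the backend itself.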
