Error: localrag.py #10
I followed the setup steps, and I'm getting an error after running the following command:

python localrag.py

Error logs:
Traceback (most recent call last):
  File "/home/ubu1/easy-local-rag/localrag.py", line 130, in <module>
    response = ollama.embeddings(model='mxbai-embed-large', prompt=content)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubu1/miniconda3/envs/ragtest1/lib/python3.12/site-packages/ollama/_client.py", line 198, in embeddings
    return self._request(
           ^^^^^^^^^^^^^^
  File "/home/ubu1/miniconda3/envs/ragtest1/lib/python3.12/site-packages/ollama/_client.py", line 73, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: failed to generate embedding
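The traceback points at the ollama.embeddings() call in the loop over vault.txt, so one way to narrow the problem down is to embed the chunks one at a time and report which one the server rejects. This is only a debugging sketch, not part of the repo: the file name vault.txt and the model mxbai-embed-large come from the issue, everything else (script name, loop structure) is assumed.

```python
# debug_embed.py -- hedged sketch: find which vault.txt chunk fails to embed.
# Assumes a running Ollama server with mxbai-embed-large pulled and the
# ollama Python client installed; names are illustrative, not from the repo.
import ollama

with open("vault.txt", "r", encoding="utf-8") as f:
    chunks = f.readlines()

for i, chunk in enumerate(chunks, start=1):
    text = chunk.strip()
    if not text:
        print(f"line {i}: skipped (blank line)")
        continue
    try:
        ollama.embeddings(model="mxbai-embed-large", prompt=text)
        print(f"line {i}: ok")
    except ollama.ResponseError as e:
        # Same error class as in the traceback above; the printed line
        # number identifies the chunk the server refused to embed.
        print(f"line {i}: FAILED -> {e.error}")
```

If only blank or oddly formatted lines fail, that matches the reports below that the vault contents, rather than the script itself, trigger the error.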
Comments

hello! strange. Have you uploaded context to be embedded in vault.txt?

same problem here (PS D:\VM\AI\easy-local-rag> python .\localrag.py gives the same traceback); yes, I have context in the vault file.

same here... it works when the vault is empty (python3 localrag.py).

The problem is the input data. If you upload, for example, a well-structured JSON file, it works like a charm.

I had the same issue, and I think it's because of the results saved in the vault.txt file.
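Putting those observations together (an empty vault works, well-structured input works), a possible workaround is to sanitize vault.txt before running localrag.py so every line is a non-empty, reasonably sized chunk. This is only a sketch based on the comments above, not behavior of the repo's upload scripts; the 1000-character cap, the in-place rewrite, and the guess that blank lines are the culprit are all assumptions.

```python
# clean_vault.py -- hedged sketch: normalize vault.txt so each line is a
# non-empty, moderately sized chunk before localrag.py tries to embed it.
# The file name and the 1000-character cap are assumptions, not repo defaults.
MAX_CHARS = 1000

with open("vault.txt", "r", encoding="utf-8") as f:
    # Collapse stray whitespace so each line becomes a single clean chunk.
    lines = [" ".join(line.split()) for line in f]

cleaned = []
for line in lines:
    if not line:
        continue  # drop blank lines; an empty prompt is a plausible cause of the error
    # Split very long lines into smaller pieces to keep prompts manageable.
    for start in range(0, len(line), MAX_CHARS):
        cleaned.append(line[start:start + MAX_CHARS])

with open("vault.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(cleaned) + "\n")

print(f"wrote {len(cleaned)} cleaned chunks back to vault.txt")
```

If localrag.py still raises "failed to generate embedding" after cleaning, the next things to check are probably the model (ollama pull mxbai-embed-large) and the connection to the Ollama server.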