Errors on vite.js project #142
ollama-js is just a library for interacting with the Ollama API, which you need to be hosting separately. The reason you get ERR_CONNECTION_REFUSED with the first code is that nothing is listening on that port (presumably because you're not running Ollama). The reason you get a 404 with the second code is that your Vite project doesn't have an /api/chat endpoint; the Vite dev server only serves your front-end, not the Ollama API.
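A quick way to verify which side is failing (a sketch, assuming Ollama's default port): the server's root endpoint returns a short status string, so if this fetch is refused, your ollama-js calls will be refused too.

```js
// Connectivity check against a locally running Ollama server.
// Assumes the default host/port, http://127.0.0.1:11434.
const res = await fetch('http://127.0.0.1:11434/')
console.log(await res.text()) // "Ollama is running" when the server is up
```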
Ok, thank you for your response.
The purpose of Ollama is to locally host LLMs, yes
Ollama isn't a library, ollama-js is
You can still use your ollama-js code to interact with the Ollama API however you like; the Ollama API isn't limited in what you can do with it.
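For example, a minimal sketch using the client class that ollama-js exports (the model name assumes you've already pulled llama3.1):

```js
import { Ollama } from 'ollama'

// Point the client at your locally running Ollama server (default shown)
const client = new Ollama({ host: 'http://127.0.0.1:11434' })

const response = await client.chat({
  model: 'llama3.1', // any model you have pulled locally
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
```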
Thank you for your response.
Ollama is designed for you to download models via Ollama itself, not from external sites. You might still be able to use the model you have already downloaded by creating a Modelfile, which is documented here.
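A minimal sketch, assuming the GGUF file sits next to the Modelfile (the filename and the model name my-llama3.1 are placeholders):

```
# Modelfile: point FROM at the weights you already downloaded
FROM ./Meta-Llama-3.1-8B.gguf

# Optional system message baked into the model
SYSTEM """You are a helpful assistant."""
```

Then register it with Ollama and run it:

```sh
ollama create my-llama3.1 -f Modelfile
ollama run my-llama3.1
```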
No, I let that go:
Not sure about other OSs, but on Windows it downloads to
If you mean through Ollama: Ollama doesn't have a built-in web UI, you need a third-party one such as open-webui. All Ollama does is provide an API for interfacing with models created via Ollama.
Would you mind linking me to what you are referencing as the documentation? Because the docs seem perfectly clear to me.
Hi,
Follow step 1: Follow step 2:
Secondly, what's the point of CREATE, PULL, PUSH?
"GGUF", but what is that?
Modelfile: 'You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.'
I think I'll tell my client that his AI aspirations are doomed to failure, especially as he doesn't have the colossal resources (and neither do I) to embark on this adventure.
All the best,
@Raf-sns this has nothing to do with ollama-js; you just need to familiarize yourself with Ollama itself, and fortunately it's dead simple: https://github.com/ollama/ollama/blob/main/README.md#quickstart
What are you referring to here? Linux isn't required to use Ollama, and nothing in the Ollama docs, or the README I assume you're reading, mentions installing Linux.
As I keep explaining, Ollama is just an API; it does not have a built-in web UI, so there is no URL to go to. You interact with it either directly from the terminal, which is what ollama run is for, or programmatically through the HTTP API (see the commands below).
Correct
The llama3.1 model
This is clearly explained in the docs, and you shouldn't need to worry about those commands anyway for basic usage.
It's a file format for packaging compressed (quantized) model weights, used by llama.cpp and Ollama.
That Modelfile, or rather system message, is just an example. The point of that example is to show that you can set it to whatever you want!
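For basic usage, a couple of terminal commands cover everything mentioned above (a sketch; llama3.1 stands in for whichever model you want):

```sh
# PULL: download a model from the Ollama registry
ollama pull llama3.1

# Chat with it interactively in the terminal
ollama run llama3.1

# Or call the HTTP API directly (this is what ollama-js wraps)
curl http://127.0.0.1:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [{ "role": "user", "content": "Why is the sky blue?" }]
}'
```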
You seem to be assuming that Ollama does a lot of things it never claimed to do, and then getting frustrated because you can't figure out how to make it do those things. Please just understand that all Ollama does on its own is provide an API for other tools, such as ollama-js and open-webui, to interface with, as well as some terminal commands for downloading, creating, and testing models. Once you have Ollama installed and running, the Vite project you mentioned at the start of this issue should work as expected, and you can go from there. If all you want to do is interact with llama3.1, then you do not need to worry about any of the terminal commands, because the third-party tools you would traditionally use to interact with the Ollama API will handle all of that for you, including your JS code.
Hi,
Maybe, but it's my system, and Linux is mentioned in the two links below:
https://ollama.com/download ->
I don't want to argue; you asked me what I found unclear in the explanations and I simply answered you, with my frustration; otherwise, I would not have done it.
Regards,
Ok, I thought you were saying that step 1 to using Ollama was to install Linux itself; I'm guessing you meant installing the Linux version of Ollama.
Hi,
I'm trying to use ollama-js in a vite.js project.
What I did:
initialize a new vite.js project:
npm create vite@latest
Project name: test-ollama
Selected: Vanilla
Select a variant: JavaScript
cd test-ollama
npm i ollama
npm install
npm run dev
In main.js:
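Presumably something like this minimal ollama-js call (a sketch; the original snippet wasn't preserved, but the error below implies the library's default host, http://127.0.0.1:11434):

```js
import ollama from 'ollama/browser'

// The default export targets http://127.0.0.1:11434
const response = await ollama.chat({
  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Hello!' }],
})
console.log(response.message.content)
```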
Error in console:
POST http://127.0.0.1:11434/api/chat net::ERR_CONNECTION_REFUSED
What I tried:
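A guess from the 404 below: the client was pointed at the Vite dev server's own origin via the host option, which has no /api/chat route.

```js
import { Ollama } from 'ollama/browser'

// Assumption: the second attempt targeted the Vite dev server (port 5173),
// which serves only the front-end and has no /api/chat endpoint.
const client = new Ollama({ host: 'http://127.0.0.1:5173' })
const response = await client.chat({
  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Hello!' }],
})
```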
Error in console:
POST http://127.0.0.1:5173/api/chat 404 (Not Found)
I don't really understand why I can't connect to Ollama.
Additionally, I downloaded the 8B version of Llama:
-> Meta-Llama-3.1-8B
I don't understand how I could connect this to Ollama.
My end goal is to serve a fine-tuned version of llama 8B on a website.
Thank you for your answers, kind regards,
Raf