Should the ollama setting here point to my local configuration? #36

Open · labixiaoyuan opened this issue Jun 19, 2024 · 8 comments

Comments

@labixiaoyuan

[screenshot]

@labixiaoyuan (Author)

[screenshots of the added ollama configuration]
I added the ollama configuration in the code, but after starting it there is still no response, just like before.
[screenshot]
Could anyone offer some good advice?

@GALA009 commented Jun 20, 2024

Your local ollama needs to have the model that the TS code defines, and at the place where the API is requested you need to change it to a model you actually have locally.
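For illustration, a minimal sketch of what that change might look like, assuming the app calls Ollama's documented /api/generate endpoint; the endpoint, JSON fields, and the `response` field come from Ollama's API, while the function and constant names here are hypothetical, not the project's actual code:

```ts
// Sketch: describing an image with a locally pulled Ollama model.
// Endpoint and JSON fields follow Ollama's /api/generate API; the
// function name, URL constant, and prompt are illustrative only.
const OLLAMA_URL = "http://localhost:11434/api/generate";

async function describeImage(base64Image: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "moondream:1.8b-v2-fp16", // must match a model you actually pulled
      prompt: "Describe this image.",
      images: [base64Image],           // base64 data, without a data: prefix
      stream: false,                   // one JSON object instead of a stream
    }),
  });
  const data = await res.json();
  return data.response;                // Ollama puts the generated text here
}
```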

@labixiaoyuan (Author)

I have already configured the local model at the API call site. From what you are saying, I also need the model that the TS code defines?

@ojeffreyo commented Jun 29, 2024

I added the API in almost the same way, and I ran into almost the same problem.
The picture below shows how I added the local Ollama API.
[screenshot: API key configuration]

Then I ran the Ollama server and started the OpenGlass web page, connecting the XIAO-ESP32-S3 via BLE. The ESP32-S3 did pass a picture to the Ollama server every time it took one. But when I typed some words into the text box, nothing appeared on the web page.
This is the terminal while running OpenGlass and Ollama with moondream.
[screenshot: terminal output]

And this is the web page.
[screenshot: web page]

So I tried to capture the traffic between the OpenGlass localhost and the Ollama server with Wireshark. I found that the Ollama server definitely processed the images and generated the corresponding outputs; they just never showed up on the web page.
Here is the output captured by Wireshark.
[screenshot: Wireshark capture of the Ollama reply]

So my question is: what should I do to make the messages appear in the chat box on the web page?
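One possible cause, offered here purely as an assumption (the thread does not confirm it): with `stream: true`, which is Ollama's default, the server replies with newline-delimited JSON rather than a single object, so a frontend expecting a different payload shape parses nothing even though the server generated text. A minimal sketch of consuming that stream; the `response` and `done` fields are Ollama's, everything else is illustrative:

```ts
// Sketch: reading Ollama's default streaming (NDJSON) response.
// Each line is a JSON object carrying a `response` fragment; the
// final line has `done: true`.
async function readOllamaStream(res: Response): Promise<string> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let text = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    let newline: number;
    while ((newline = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line.length === 0) continue;
      const chunk = JSON.parse(line);
      text += chunk.response ?? ""; // accumulate the generated fragments
    }
  }
  return text;
}
```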

@HaoHoo commented Jul 5, 2024

You may try 'llava-llama3' on ollama. It is a language model with vision support.

@HaoHoo commented Jul 6, 2024

> So my question is: what should I do to make the messages appear in the chat box on the web page?

Same question but not the same situation.

I changed 'imageDescription.ts' to use the 'llava-llama3' model on the local ollama runtime.
I still cannot get any response on the Expo page when typing in the chat area,
but I can find the model's response in the browser developer tools.

[screenshot, 2024-07-06 14:28:39]

When I used the Groq API, I could get model responses in the first round, but they were lost once the image showed up on the page.
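For the chat-box path specifically, here is a hedged sketch of a non-streaming call to Ollama's /api/chat endpoint, in case the page's chat handler and the server disagree on the payload shape. The endpoint, message shape, and `message.content` field are from Ollama's documented API; the function name, model, and wiring are illustrative assumptions:

```ts
// Sketch: sending a chat-box message to a local Ollama model via /api/chat.
// With stream: false the reply is a single JSON object whose text lives
// at `message.content`.
async function chat(userText: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llava-llama3",
      messages: [{ role: "user", content: userText }],
      stream: false,
    }),
  });
  const data = await res.json();
  return data.message.content; // assistant reply text
}
```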

@HaoHoo commented Jul 8, 2024

If you use the 'moondream' model as in the original code, you can run 'ollama pull moondream:1.8b-v2-fp16' to download the model.
[screenshot]
You can also refer to the recording demo:
https://www.loom.com/share/4c5666ef283f4b33b4705a21c71fc461
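After pulling, it may help to confirm the exact tag the server knows about, since a model name in the code that does not match any locally pulled model is one of the failure modes discussed above. A small sketch using Ollama's /api/tags listing endpoint; the endpoint and response shape are documented, the function name is illustrative:

```ts
// Sketch: listing locally available Ollama models to verify the pull.
// GET /api/tags returns { models: [{ name, ... }, ...] }.
async function listLocalModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}

// e.g. expect "moondream:1.8b-v2-fp16" to appear in the returned list
listLocalModels().then((names) => console.log(names));
```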

@ojeffreyo

> If you use the 'moondream' model as in the original code, you can run 'ollama pull moondream:1.8b-v2-fp16' to download the model. You can also refer to the recording demo: https://www.loom.com/share/4c5666ef283f4b33b4705a21c71fc461

Thank you for the material you provided; I will try it today.
