
# Adding your own self-hosted AI bot to Social Stream Ninja

In this guide we will do a VERY basic setup of the Llama 3 LLM, using Ollama, on Windows. The same setup should work well on a variety of systems, including machines with modern Nvidia GPUs and newer macOS systems.

## Installing Ollama

Download the installer from https://ollama.com and run it.
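
If you prefer installing from the command line, Ollama is also available through winget; the package ID below is an assumption based on how it is usually published, so run the search first to confirm it on your system:

```
REM Confirm the package ID first (assumed to be Ollama.Ollama)
winget search ollama
winget install Ollama.Ollama
```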


## Installing an LLM

There are many models to choose from; see https://ollama.com/library for the full list of options.

Social Stream Ninja targets Llama 3.2 by default, but you can specify which model to use in the Social Stream Ninja menu. For now, though, let's just use llama3.2.

To install the model, open Command Prompt (or Terminal) and run:

```
ollama pull llama3.2
```


Once the pull finishes, the model is installed locally, and listing your installed models should confirm it.
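
For example, these standard Ollama CLI commands will list everything you have installed and show details for the model we just pulled:

```
REM List all locally installed models
ollama list
REM Show details (architecture, parameters, license) for the model we just pulled
ollama show llama3.2
```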


If you need to remove it later, you can run `ollama rm llama3.2`.


The Ollama API is available by default at http://localhost:11434; if you open that address in your browser, it should say "Ollama is running". However, there is still a CORS issue to deal with if you're using the Chrome extension: out of the box, Ollama won't accept requests made by a Chrome extension.
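
Before tackling CORS, a quick sanity check from the command line confirms the API itself is reachable (curl ships with current Windows 10/11 builds); /api/tags is Ollama's endpoint for listing installed models:

```
REM Should return a JSON list of your installed models
curl http://localhost:11434/api/tags
```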

To work around this CORS issue on Windows, you can try closing Ollama from the taskbar (system tray) and then running the following:

```
REM Make sure no Ollama server is still running
taskkill /F /IM ollama.exe
REM Allow requests from Chrome extension origins (only lasts for the current session)
set OLLAMA_ORIGINS=chrome-extension://*
REM Start the Ollama server again with the new setting
ollama serve
```
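
To confirm the new origin is being accepted, you can send a request with a spoofed extension Origin header and look for an Access-Control-Allow-Origin header in the response; the extension ID below is a made-up placeholder, and the exact headers returned may vary by Ollama version:

```
REM -i prints the response headers; look for Access-Control-Allow-Origin in the output
REM The extension ID in the Origin header is a placeholder, not a real extension
curl -i -H "Origin: chrome-extension://abcdefghijklmnop" http://localhost:11434/api/tags
```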

To make this CORS permission permanent on Windows, add OLLAMA_ORIGINS=chrome-extension://* to your Windows user environment variables, then start (or restart) Ollama with ollama serve.
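
If you'd rather stay in the command line, the same user-level variable can be set persistently with setx; note that setx only affects newly started processes, so you still need to restart Ollama afterwards:

```
REM Persist the variable for your user account (applies to new processes only)
setx OLLAMA_ORIGINS "chrome-extension://*"
```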


This allows us to access Ollama from our Social Stream Ninja extension.
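
For reference, the extension is just talking to Ollama's standard HTTP API; a request along the lines of this sketch (the prompt and options here are placeholders, not what Social Stream Ninja actually sends) is what a single completion looks like:

```
REM POST a prompt to Ollama's /api/generate endpoint; the prompt text is only an example
curl http://localhost:11434/api/generate -d "{\"model\": \"llama3.2\", \"prompt\": \"Say hello to the chat\", \"stream\": false}"
```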

I don't believe you need to worry about CORS if you're using the Social Stream Ninja standalone app. HOWEVER, if you are using the standalone app and are running into issues, you can set OLLAMA_ORIGINS to *. If you're comfortable doing so, you can set this via "Edit the system environment variables" in Windows instead of via the command line; be sure to close and re-open Ollama if you do. Below is how I have it set up, and it works with both the extension AND the standalone app.

(Screenshot: OLLAMA_ORIGINS set in the Windows environment variables dialog.)
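
If you want that broader setting but prefer the command line, the same setx approach works; keep in mind that * allows any origin to reach your local Ollama API, so only use it on a machine you trust:

```
REM Allow any origin to call the local Ollama API (broad; use with care)
setx OLLAMA_ORIGINS "*"
```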

If you want to access Ollama directly from the dock.html page with custom.js commands, you may need to host Ollama behind a reverse proxy service; refer to the Ollama documentation for details on this.

## Usage

Just make sure the toggle is enabled, and that you have Ollama with Llama 3.2 installed and running locally, and you should be good to go.


The bot will respond automatically to chat if it thinks it's a good idea. There is a 5-second timeout per source site.


- Steve p.s. BLARGH!