
Commit

fix: open ws on catai remote
* fix #65

* fix: docs

* fix: wait for ws to open

---------

Co-authored-by: ido <[email protected]>
scenaristeur and ido-pluto authored Jan 26, 2024
1 parent 3b0d750 commit cc54125
Showing 7 changed files with 43 additions and 19 deletions.
14 changes: 7 additions & 7 deletions docs/api.md
@@ -73,18 +73,18 @@ while (true) {
 ### Advanced API

 This API is only available only in Node.js.
+[demo](../examples/remotecall.js)

-```ts
-import {RemoteCatAI} from 'catai';
-import progress from 'progress-stream';
+```js
+import { RemoteCatAI } from "catai";

-const catai = new RemoteCatAI('ws://localhost:3000');
+const catai = new RemoteCatAI("ws://localhost:3000");

-const response = await catai.prompt('Write me 100 words story', token => {
-    progress.stdout.write(token);
+const response = await catai.prompt("Write me 100 words story", (token) => {
+    process.stdout.write(token);
 });

 console.log(`Total text length: ${response.length}`);

 catai.close();
 ```
2 changes: 1 addition & 1 deletion docs/configuration.md
@@ -20,7 +20,7 @@ You can config the model by the following steps:

 [LLamaChatPromptOptions](https://withcatai.github.io/node-llama-cpp/api/type-aliases/LLamaChatPromptOptions)

-You can edit the [systemPrompt](system_prompt.md) of the chat too.
+You can edit the [systemPrompt](system-prompt.md) of the chat too.


 3. Restart the server.
8 changes: 4 additions & 4 deletions docs/system_prompt.md → docs/system-prompt.md
@@ -1,15 +1,15 @@
-# CatAi system_prompt
+# CatAi system-prompt

 According to https://withcatai.github.io/node-llama-cpp/api/type-aliases/LlamaChatSessionOptions,
-it is possible to modify the system_prompt of a chat.
+it is possible to modify the system-prompt of a chat.

 This can be achieved by adding a systemPrompt key in modelSettings

-![catAi systemPrompt settings](system_prompt/settings.png)
+![CatAi systemPrompt settings](system-prompt/settings.png)


 Save and restart to apply.

 Then the chat act like a pirate according to the systemPrompt you choose ;-)

-![catAi systemPrompt demo](system_prompt/demo.png)
+![CatAi systemPrompt demo](system-prompt/demo.png)
File renamed without changes
File renamed without changes
13 changes: 13 additions & 0 deletions examples/remotecall.js
@@ -0,0 +1,13 @@
+import { RemoteCatAI } from "catai";
+
+const catai = new RemoteCatAI("ws://localhost:3000");
+
+catai.on("open", async () => {
+    console.log("Connected");
+    const response = await catai.prompt("Write me 100 words story", (token) => {
+        process.stdout.write(token);
+    });
+
+    console.log(`Total text length: ${response.length}`);
+    catai.close();
+});
25 changes: 18 additions & 7 deletions server/src/server/remote/remote-catai.ts
@@ -1,10 +1,11 @@
-import WebSocket, {ClientOptions} from 'ws';
-import {ClientRequestArgs} from 'http';
-import {ChatContext} from '../../manage-models/bind-class/chat-context.js';
+import WebSocket, { ClientOptions } from 'ws';
+import { ClientRequestArgs } from 'http';
+import { ChatContext } from '../../manage-models/bind-class/chat-context.js';

 export default class RemoteCatAI extends ChatContext {
     private _ws: WebSocket;
     private _closed = false;
+    private _promiseOpen?: Promise<void>;

     /**
      * Connect to remote CatAI server, and use it as a chat context
@@ -28,10 +29,19 @@ export default class RemoteCatAI extends ChatContext {
             if (this._closed) return;
             this.emit('error', 'Connection closed: ' + code);
         });
+
+        this._ws.on('open', () => {
+            this.emit("open");
+        });
+
+        this._promiseOpen = new Promise((resolve, reject) => {
+            this.once('open', resolve);
+            this.once('error', reject);
+        });
     }

     private _onMessage(message: string) {
-        const {event, value} = JSON.parse(message);
+        const { event, value } = JSON.parse(message);
         switch (event) {
             case 'token':
                 this.emit('token', value);
@@ -49,14 +59,15 @@ export default class RemoteCatAI extends ChatContext {
     }

     private _send(event: 'prompt' | 'abort', value: string) {
-        this._ws.send(JSON.stringify({event, value}));
+        this._ws.send(JSON.stringify({ event, value }));
     }

     abort(reason?: string): void {
         this._send('abort', reason || 'Aborted by user');
     }

-    prompt(prompt: string, onToken?: (token: string) => void): Promise<string | null> {
+    async prompt(prompt: string, onToken?: (token: string) => void): Promise<string | null> {
+        await this._promiseOpen;
         this._send('prompt', prompt);

         let buildText = '';
@@ -66,7 +77,7 @@ export default class RemoteCatAI extends ChatContext {
         };
         this.on('token', tokenEvent);

-        return new Promise<string | null>((resolve, reject) => {
+        return await new Promise<string | null>((resolve, reject) => {
             this.once('error', reject);
             this.once('modelResponseEnd', () => {
                 this.off('token', tokenEvent);
