From the Ollama docs: "If an empty prompt is provided [to the generate endpoint], the model will be loaded into memory." This is useful so that, for example, the user can be shown that the model is loading and can continue to edit their prompt before a response actually starts being generated.
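For example, a caller can already trigger this through the existing generate method, something like the following (the model name is just a placeholder):

```ts
import ollama from 'ollama'

// Sending an empty prompt to the generate endpoint loads the model into
// memory without producing a completion. 'llama3.2' is a placeholder name.
await ollama.generate({ model: 'llama3.2', prompt: '' })
```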
Technically, this could be solved by just making the `prompt` attribute of the `GenerateRequest` interface optional (ollama-js/src/interfaces.ts, line 49 in 6e7e496).
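Roughly, the relevant part of that interface looks like this (a paraphrased sketch, not a verbatim copy of interfaces.ts):

```ts
// Paraphrased sketch; see src/interfaces.ts for the authoritative definition.
export interface GenerateRequest {
  model: string
  prompt: string // currently required; the idea above would make this `prompt?: string`
  // ...plus further optional fields (system, template, images, options, etc.)
}
```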
However, I think it would be sensible for there to be a dedicated function, probably along the lines sketched below.
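Purely as an illustration (the name `preloadModel` and its exact signature are assumptions rather than a settled proposal), such a function could look like:

```ts
import ollama from 'ollama'

// Hypothetical sketch: the name `preloadModel` is illustrative only. In
// ollama-js itself this would presumably be a method on the Ollama client.
async function preloadModel(model: string) {
  // An empty prompt tells the generate endpoint to load the model into
  // memory without generating a response.
  return ollama.generate({ model, prompt: '' })
}

// Usage:
//   await preloadModel('llama3.2')
```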
This way, it is clearer why the `prompt` argument is optional (since someone might otherwise assume it behaves the same as providing an empty string as the prompt), and it is clear that none of the other attributes of `GenerateRequest` have any effect when doing this.

But, because I understand that `ollama-js` is generally supposed to be 1:1 with the actual API, I want to get some feedback on this first before I file a PR.
hopperelec changed the title from "Provide a type-safe way to generate from an empty prompt (to pre-load a model)" to "Provide a type-safe way to call generate without a prompt (to pre-load a model)" on Nov 11, 2024