
Commit

community[minor]: Fixed ChatWebLLM reload function and updated model name in example (#5671)

* [fix] Fix parameter order of reload and update model name in example

* Update example

---------

Co-authored-by: jacoblee93 <[email protected]>
kaiwinut and jacoblee93 authored Jun 7, 2024
1 parent 66ff096 commit 2de06ad
Showing 2 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion examples/src/models/chat/integration_webllm.ts
@@ -13,7 +13,7 @@ import { HumanMessage } from "@langchain/core/messages";
// Or by importing it via:
// import { prebuiltAppConfig } from "@mlc-ai/web-llm";
const model = new ChatWebLLM({
-  model: "Phi2-q4f32_1",
+  model: "Phi-3-mini-4k-instruct-q4f16_1-MLC",
chatOptions: {
temperature: 0.5,
},
6 changes: 3 additions & 3 deletions libs/langchain-community/src/chat_models/webllm.ts
@@ -28,7 +28,7 @@ export interface WebLLMCallOptions extends BaseLanguageModelCallOptions {}
* ```typescript
* // Initialize the ChatWebLLM model with the model record.
* const model = new ChatWebLLM({
-  * model: "Phi2-q4f32_1",
+  * model: "Phi-3-mini-4k-instruct-q4f16_1-MLC",
* chatOptions: {
* temperature: 0.5,
* },
@@ -79,8 +79,8 @@ export class ChatWebLLM extends SimpleChatModel<WebLLMCallOptions> {

async reload(
modelId: string,
-  newAppConfig?: webllm.AppConfig,
-  newChatOpts?: webllm.ChatOptions
+  newChatOpts?: webllm.ChatOptions,
+  newAppConfig?: webllm.AppConfig
) {
await this.engine.reload(modelId, newChatOpts, newAppConfig);
}
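The hunk above swaps the wrapper's two optional parameters so their declared order matches the order in which they are forwarded to `this.engine.reload`. As a minimal sketch of why this matters — using hypothetical stand-in types (`Engine`, `ChatOptions`, `AppConfig`, `reloadFixed`), not the real `@mlc-ai/web-llm` API — a wrapper that forwards positionally must declare its parameters in the same order as the method it wraps, or callers following the engine's documented order get their arguments misassigned:

```typescript
type ChatOptions = { temperature?: number };
type AppConfig = { modelList?: string[] };

class Engine {
  // The underlying engine takes (modelId, chatOpts, appConfig), in that order.
  reload(modelId: string, chatOpts?: ChatOptions, appConfig?: AppConfig): string {
    const temp = chatOpts?.temperature ?? "default";
    const models = appConfig?.modelList?.length ?? 0;
    return `${modelId} temp=${temp} models=${models}`;
  }
}

// After the fix, the wrapper declares its optional parameters in the same
// order as Engine.reload. Before, it declared (modelId, appConfig, chatOpts)
// while forwarding (modelId, chatOpts, appConfig), so a caller following the
// engine's order would have chatOptions and appConfig swapped.
function reloadFixed(
  engine: Engine,
  modelId: string,
  chatOpts?: ChatOptions,
  appConfig?: AppConfig
): string {
  return engine.reload(modelId, chatOpts, appConfig);
}

console.log(
  reloadFixed(new Engine(), "Phi-3-mini-4k-instruct-q4f16_1-MLC", { temperature: 0.5 })
);
// → Phi-3-mini-4k-instruct-q4f16_1-MLC temp=0.5 models=0
```

Because both trailing parameters are optional, TypeScript cannot always catch the swap at call sites, which is why aligning the declared order with the forwarded order is the safe fix.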
