Allow custom system prompt for Ollama functions (langchain-ai#3264)
jacoblee93 authored Nov 15, 2023
1 parent 4dbc702 commit 0e18ab0
Showing 3 changed files with 75 additions and 1 deletion.
10 changes: 10 additions & 0 deletions docs/core_docs/docs/integrations/chat/ollama_functions.mdx
@@ -43,3 +43,13 @@ import OllamaFunctionsExtraction from "@examples/models/chat/ollama_functions/ex
<CodeBlock language="typescript">{OllamaFunctionsExtraction}</CodeBlock>

You can see a LangSmith trace of what this looks like here: https://smith.langchain.com/public/31457ea4-71ca-4e29-a1e0-aa80e6828883/r

## Customization

Behind the scenes, this uses Ollama's JSON mode to constrain output to JSON, then passes tool schemas as JSON Schema into the prompt.

Because different models have different strengths, it may be helpful to pass in your own system prompt. Here's an example:

import OllamaFunctionsCustomPrompt from "@examples/models/chat/ollama_functions/custom_prompt.ts";

<CodeBlock language="typescript">{OllamaFunctionsCustomPrompt}</CodeBlock>
64 changes: 64 additions & 0 deletions examples/src/models/chat/ollama_functions/custom_prompt.ts
@@ -0,0 +1,64 @@
import { OllamaFunctions } from "langchain/experimental/chat_models/ollama_functions";
import { HumanMessage } from "langchain/schema";
import { PromptTemplate } from "langchain/prompts";

// Custom system prompt to format tools. You must encourage the model
// to wrap output in a JSON object with "tool" and "tool_input" properties.
const toolSystemPrompt =
  PromptTemplate.fromTemplate(`You have access to the following tools:
{tools}
To use a tool, respond with a JSON object with the following structure:
{{
  "tool": <name of the called tool>,
  "tool_input": <parameters for the tool matching the above JSON schema>
}}`);

const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
  toolSystemPrompt,
}).bind({
  functions: [
    {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA",
          },
          unit: { type: "string", enum: ["celsius", "fahrenheit"] },
        },
        required: ["location"],
      },
    },
  ],
  // You can set the `function_call` arg to force the model to use a function
  function_call: {
    name: "get_current_weather",
  },
});

const response = await model.invoke([
  new HumanMessage({
    content: "What's the weather in Boston?",
  }),
]);

console.log(response);

/*
  AIMessage {
    content: '',
    additional_kwargs: {
      function_call: {
        name: 'get_current_weather',
        arguments: '{"location":"Boston, MA","unit":"fahrenheit"}'
      }
    }
  }
*/
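The `arguments` field in the output above is a JSON string rather than an object, so it needs to be parsed before use. A minimal sketch of that step follows; the `WeatherArgs` type is illustrative and not part of the library:

```typescript
// The function_call arguments arrive as a JSON string matching the
// function's parameter schema. Parse them before use. `WeatherArgs`
// is a hypothetical type mirroring the schema defined above.
type WeatherArgs = { location: string; unit?: "celsius" | "fahrenheit" };

const rawArguments = '{"location":"Boston, MA","unit":"fahrenheit"}';
const parsed: WeatherArgs = JSON.parse(rawArguments);

console.log(parsed.location); // "Boston, MA"
```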
2 changes: 1 addition & 1 deletion langchain/src/experimental/chat_models/ollama_functions.ts
@@ -93,7 +93,7 @@ export class OllamaFunctions extends BaseChatModel<ChatOllamaFunctionsCallOption
    } else if (functions.length === 0) {
      functions.push(this.defaultResponseFunction);
    }
-   const defaultContent = await TOOL_SYSTEM_PROMPT.format({
+   const defaultContent = await this.toolSystemPrompt.format({
      tools: JSON.stringify(functions, null, 2),
    });
    const systemMessage = new SystemMessage({ content: defaultContent });
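The `format` call above substitutes the serialized function schemas into the prompt's `{tools}` placeholder. A minimal sketch of that substitution, an assumption for illustration rather than LangChain's actual `PromptTemplate` implementation:

```typescript
// Sketch (not the library's real code) of filling a `{tools}` placeholder
// with pretty-printed JSON schemas, mirroring `JSON.stringify(functions,
// null, 2)` in the diff above.
const templateText = `You have access to the following tools:
{tools}`;

const functions = [
  { name: "get_current_weather", description: "Get the current weather" },
];

const defaultContent = templateText.replace(
  "{tools}",
  JSON.stringify(functions, null, 2)
);
```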