
Tools do not work with ollama #254

Open
ColtonMcInroy opened this issue Jan 11, 2024 · 2 comments

Comments

@ColtonMcInroy

Changing the example MiddleSchoolMathAgent.ts from openai to ollama results in the following error:
TypeError: model.doGenerateToolCalls is not a function

@lgrammel
Collaborator

With Ollama, you need to specify how to map the model's text output to tool calls. There are some examples here: https://github.com/lgrammel/modelfusion/tree/main/examples/basic/src/model-provider/ollama

E.g.:

```ts
const { tool, args, toolCall, result } = await useTool(
  ollama
    .CompletionTextGenerator({
      model: "mistral",
      promptTemplate: ollama.prompt.Mistral,
      raw: true, // required when using a custom prompt template
      format: "json",
      temperature: 0,
      stopSequences: ["\n\n"], // prevent infinite generation
    })
    .withInstructionPrompt()
    .asToolCallGenerationModel(jsonToolCallPrompt.text()),

  new MathJsTool({ name: "calculator" }),
  "What's fourteen times twelve?"
);

console.log(`Tool call:`, toolCall);
console.log(`Tool: ${tool}`);
console.log(`Arguments: ${JSON.stringify(args)}`);
console.log(`Result: ${result}`);
```

That being said, chat with tools is not yet supported with Ollama, and the interface for creating tool models might change.

@Boorj

Boorj commented Dec 25, 2024

Update: it is now supported.

Ollama now supports tool calling with popular models such as Llama 3.1. This enables a model to answer a given prompt using tool(s) it knows about, making it possible for models to perform more complex tasks or interact with the outside world.
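As a sketch of what this looks like in practice (not from this thread): Ollama's native tool calling accepts a `tools` array on the `/api/chat` endpoint, using an OpenAI-style function schema. The `calculator` tool below is hypothetical, and the exact field names follow my reading of Ollama's API; this only builds the request body, it does not call a running Ollama server.

```typescript
// Sketch of an Ollama /api/chat request body with a tool definition.
// The "calculator" tool is hypothetical; a model that supports tool
// calling (e.g. llama3.1) may respond with message.tool_calls.
const requestBody = {
  model: "llama3.1",
  messages: [{ role: "user", content: "What's fourteen times twelve?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "calculator",
        description: "Evaluate a mathematical expression",
        parameters: {
          type: "object",
          properties: {
            expression: {
              type: "string",
              description: "The expression to evaluate",
            },
          },
          required: ["expression"],
        },
      },
    },
  ],
  stream: false, // request a single complete response
};

// This body would be POSTed to http://localhost:11434/api/chat.
console.log(JSON.stringify(requestBody, null, 2));
```

With native tool calling, the prompt-template workaround from the earlier comment (raw mode plus `asToolCallGenerationModel`) is no longer required for supported models.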

Ollama now supports structured outputs making it possible to constrain a model’s output to a specific format defined by a JSON schema.
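For structured outputs, Ollama accepts a JSON schema in the `format` field of the request (previously `format` only took the string `"json"`). A hedged sketch, again only constructing the request body; the country schema is illustrative, not part of Ollama's API:

```typescript
// Sketch: constraining model output to a JSON schema via "format".
// The schema below is an illustrative example, not a fixed shape.
const structuredRequest = {
  model: "llama3.1",
  messages: [{ role: "user", content: "Tell me about Canada." }],
  format: {
    type: "object",
    properties: {
      name: { type: "string" },
      capital: { type: "string" },
      languages: { type: "array", items: { type: "string" } },
    },
    required: ["name", "capital", "languages"],
  },
  stream: false,
};

// The response's message.content is then expected to be JSON
// matching the schema, which the caller can JSON.parse.
console.log(JSON.stringify(structuredRequest, null, 2));
```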
