Ollama Provider V2 for the Vercel AI SDK (now for SDK 5)

Ollama Provider V2 for the AI SDK was created because the original ollama-ai-provider is no longer actively maintained.

This provider now supports:

  • tool calling and tool streaming (see the tool calling example below)
  • enabling and disabling thinking for reasoning models

Setup

The Ollama provider is available in the ollama-ai-provider-v2 module. You can install it with:

npm i ollama-ai-provider-v2

To update an existing installation to the new major version, which supports AI SDK 5, run:

npm update ollama-ai-provider-v2

Provider Instance

You can import the default provider instance ollama from ollama-ai-provider-v2:

import { ollama } from 'ollama-ai-provider-v2';
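
If your Ollama server does not run on the default address, you can create a custom provider instance instead. A minimal sketch, assuming ollama-ai-provider-v2 keeps the createOllama factory and baseURL option of the original ollama-ai-provider:

import { createOllama } from 'ollama-ai-provider-v2';

const ollama = createOllama({
  // Assumed option: point this at your Ollama server if it does not
  // run on the default http://localhost:11434/api.
  baseURL: 'http://localhost:11434/api',
});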

Example

import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('llama3.2:latest'),
  prompt: 'Write a meaty lasagna recipe for 4 people.',
});
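
Streaming example

The same model works with the AI SDK's streamText if you want incremental output; a minimal sketch:

import { ollama } from 'ollama-ai-provider-v2';
import { streamText } from 'ai';

const { textStream } = streamText({
  model: ollama('llama3.2:latest'),
  prompt: 'Write a meaty lasagna recipe for 4 people.',
});

// Print the response as it arrives instead of waiting for the full text.
for await (const chunk of textStream) {
  process.stdout.write(chunk);
}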

Thinking mode toggle example

import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('qwen3:4b'),
  providerOptions: { ollama: { think: true } },
  prompt: 'Write a meaty lasagna recipe for 4 people, but really think about it',
});
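
Tool calling example

The tool support listed above goes through the AI SDK's standard tool API. A minimal sketch, assuming a model that supports tools; the getWeather tool and its schema are hypothetical:

import { ollama } from 'ollama-ai-provider-v2';
import { generateText, tool, stepCountIs } from 'ai';
import { z } from 'zod';

const { text } = await generateText({
  model: ollama('llama3.2:latest'),
  tools: {
    // Hypothetical example tool; replace with your own.
    getWeather: tool({
      description: 'Get the current weather in a city',
      inputSchema: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, temperature: 21 }),
    }),
  },
  // Allow a second step so the model can answer using the tool result.
  stopWhen: stepCountIs(2),
  prompt: 'What is the weather in Amsterdam?',
});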

Documentation

Please check out the Ollama provider documentation for more information.

About

Vercel AI Provider for running LLMs locally using Ollama
