Ollama Provider for the Vercel AI SDK
The Ollama Provider for the Vercel AI SDK contains language model support for the Ollama APIs and embedding model support for the Ollama embeddings API.
Requirements
This provider requires Ollama >= 0.5.0.
Setup
The Ollama provider is available in the ollama-ai-provider module. You can install it with:
npm i ollama-ai-provider
Provider Instance
You can import the default provider instance ollama from ollama-ai-provider:
import { ollama } from 'ollama-ai-provider';
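If you need a customized setup, such as an Ollama server running at a non-default URL, you can create your own provider instance instead of using the default one. A minimal sketch, assuming the module exposes a createOllama factory that accepts a baseURL option:
import { createOllama } from 'ollama-ai-provider';

const ollama = createOllama({
  // assumed option: point at an Ollama server other than the local default
  baseURL: 'http://localhost:11434/api',
});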
Example
import { ollama } from 'ollama-ai-provider';
import { generateText } from 'ai';
const { text } = await generateText({
model: ollama('phi3'),
prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
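Because the provider also covers the Ollama embeddings API, embedding models can be used with the AI SDK's embed function. A minimal sketch, assuming an ollama.embedding method and that the nomic-embed-text model has already been pulled locally:
import { ollama } from 'ollama-ai-provider';
import { embed } from 'ai';

const { embedding } = await embed({
  // assumed model id; any embedding model available in your Ollama install works
  model: ollama.embedding('nomic-embed-text'),
  value: 'sunny day at the beach',
});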
Documentation
Please check out the Ollama provider documentation for more information.