@friendliai/ai-provider
Learn how to use the FriendliAI provider for the Vercel AI SDK.
Installation
You can install the package via npm:
npm i @friendliai/ai-provider
Credentials
The tokens required for model usage can be obtained from the Friendli Suite.
To use the provider, set the FRIENDLI_TOKEN
environment variable to your personal access token.
export FRIENDLI_TOKEN="YOUR_FRIENDLI_TOKEN"
Check the FriendliAI documentation for more information.
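If you prefer to fail fast when the token is missing, a small guard like the one below can help. This is a sketch; getFriendliToken is a hypothetical helper, not part of the SDK.

```typescript
// Hypothetical helper (not part of @friendliai/ai-provider): look up
// FRIENDLI_TOKEN in an environment map (e.g. process.env) and throw a
// descriptive error when it is missing, instead of failing on the first API call.
function getFriendliToken(env: Record<string, string | undefined>): string {
  const token = env.FRIENDLI_TOKEN;
  if (!token) {
    throw new Error(
      "FRIENDLI_TOKEN is not set. Create a personal access token in the Friendli Suite."
    );
  }
  return token;
}

console.log(getFriendliToken({ FRIENDLI_TOKEN: "flp_example" })); // prints the token
```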
Provider Instance
You can import the default provider instance friendli from @friendliai/ai-provider:
import { friendli } from "@friendliai/ai-provider";
Language Models
You can create FriendliAI models using a provider instance.
The first argument is the model id, e.g. meta-llama-3.1-8b-instruct.
const model = friendli("meta-llama-3.1-8b-instruct");
Example: Generating text
You can use FriendliAI language models to generate text with the generateText
function:
import { friendli } from "@friendliai/ai-provider";
import { generateText } from "ai";

const { text } = await generateText({
  model: friendli("meta-llama-3.1-8b-instruct"),
  prompt: "What is the meaning of life?",
});
Example: Using Enforcing Patterns (Regex)
Constrain your LLM's output to a specific pattern (e.g., CSV), a character set, or the characters of a specific language (e.g., Korean Hangul).
import { friendli } from "@friendliai/ai-provider";
import { generateText } from "ai";
const { text } = await generateText({
  model: friendli("meta-llama-3.1-8b-instruct", {
    regex: "[\n ,.?!0-9\uac00-\ud7af]*",
  }),
  maxTokens: 40,
  prompt: "Who is the first king of the Joseon Dynasty?",
});
console.log(text);
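To get a feel for what that pattern permits, you can test candidate strings against the same character class with a plain RegExp. This is a local sketch only; the actual enforcement happens server-side during generation.

```typescript
// The same character class as the regex option above: whitespace, basic
// punctuation, digits, and Hangul syllables (U+AC00 to U+D7AF).
const allowed = /^[\n ,.?!0-9\uac00-\ud7af]*$/;

const hangulOnly = "조선의 첫 왕은 태조입니다.";
const withLatin = "The first king was Taejo.";

console.log(allowed.test(hangulOnly)); // true: Hangul, spaces, and periods only
console.log(allowed.test(withLatin)); // false: Latin letters are not in the class
```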
Example: Using built-in tools (Beta)
If you use @friendliai/ai-provider, you can use the built-in tools via the tools option.
Built-in tools allow models to use tools to generate better answers. For example, a web:search tool can provide up-to-date answers to questions about current events.
import { friendli } from "@friendliai/ai-provider";
import { convertToCoreMessages, streamText } from "ai";
export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: friendli("meta-llama-3.1-8b-instruct", {
      tools: [
        { type: "web:search" },
        { type: "math:calculator" },
        { type: "code:python-interpreter" }, // and more tools..!!
      ],
    }),
    messages: convertToCoreMessages(messages),
  });

  return result.toDataStreamResponse();
}
FriendliAI language models can also be used in the streamText, generateObject, streamObject, and streamUI functions (see AI SDK Core and AI SDK RSC).
OpenAI Compatibility
You can also use @ai-sdk/openai, since the FriendliAI serverless endpoints are OpenAI-compatible:
import { createOpenAI } from "@ai-sdk/openai";
const friendli = createOpenAI({
  baseURL: "https://api.friendli.ai/serverless/v1",
  apiKey: process.env.FRIENDLI_TOKEN,
});