vellum-ai v0.12.4
[![npm shield](https://img.shields.io/npm/v/vellum-ai)](https://www.npmjs.com/package/vellum-ai) ![license badge](https://img.shields.io/github/license/vellum-ai/vellum-client-node)
# Vellum Node Library
The Vellum Node SDK provides access to the Vellum API from JavaScript/TypeScript in Node.js environments.

> [!NOTE]
> This SDK is not intended for client-side JavaScript environments (i.e., web browsers).
## API Docs

You can find Vellum's complete API docs at [docs.vellum.ai](https://docs.vellum.ai).
## Installation

```shell
npm install --save vellum-ai
# or
yarn add vellum-ai
```
## Usage

```typescript
import { VellumClient } from "vellum-ai";

const vellum = new VellumClient({
  apiKey: "<YOUR_API_KEY>",
});

async function main() {
  const result = await vellum.executePrompt({
    promptDeploymentName: "<your-deployment-name>",
    releaseTag: "LATEST",
    inputs: [
      {
        type: "STRING",
        name: "<input_name>",
        value: "<example-string-value>",
      },
    ],
  });

  if (result.state === "REJECTED") {
    throw new Error(result.error.message);
  } else if (result.state === "FULFILLED") {
    console.log(result.outputs[0].value);
  }
}

void main();
```
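The `state` checks above rely on the result being a discriminated union, so TypeScript narrows the type in each branch. A minimal self-contained sketch of the same pattern (the types here are illustrative, not the SDK's actual definitions):

```typescript
// Illustrative discriminated union mirroring the FULFILLED/REJECTED
// result shape used above (not the SDK's actual type definitions).
type PromptResult =
  | { state: "FULFILLED"; outputs: { value: string }[] }
  | { state: "REJECTED"; error: { message: string } };

function firstOutput(result: PromptResult): string {
  if (result.state === "REJECTED") {
    // Narrowed: `error` exists only on the REJECTED branch.
    throw new Error(result.error.message);
  }
  // Narrowed: `outputs` exists only on the FULFILLED branch.
  return result.outputs[0].value;
}
```

Checking `state` before touching `error` or `outputs` is what lets the compiler guarantee you never read a field that isn't there.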
> [!TIP]
> You can set the `VELLUM_API_KEY` environment variable to avoid writing your API key in your code. To do so, add `export VELLUM_API_KEY=<your-api-token>` to your `~/.zshrc` or `~/.bashrc`, open a new terminal, and any code calling `VellumClient()` will read this key.
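If you prefer to make that fallback explicit in code rather than rely on the client's default behavior, a small helper (hypothetical, not part of the SDK) could look like:

```typescript
// Hypothetical helper: prefer an explicitly passed key, otherwise fall
// back to the VELLUM_API_KEY environment variable described above.
function resolveApiKey(explicit?: string): string {
  const key = explicit ?? process.env.VELLUM_API_KEY;
  if (!key) {
    throw new Error("No API key: pass one explicitly or set VELLUM_API_KEY.");
  }
  return key;
}

// Usage (assuming VellumClient is imported as in the example above):
// const vellum = new VellumClient({ apiKey: resolveApiKey() });
```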
## Contributing

While we value open-source contributions to this SDK, most of this library is generated programmatically.

Please feel free to make contributions to any of the following directories or files:

- `tests/*`
- `README.md`
Any additions made to files beyond those directories and files above would have to be moved over to our generation code (found in the separate vellum-client-generator repo), otherwise they would be overwritten upon the next generated release. Feel free to open a PR as a proof of concept, but know that we will not be able to merge it as-is. We suggest opening an issue first to discuss with us!