# 🪄📙🔨 Spellbook Forge
Make your LLM prompts executable and version controlled.
## Quick Start
Install the package:

```shell
yarn add spellbook-forge
```

Then, in your Express server:

```typescript
import express from "express";
import { spellbookForge } from "spellbook-forge";

const app = express()
  .use(spellbookForge({
    gitHost: 'https://github.com'
  }))
  .listen(3000); // port matches the example URL below
```
and then:

```
http://localhost:3000/your/repository/prompt?execute

<-- HTTP 200
{
  "prompt-content": "Complete this phrase in coders’ language: Hello …",
  "model": "gpt3.5",
  "result": "Hello, World!"
}
```
## 🤔 What is this?
This is an ExpressJS middleware that lets you create an API interface for your LLM prompts. It automatically generates endpoints for prompts stored in a git repository. Using Spellbook, you can:

- Store and manage LLM prompts in a familiar tool: a git repository
- Execute prompts with a chosen model and get results through a simple API (see the client sketch after this list)
- Perform basic CRUD operations on prompts
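
Because prompts are exposed over plain HTTP, any client can execute them. Below is a minimal TypeScript sketch that calls the `?execute` endpoint from the Quick Start; the host, repository path, and the `executePrompt` helper are illustrative, not part of the library.

```typescript
// Minimal client sketch (assumptions: the server from the Quick Start is
// running on localhost:3000, and the repository path below is a placeholder).
interface ExecuteResponse {
  "prompt-content": string; // the prompt text stored in the repository
  model: string;            // the model used to execute the prompt
  result: string;           // the model's output
}

async function executePrompt(promptPath: string): Promise<ExecuteResponse> {
  // The `?execute` query runs the prompt, as shown in the Quick Start example.
  const res = await fetch(`http://localhost:3000/${promptPath}?execute`);
  if (!res.ok) {
    throw new Error(`Request failed: HTTP ${res.status}`);
  }
  return (await res.json()) as ExecuteResponse;
}

executePrompt("your/repository/prompt").then((r) => console.log(r.result));
```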
Note: It's an early version. Expect bugs, breaking changes and poor performance.
## 📖 Documentation
Full documentation coming soon!
### Prompt format

Prompts must adhere to a specific format (JSON/YAML). See more info here.
#### Example

```
├── prompt1
│   ├── prompt.json
│   └── readme.md
└── collection
    └── prompt2
        ├── prompt.yaml
        └── readme.md
```
The file structure above generates the following API endpoints:

```
{host}/prompt1
{host}/collection/prompt2
```
### Files

- `prompt.json`: the main file with the prompt content and configuration (an illustrative sketch follows below).
- `readme.md`: additional information about prompt usage, examples, etc.
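
The exact prompt schema is defined in the format documentation referenced above. Purely as an illustration, a `prompt.yaml` might hold the prompt text and target model that appear in the API response earlier; the field names below are assumptions, not the documented format.

```yaml
# Illustrative sketch only; these field names are assumptions,
# not the documented prompt schema.
prompt-content: "Complete this phrase in coders’ language: Hello …"
model: "gpt3.5"
```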