# node-bitnet-cpp

A Node.js wrapper for the BitNet C++ implementation (currently v0.1.0).
⚠️ **Warning:** This package is very new and still evolving. The API will keep changing, and usage patterns may change as needed. We will release version 1.0.0 once the API is stable and the build process is reliable (for now, the build step must be run separately using Python). Feel free to use it as is; this notice is temporary and will be updated as the project matures.
We are considering adding a command that builds and converts the model and compiles the C++ code, similar to the official BitNet repository.
This project is inspired by and builds upon the work of the Microsoft BitNet team.
## Installation

```bash
npm install node-bitnet-cpp
```
## Building from source

To build the project from source:

1. Clone the repository:

   ```bash
   git clone https://github.com/SouthBridgeAI/node-bitnet-cpp.git
   cd node-bitnet-cpp
   ```

2. Install dependencies:

   ```bash
   bun install
   ```

3. Build the project:

   ```bash
   bun run build
   ```
## Usage

First, ensure you have the BitNet C++ executable and model file available.

```typescript
import BitNet from 'node-bitnet-cpp';

const bitnet = new BitNet('/path/to/bitnet/executable', '/path/to/model.gguf');

// 1. Awaitable promise
async function runInference() {
  try {
    const result = await bitnet.runInference("What is the capital of France?", {
      nPredict: 128,
      threads: 2,
      ctxSize: 2048,
      temperature: 0.8
    });
    console.log(result);
  } catch (error) {
    console.error('Error:', error);
  }
}

// 2. Async generator
async function streamInference() {
  try {
    for await (const chunk of bitnet.streamInference("Tell me a story about a robot.")) {
      console.log(chunk);
    }
  } catch (error) {
    console.error('Error:', error);
  }
}

// 3. Node stream
function useStream() {
  const stream = bitnet.createStream("Explain quantum computing.");
  stream.pipe(process.stdout);
}

runInference();
streamInference();
useStream();
```
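All three call styles expose the same underlying token stream. As a rough mental model (a hedged sketch only, using a stand-in generator rather than the real `bitnet` instance), the promise form is equivalent to accumulating the generator's chunks:

```javascript
// Sketch: how a runInference-style promise relates to a
// streamInference-style async generator. `fakeStream` is a stand-in
// for bitnet.streamInference; the real package drives the BitNet
// C++ executable instead of yielding canned chunks.
async function* fakeStream(prompt) {
  for (const chunk of ['Paris', ' is', ' the', ' capital.']) {
    yield chunk;
  }
}

async function runViaGenerator(prompt) {
  let result = '';
  for await (const chunk of fakeStream(prompt)) {
    result += chunk; // accumulate streamed chunks into the final string
  }
  return result;
}
```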
## API

### `new BitNet(execPath: string, modelPath: string)`

Creates a new BitNet instance.

- `execPath`: Path to the BitNet C++ executable
- `modelPath`: Path to the BitNet model file

### `runInference(prompt: string, options?: BitNetOptions): Promise<string>`

Runs inference and resolves with the complete result.

### `streamInference(prompt: string, options?: BitNetOptions): AsyncGenerator<string>`

Runs inference and yields output chunks as they become available.

### `createStream(prompt: string, options?: BitNetOptions): Readable`

Creates a readable stream of the inference output.

### `BitNetOptions`

- `nPredict`: Number of tokens to predict (default: 128)
- `threads`: Number of threads to use (default: 2)
- `ctxSize`: Context size in tokens (default: 2048)
- `temperature`: Sampling temperature (default: 0.8)
## Contributing

Contributions, issues, and feature requests are welcome! Feel free to check the issues page if you want to contribute.
## License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.