Flower Intelligence

Flower Intelligence: Open-Source On-Device AI with optional Confidential Remote Compute.
Check out the full documentation here, and the project website here.
Install
To install via NPM, run:
npm i @flwr/flwr
Alternatively, you can use it in vanilla JS without a bundler, via a CDN or static hosting. For example, with ES Modules you can import the library like this:
<script type="module">
import { FlowerIntelligence } from 'https://cdn.jsdelivr.net/npm/@flwr/flwr';
</script>
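The CDN build exposes the same API as the installed package, so the browser flow mirrors the Node example in the next section. The snippet below is only a sketch: the `<pre id="output">` element and the prompt are illustrative assumptions; only `FlowerIntelligence.instance` and `fi.chat` come from this README.

```html
<pre id="output">Loading…</pre>
<script type="module">
  import { FlowerIntelligence } from 'https://cdn.jsdelivr.net/npm/@flwr/flwr';

  const fi = FlowerIntelligence.instance;

  // Same chat call as in the example below.
  const response = await fi.chat({
    model: 'meta/llama3.2-1b/instruct-fp16',
    messages: [{ role: 'user', content: 'How are you?' }],
  });

  // Write the reply (or the failure) into the hypothetical #output element.
  document.getElementById('output').textContent = response.ok
    ? response.message.content
    : `${response.failure.code}: ${response.failure.description}`;
</script>
```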
Hello, Flower Intelligence!
import { FlowerIntelligence } from '@flwr/flwr';
const fi = FlowerIntelligence.instance;
async function main() {
const response = await fi.chat({
model: 'meta/llama3.2-1b/instruct-fp16',
messages: [
{ role: 'system', content: 'You are a helpful assistant' },
{ role: 'user', content: 'How are you?' },
],
});
if (!response.ok) {
console.error(`${response.failure.code}: ${response.failure.description}`);
} else {
console.log(response.message.content);
}
}
await main();
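Because every call resolves to either a message or a failure, it can be convenient to wrap that check once and reuse it. The helper below is only a sketch (the `ask` function and its prompts are not part of the library); it reuses nothing beyond `fi.chat`, `response.ok`, `response.message`, and `response.failure` from the example above.

```ts
import { FlowerIntelligence } from '@flwr/flwr';

const fi = FlowerIntelligence.instance;

// Hypothetical helper: sends a single user prompt and returns the
// assistant's reply, or throws with the failure code and description.
async function ask(prompt: string) {
  const response = await fi.chat({
    model: 'meta/llama3.2-1b/instruct-fp16',
    messages: [
      { role: 'system', content: 'You are a helpful assistant' },
      { role: 'user', content: prompt },
    ],
  });
  if (!response.ok) {
    throw new Error(`${response.failure.code}: ${response.failure.description}`);
  }
  return response.message.content;
}

console.log(await ask('How are you?'));
```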
Demo
You can also quickly try out the library with the examples/hello-world-ts example, a minimal TypeScript project:
git clone --depth=1 https://github.com/adap/flower.git _tmp && \
mv _tmp/intelligence/ts/examples/hello-world-ts . && \
rm -rf _tmp && \
cd hello-world-ts
npm i
npm run build
npm run start
You'll find a list of other examples here.