
bee-agent-framework v0.0.33

Bee - LLM Agent Framework

Downloads: 2,508

Readme

The Bee Agent Framework makes it easy to build scalable agent-based workflows with your model of choice. Our default agent, Bee, is designed to perform robustly with Llama 3.1, and we're actively working on optimizing its performance with other popular LLMs. Our goal is to empower developers to adopt the latest open-source and proprietary models with minimal changes to their current agent implementation.

Key Features

  • 🤖 AI agents: Use our powerful Bee agent or build your own.
  • 🛠️ Tools: Use our built-in tools or create your own in JavaScript/Python.
  • 👩‍💻 Code interpreter: Run code safely in a sandbox container.
  • 💾 Memory: Multiple strategies to optimize token spend.
  • ⏸️ Serialization: Handle complex agentic workflows and easily pause/resume them without losing state (a minimal sketch follows this list).
  • 🔍 Traceability: Get full visibility of your agent’s inner workings, log all running events, and use our MLFlow integration to debug performance.
  • 🎛️ Production-level control with caching and error handling.
  • 🚧 (Coming soon) API: Configure and deploy your agents with a production-hardened API.
  • 🚧 (Coming soon) Chat UI: Serve your agent to users in a delightful GUI with built-in transparency, explainability, and user controls.
  • ... more on our Roadmap
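As an illustration of the serialization feature, here is a minimal sketch of snapshotting and restoring an agent's memory. The serialize() / fromSerialized() method names follow the framework's serializer convention but are assumptions here, not a verified API; consult the serializer module documentation for the exact calls.

import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { TokenMemory } from "bee-agent-framework/memory/tokenMemory";

const llm = new OllamaChatLLM();
const memory = new TokenMemory({ llm });

// Snapshot the memory state (method name assumed; see the serializer module).
// Awaited so the sketch works whether the call is sync or async.
const snapshot = await memory.serialize();

// ...persist `snapshot` wherever you like (file, database, message queue)...

// Later, rebuild the memory without losing state (method name assumed).
const restored = await TokenMemory.fromSerialized(snapshot);
console.log(restored instanceof TokenMemory); // true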

Getting started

[!TIP]

Would you like a fully set-up TypeScript project with Bee, Code Interpreter, and Observability? Check out our Bee Framework Starter.

Installation

npm install bee-agent-framework

or

yarn add bee-agent-framework

Example

import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { TokenMemory } from "bee-agent-framework/memory/tokenMemory";
import { DuckDuckGoSearchTool } from "bee-agent-framework/tools/search/duckDuckGoSearch";
import { OpenMeteoTool } from "bee-agent-framework/tools/weather/openMeteo";

const llm = new OllamaChatLLM(); // the default is llama3.1 (8B); the 70B model is recommended

const agent = new BeeAgent({
  llm, // for more explore 'bee-agent-framework/adapters'
  memory: new TokenMemory({ llm }), // for more explore 'bee-agent-framework/memory'
  tools: [new DuckDuckGoSearchTool(), new OpenMeteoTool()], // for more explore 'bee-agent-framework/tools'
});

const response = await agent
  .run({ prompt: "What's the current weather in Las Vegas?" })
  .observe((emitter) => {
    emitter.on("update", async ({ data, update, meta }) => {
      console.log(`Agent (${update.key}) 🤖 : `, update.value);
    });
  });

console.log(`Agent 🤖 : `, response.result.text);

To run this example, make sure you have Ollama installed and the llama3.1 model downloaded.
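If you want to follow the recommendation in the comment above and use the larger model, the Ollama adapter can typically be pointed at a specific model when constructing the LLM. The option name below (modelId) is an assumption used for illustration; check the adapter's documentation for your installed version.

import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";

// "modelId" is assumed to be the constructor option for selecting the Ollama model;
// verify the exact option name against the adapter docs. Make sure the model has
// been pulled locally first (e.g. with `ollama pull llama3.1:70b`).
const llm = new OllamaChatLLM({ modelId: "llama3.1:70b" });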

➡️ See a more advanced example.

➡️ All examples can be found in the examples directory.

➡️ To run an arbitrary example, pass its path to yarn start, e.g. yarn start examples/agents/bee.ts.

Local Installation

[!NOTE]

yarn should be installed via Corepack (tutorial)

  1. Clone the repository: git clone git@github.com:i-am-bee/bee-agent-framework.
  2. Install dependencies: yarn install.
  3. Create .env (from .env.template) and fill in any missing values.
  4. Start the agent: yarn run start:bee (it runs the ./examples/agents/bee.ts file).

📦 Modules

The source directory (src) provides numerous modules that one can use.

| Name | Description |
| ---- | ----------- |
| agents | Base classes defining the common interface for agent. |
| llms | Base classes defining the common interface for text inference (standard or chat). |
| template | Prompt Templating system based on Mustache with various improvements. |
| memory | Various types of memories to use with agent. |
| tools | Tools that an agent can use. |
| cache | Preset of different caching approaches that can be used together with tools. |
| errors | Error classes and helpers to catch errors fast. |
| adapters | Concrete implementations of given modules for different environments. |
| logger | Core component for logging all actions within the framework. |
| serializer | Core component for the ability to serialize/deserialize modules into the serialized format. |
| version | Constants representing the framework (e.g., latest version). |
| emitter | Bringing visibility to the system by emitting events. |
| internals | Modules used by other modules within the framework. |

For a more in-depth explanation, see the overview.
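The module names above correspond to subpath imports of the package. The paths below are the ones already used in the example earlier in this README; the remaining modules follow the same layout.

// Each module from the table is exposed as a subpath import. These particular
// paths appear in the example above; other modules follow the same pattern.
import { BeeAgent } from "bee-agent-framework/agents/bee/agent"; // agents
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat"; // adapters
import { TokenMemory } from "bee-agent-framework/memory/tokenMemory"; // memory
import { DuckDuckGoSearchTool } from "bee-agent-framework/tools/search/duckDuckGoSearch"; // tools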

Tutorials

🚧 Coming soon 🚧

Roadmap

  • Improvements to Bee agent and performance optimization with Llama3.1 and Granite model suites
  • API + UI (chat interface)
  • Python SDK
  • 🚧 TBD 🚧

Contribution guidelines

The Bee Agent Framework is an open-source project and we ❤️ contributions.

If you'd like to contribute to Bee, please take a look at our contribution guidelines.

Bugs

We are using GitHub Issues to manage our public bugs. We keep a close eye on this, so before filing a new issue, please check to make sure it hasn't already been logged.

Code of conduct

This project and everyone participating in it are governed by the Code of Conduct. By participating, you are expected to uphold this code. Please read the full text to understand which actions will and will not be tolerated.

Legal notice

All content in these repositories including code has been provided by IBM under the associated open source software license and IBM is under no obligation to provide enhancements, updates, or support. IBM developers produced this code as an open source project (not as an IBM product), and IBM makes no assertions as to the level of quality nor security, and will not be maintaining this code going forward.

Contributors

Special thanks to our contributors for helping us improve Bee Agent Framework.