
@rdu/orchts v0.3.7

An orchestration framework for Large Language Models (LLMs) with TypeScript support.

OrchTS

OrchTS is an experimental TypeScript framework for orchestrating Large Language Models (LLMs), heavily inspired by OpenAI's Python-based Swarm framework. It provides a simple, lightweight approach to building LLM-powered applications with minimal boilerplate code.


Status: Experimental

This project is in an experimental state and far from complete. It's meant to serve as a foundation for exploring LLM orchestration patterns in TypeScript. While it's functional and usable, expect changes and improvements as the project evolves.

Key Features

  • Simple Architecture: Follows Swarm's philosophy of minimalism and clarity
  • TypeScript Native: Built from the ground up with TypeScript, providing full type safety and IDE support
  • Provider Agnostic: Extensible LLMProvider interface (currently implemented for OpenAI and Ollama)
  • Function Calling: Support for LLM function calling with type-safe decorators
  • Agent Communication: Flexible agent-to-agent communication with optional message history transfer
  • Minimal Boilerplate: Get started quickly with minimal configuration

Installation

npm install @rdu/orchts

For Ollama support, additionally install:

npm install ollama
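Ollama runs models locally, so the model used in the examples below has to be available on your machine first. Assuming the Ollama CLI and daemon are installed (see the Ollama documentation):

```shell
# Download the 'mistral' model used in the examples below.
# Requires the Ollama CLI and a running Ollama daemon.
ollama pull mistral
```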

Project Setup

  1. Create a new project and initialize it:
mkdir my-orchts-project
cd my-orchts-project
npm init -y
  2. Install the required dependencies. You can choose between two approaches:

Using tsx (recommended):

npm install @rdu/orchts
npm install typescript @types/node tsx --save-dev

Using ts-node:

npm install @rdu/orchts
npm install typescript @types/node ts-node --save-dev
  3. Create tsconfig.json:
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "skipLibCheck": true,
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true
  }
}
  4. Update your package.json (note that JSON allows neither comments nor duplicate keys, so include only the script that matches your setup). If using tsx (recommended):
{
  "type": "module",
  "scripts": {
    "start": "tsx your-script.ts"
  }
}
If using ts-node instead, set "start" to "NODE_OPTIONS=\"--loader ts-node/esm\" node your-script.ts".

Quick Start

Create a new file (e.g., test.ts):

import { Agent, OrchTS, type Message, OllamaProvider } from '@rdu/orchts';

const run = async () => {
    // Initialize with Ollama provider
    const client = new OrchTS({
        provider: new OllamaProvider('mistral') // or any other Ollama model
    });

    const agent = new Agent({
        name: 'SimpleAgent',
        instructions: "You are a helpful agent",
    });

    const messages: Message[] = [{ 
        role: "user", 
        content: "What's the weather like?" 
    }];

    const response = await client.run({
        agent: agent,
        messages: messages
    });

    console.log(response.messages[response.messages.length - 1].content);
};

run().catch(console.error);

For more examples, check out the examples directory.

Architecture

OrchTS is built around three main concepts:

  1. Agents: Entities that can process messages and make decisions
  2. Functions: Type-safe function calling using decorators
  3. LLMProviders: Abstraction layer for different LLM services (OpenAI, Ollama)
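To make these three concepts concrete, here is a self-contained sketch of the pattern in plain TypeScript. This is illustrative only, not OrchTS's actual internals; all names below are assumptions made for the sketch:

```typescript
// Illustrative sketch of the three OrchTS concepts: an agent, a
// provider abstraction, and a run loop tying them together.

interface Message { role: "system" | "user" | "assistant"; content: string; }

// 3. LLMProvider: abstraction over a concrete LLM backend.
interface LLMProvider {
  complete(messages: Message[]): Promise<string>;
}

// A stub provider standing in for a real OpenAI/Ollama backend:
// it just echoes the last message back.
class EchoProvider implements LLMProvider {
  async complete(messages: Message[]): Promise<string> {
    const last = messages[messages.length - 1];
    return `echo: ${last.content}`;
  }
}

// 1. Agent: carries a name and instructions; the instructions are
// prepended to the conversation as a system message.
class Agent {
  constructor(public name: string, public instructions: string) {}
}

// Run loop: combine the agent's instructions with the conversation
// and delegate to whichever provider was configured.
async function run(provider: LLMProvider, agent: Agent, messages: Message[]): Promise<string> {
  const full: Message[] = [
    { role: "system", content: agent.instructions },
    ...messages,
  ];
  return provider.complete(full);
}
```

Calling `run(new EchoProvider(), new Agent("SimpleAgent", "Be helpful"), [{ role: "user", content: "hi" }])` resolves to `"echo: hi"`; swapping `EchoProvider` for a real backend changes nothing else in the calling code, which is the point of the provider abstraction.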

Example with Function Calling (Ollama)

import { Agent, AgentFunction, FunctionBase, OrchTS, OllamaProvider } from '@rdu/orchts';

class WeatherFunctions extends FunctionBase {
    @AgentFunction("Get the current weather")
    getWeather(city: string): string {
        return `The weather in ${city} is sunny`;
    }
}

const weatherFunctions = new WeatherFunctions();
const provider = new OllamaProvider('mistral');

const agent = new Agent({
    name: 'WeatherAgent',
    instructions: "You help with weather information",
    functions: [weatherFunctions.getWeather],
});

const client = new OrchTS({ provider });

// Run the agent so the LLM can invoke getWeather via function calling
const response = await client.run({
    agent: agent,
    messages: [{ role: 'user', content: "What's the weather in Berlin?" }],
});

console.log(response.messages[response.messages.length - 1].content);

LLM Providers

OrchTS supports multiple LLM providers:

OpenAI Provider

The default provider, using OpenAI's API.
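The README shows no OpenAI setup code. The following is a sketch by analogy with the Ollama example, assuming the package exports an `OpenAIProvider` class that takes a model name; the class name and constructor arguments here are assumptions, so check the package's actual exports before use:

```typescript
import { OrchTS, OpenAIProvider } from '@rdu/orchts';

// Assumed API, mirroring the OllamaProvider usage elsewhere in this README.
// The underlying OpenAI SDK conventionally reads OPENAI_API_KEY from the
// environment, so no key is passed here.
const provider = new OpenAIProvider('gpt-4o-mini');
const client = new OrchTS({ provider });
```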

Ollama Provider

Local LLM provider using Ollama. Features include:

  • Support for various Ollama models (mistral, llama, etc.)
  • Function calling capabilities
  • Custom host configuration
  • Message transformation handling

Example configuration:

const provider = new OllamaProvider('mistral', 'http://localhost:11434');
const client = new OrchTS({ provider });

Requirements

  • Node.js ≥ 18.0.0
  • TypeScript ≥ 4.8.0
  • Experimental decorators enabled in TypeScript config
  • ESM module system

Contributing

Contributions are very welcome! This project is meant to be a collaborative effort to explore and improve LLM orchestration patterns.

Areas for Contribution

  • Additional LLM providers
  • Bug fixes and improvements
  • Documentation enhancements
  • Examples and use cases

How to Contribute

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Open a Pull Request

For bug reports and feature requests, please use GitHub Issues.

Relationship to OpenAI's Swarm

OrchTS is heavily inspired by OpenAI's Swarm framework and follows many of its design principles. However, it's reimplemented in TypeScript and includes some additional features:

  • TypeScript-first implementation with full type safety
  • Provider-agnostic design through the LLMProvider interface
  • Enhanced message history handling in agent transfers

License

MIT

Acknowledgments

  • OpenAI's Swarm framework for the inspiration and architecture patterns
  • All contributors who help improve this experimental framework

Note: This is an experimental project and should be used accordingly. While functional, it's still evolving and may undergo significant changes.