
metaprog

v2.0.3

An experimental and versatile library for exploring LLM-assisted metaprogramming.

About the Project

Metaprog is an AI metaprogramming library for TypeScript that enables you to generate, validate, and test code using LLMs at runtime. It provides a simple yet powerful builder API for describing the code you want to generate, and it automatically handles the interaction with LLMs, validation of the output, and testing of the generated code.

Key Features

  • On-demand function generation based on a function description
  • Integration with LLMs from the LangChain ecosystem
  • Automatic caching of generated functions to avoid re-generation
  • Automated test and re-prompt process if a generated function fails a user-supplied test case
  • Strong type-safety and flexible configuration for input and output schemas using Zod

Getting Started

Installation

You'll need to install the Metaprog package, as well as LangChain and the LLM-specific package you want to use. For the rest of the guide, we'll use Anthropic's Claude 3.5 Sonnet model.

npm install metaprog @langchain/core @langchain/anthropic # or any other LLM provider

# or

pnpm add metaprog @langchain/core @langchain/anthropic

# or

yarn add metaprog @langchain/core @langchain/anthropic

Basic Usage

Below is a simple (and extremely overkill) example demonstrating how to generate a function that logs "Hello world!" to the console.

import { MetaprogFunctionBuilder } from 'metaprog';
import { ChatAnthropic } from '@langchain/anthropic';

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
  apiKey: 'your_api_key_here',
});

const builder = new MetaprogFunctionBuilder('Console log "Hello world!"', {
  model,
});

const func = await builder.build();

func(); // logs "Hello world!"

How It Works

  1. You provide a textual description of what the function should do.
  2. Metaprog sends this description (and optional schemas for input or output) to an LLM.
  3. The LLM returns TypeScript code, which is then compiled and cached locally.
  4. You can immediately invoke the compiled function within your application.
  5. On subsequent runs, Metaprog checks the cache to avoid re-generation.
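As a rough sketch of the cycle above (hypothetical code, not Metaprog's actual internals — `callLLM` stands in for a real model call and here just returns fixed code), the build step is essentially cache-first:

```typescript
// Hypothetical sketch of the generate-compile-cache flow described above.
type Generated = (...args: unknown[]) => unknown;

const cache = new Map<string, Generated>();

async function callLLM(description: string): Promise<string> {
  // A real implementation would prompt the model with the description;
  // this stub returns a fixed function body for illustration.
  return 'return args.reduce((a, b) => Number(a) + Number(b), 0);';
}

async function buildFunction(description: string): Promise<Generated> {
  const cached = cache.get(description); // step 5: cache check
  if (cached) return cached;

  const source = await callLLM(description); // steps 1-2: prompt the LLM
  const fn = new Function('...args', source) as Generated; // step 3: compile
  cache.set(description, fn); // step 3: cache for subsequent runs
  return fn; // step 4: ready to invoke
}
```

The second call with the same description returns the cached function without touching the LLM, which is what makes repeated runs cheap.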

Using Schemas for Validation

To further constrain or validate your function's input and output, you can provide Zod schemas. These are used during the generation process and also strictly type the built function.

import { z } from 'zod';
import { MetaprogFunctionBuilder } from 'metaprog';
import { ChatAnthropic } from '@langchain/anthropic';

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
  apiKey: 'your_api_key_here',
});

// Define input/output Zod schemas
const inputSchema = [
  z.array(z.array(z.number())).describe('Adjacency matrix'),
  z.number().describe('Start node'),
  z.number().describe('End node'),
];
const outputSchema = z.number().describe('Shortest path length');

const pathFinderBuilder = new MetaprogFunctionBuilder(
  'Get shortest path between two nodes on a graph given an adjacency matrix, a start node, and an end node.',
  {
    model,
    inputSchema,
    outputSchema,
  },
);

const findPathLength = await pathFinderBuilder.build();
// ^? (adjacencyMatrix: number[][], startNode: number, endNode: number) => number

findPathLength(
  [
    [0, 1, 7],
    [1, 2, 3],
    [5, 3, 4],
  ],
  0,
  2,
); // 4

Advanced Usage

Automatic Testing and Regeneration

Metaprog can automatically run a test against the generated function. If the function fails, it will ask the LLM to fix the generated code and retry until it passes (up to a configurable number of retries).

import { z } from 'zod';
import { MetaprogFunctionBuilder } from 'metaprog';
import { ChatAnthropic } from '@langchain/anthropic';

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
  apiKey: 'your_api_key_here',
});

const addStrings = await new MetaprogFunctionBuilder('Add two numbers', {
  model,
})
  .test((f) => f('1', '2') === 3) // If not passed, retries generation
  .test((f) => f('-5', '15') === 10) // If not passed, retries generation
  .build();

addStrings('1', '2'); // 3 — guaranteed by the first test case above

Caching

All generated functions are cached so that the same function doesn't need to be regenerated unnecessarily on subsequent runs. This reduces both latency and LLM usage costs. By default, generated files are stored under a "generated" folder, and metadata is stored in a JSON file.
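One way to picture the cache lookup is to key each generated function by a hash of its description, so an unchanged description maps to the same generated file across runs. This is an illustration of the idea, not Metaprog's actual file layout:

```typescript
import { createHash } from 'node:crypto';

// Derive a stable, filename-safe cache key from a function description.
// The same description always produces the same key, so a later run can
// find the previously generated file instead of calling the LLM again.
function cacheKey(description: string): string {
  return createHash('sha256').update(description).digest('hex').slice(0, 16);
}

// Hypothetical on-disk layout: generated/<key>.js plus a metadata JSON entry.
const key = cacheKey('Add two numbers');
```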

Custom Cache Handler

If you want more control over how or where functions are stored, implement the CacheHandler interface:

import { CacheHandler } from 'metaprog';

class MyCustomCacheHandler implements CacheHandler {
  // Your cache handler code
}

// Then provide it to MetaprogFunctionBuilder:
import { MetaprogFunctionBuilder } from 'metaprog';
import { ChatAnthropic } from '@langchain/anthropic';

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
  apiKey: 'your_api_key_here',
});

const myCustomCache = new MyCustomCacheHandler();

const myFunc = new MetaprogFunctionBuilder(
  'Some descriptive text',
  { model },
  myCustomCache,
);

Contributing

Contributions are welcome! Feel free to submit issues or PRs on GitHub if you find bugs or want to propose new features.

License

This project is licensed under the MIT License. See the LICENSE file for details.