bayesian-optimizer

v1.2.0

Bayesian optimizer for multi-dimensional search space

Downloads: 6

Readme

Bayesian Optimizer is a JavaScript library for optimizing black-box functions using Bayesian optimization with Gaussian processes.

Features

  • Supports multi-dimensional input spaces
  • Adjustable optimization parameters (exploration, number of candidates, etc.)
  • Gaussian process regression with the Matérn kernel
  • Expected Improvement (EI) acquisition function
  • Zero dependencies
  • TypeScript type definitions included

Installation

Install the package using npm:

npm install bayesian-optimizer

or yarn:

yarn add bayesian-optimizer

Usage

import { BayesianOptimizer } from "bayesian-optimizer";

// Define your objective function (it may be async and must return a number)
const objectiveFunction = async (params) => {
  // Example: a paraboloid with its maximum at (0, 0)
  return -(params.x ** 2 + params.y ** 2);
};

// Define the search space for the objective function
const searchSpace = {
  x: { min: -5, max: 5 },
  y: { min: -5, max: 5 },
};

// Initialize the optimizer
const optimizer = new BayesianOptimizer({
  exploration: 0.1, // Optional, default is 0.01
  numCandidates: 100, // Optional, default is 100
  kernelNu: 1.5, // Optional, default is 1.5
  kernelLengthScale: 1.0, // Optional, default is 1.0
});

// Optimize the objective function
await optimizer.optimize(objectiveFunction, searchSpace, 100);

// Get the best parameters found
const bestParams = optimizer.getBestParams();

API

BayesianOptimizer

The main class for performing Bayesian optimization.

constructor(options?: { exploration?: number; numCandidates?: number; kernelNu?: number; kernelLengthScale?: number })

Create a new instance of the BayesianOptimizer.

  • options: An optional object with the following properties:
    • exploration: The exploration parameter (xi) for the Expected Improvement acquisition function. Default is 0.01. Controls the exploration-exploitation trade-off.
    • numCandidates: The number of candidates sampled for each optimization step. Default is 100.
    • kernelNu: The smoothness parameter ν of the Matérn kernel. Default is 1.5.
    • kernelLengthScale: The length scale l of the Matérn kernel. Default is 1.0.

optimize(objectiveFunction: ObjectiveFunction, searchSpace: { [key: string]: ParameterRange }, numSteps: number): Promise&lt;void&gt;

Optimize the given objective function over the specified search space for a certain number of steps.

  • objectiveFunction: The function to optimize.
  • searchSpace: An object that defines the ranges of the parameters for the objective function.
  • numSteps: The number of steps to perform the optimization.

getBestParams(): { [key: string]: number } | null

Returns the best parameters found during the optimization, or null if no optimization has been run yet.

Implementation details

Gaussian Process and Matérn Kernel

Bayesian optimization relies on Gaussian process regression, which is a powerful technique for modeling an unknown function using a set of observed data points. In this library, we use the Matérn kernel as the covariance function for the Gaussian process. The Matérn kernel is a popular choice in Bayesian optimization due to its flexibility and ability to model various degrees of smoothness in the underlying function.

The Matérn kernel has two parameters, ν (nu) and l (length scale), which control the smoothness and scale of the function being modeled. By default, this library uses ν = 1.5 and a length scale of 1.0, matching the kernelNu and kernelLengthScale defaults listed above; both can be overridden when initializing the BayesianOptimizer.

Acquisition Function

In Bayesian optimization, the acquisition function determines which point in the search space should be evaluated next. This library uses Expected Improvement (EI), the most common choice: it balances exploration and exploitation by scoring each candidate by its expected improvement over the current best observation. High EI values indicate a higher potential for improvement, guiding the optimizer towards promising regions of the search space; the exploration option (ξ) shifts this trade-off, with larger values favoring exploration.
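As a sketch of the formula (illustrative, not the library's exact code): for maximization, with posterior mean μ and standard deviation σ at a candidate, best observed value f*, and exploration parameter ξ, EI = (μ − f* − ξ)·Φ(z) + σ·φ(z), where z = (μ − f* − ξ)/σ and Φ, φ are the standard normal CDF and PDF:

```javascript
// Illustrative Expected Improvement for maximization. mu and sigma are the
// GP posterior mean and standard deviation at a candidate point, fBest is
// the best observed value so far, xi is the exploration parameter.
function expectedImprovement(mu, sigma, fBest, xi = 0.01) {
  if (sigma === 0) return 0; // no uncertainty means no expected improvement
  const z = (mu - fBest - xi) / sigma;
  return (mu - fBest - xi) * normCdf(z) + sigma * normPdf(z);
}

function normPdf(z) {
  return Math.exp(-0.5 * z * z) / Math.sqrt(2 * Math.PI);
}

function normCdf(z) {
  return 0.5 * (1 + erf(z / Math.SQRT2));
}

// Abramowitz–Stegun rational approximation of the error function.
function erf(x) {
  const sign = x < 0 ? -1 : 1;
  const t = 1 / (1 + 0.3275911 * Math.abs(x));
  const y =
    1 -
    (((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t -
      0.284496736) * t + 0.254829592) * t) * Math.exp(-x * x);
  return sign * y;
}
```

A candidate whose posterior mean is well above the current best scores a high EI; a candidate with large σ can also score well even if its mean is lower, which is what drives exploration.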

Contributing

Contributions are welcome! Please open an issue or submit a pull request on the GitHub repository.

Possible Expansions

There are several possible expansions to this library. One is the addition of other acquisition functions, such as Probability of Improvement (PI) or Upper Confidence Bound (UCB), which are also commonly used in Bayesian optimization. Another is the inclusion of other kernel functions, which affect the smoothness and scale of the function being modeled; more kernels would increase the flexibility of the library and let users model a wider range of functions.

License

MIT License