Boax: A Bayesian Optimization library for JAX.


Overview | Installation | Getting Started | Documentation

Boax is currently in early alpha and under active development!

Overview

Boax is a composable library of core components for Bayesian Optimization that is designed for flexibility.

It comes with high-level interfaces for:

  • Experiments (boax.experiments):
    • Bayesian Optimization Setups
    • Bandit Optimization Setups
    • Search Spaces
  • Benchmarks (boax.benchmark):
    • Benchmark Functions

And with low-level interfaces for:

  • Core capabilities (boax.core):
    • Common Distributions
    • Monte-Carlo Samplers
  • Fitting a surrogate model to data (boax.core.prediction):
    • Model Functions
    • Objective Functions
  • Constructing and optimizing acquisition functions (boax.core.optimization):
    • Acquisition Functions
    • Optimizer Functions
    • Policy Functions
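
Conceptually, these low-level modules decompose the classic Bayesian optimization loop: fit a surrogate model to the observations collected so far, wrap its posterior in an acquisition function, and optimize that acquisition to pick the next point to evaluate. The sketch below illustrates that loop in plain JAX with toy stand-ins (a nearest-neighbour "surrogate" and an upper-confidence-bound rule); none of these names are Boax's actual API.

  import jax.numpy as jnp
  from jax import random, vmap

  def objective(x):
    # Toy black-box function to maximize; its optimum is at x = 0.3.
    return -(x - 0.3) ** 2

  def fit_surrogate(xs, ys):
    # Stand-in "surrogate": nearest-neighbour mean, with uncertainty
    # that grows with the distance to the closest observation.
    def posterior(x):
      dists = jnp.abs(xs - x)
      return ys[jnp.argmin(dists)], jnp.min(dists) + 1e-3
    return posterior

  def upper_confidence_bound(posterior, beta=2.0):
    # Acquisition function trading off exploitation (high mean)
    # against exploration (high uncertainty).
    def acquisition(x):
      mean, stddev = posterior(x)
      return mean + beta * stddev
    return acquisition

  # Start from a few random observations, then iterate the loop:
  # fit surrogate -> build acquisition -> optimize it -> evaluate.
  xs = random.uniform(random.PRNGKey(0), (4,))
  ys = objective(xs)

  for _ in range(10):
    acquisition = upper_confidence_bound(fit_surrogate(xs, ys))
    candidates = jnp.linspace(0.0, 1.0, 256)
    next_x = candidates[jnp.argmax(vmap(acquisition)(candidates))]
    xs = jnp.append(xs, next_x)
    ys = jnp.append(ys, objective(next_x))

  print(xs[jnp.argmax(ys)])  # best observed x, close to 0.3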

Installation

You can install the latest released version of Boax from PyPI via:

pip install boax

or you can install the latest development version from GitHub:

pip install git+https://github.com/Lando-L/boax.git
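
To verify the installation, you can import the package and print its version. A __version__ attribute is an assumption here (the citation note below suggests it is populated from boax/version.py), so adapt as needed:

  import boax

  print(boax.__version__)  # assumption: Boax exposes __version__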

Basic Usage

Here is a basic example of using Boax for hyperparameter tuning. For more details, check out the docs.

  1. Setting up a classification task:
  from sklearn.datasets import load_iris
  from sklearn.model_selection import train_test_split
  from sklearn.preprocessing import StandardScaler
  from sklearn.svm import SVC

  iris = load_iris()
  X = iris.data
  y = iris.target

  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
  scaler = StandardScaler()
  X_train = scaler.fit_transform(X_train)
  X_test = scaler.transform(X_test)

  def evaluate(C, gamma):
    svc = SVC(C=C, gamma=gamma, kernel='rbf')
    svc.fit(X_train, y_train)
    return svc.score(X_test, y_test)
  2. Setting up a Bayesian optimization experiment:
  from boax.experiments import optimization
  from jax import config
  config.update("jax_enable_x64", True)

  experiment = optimization(
    parameters=[
      {
        'name': 'C',
        'type': 'log_range',
        'bounds': [1, 1_000],
      },
      {
        'name': 'gamma',
        'type': 'log_range',
        'bounds': [1e-4, 1e-3],
      },
    ],
    batch_size=4,
  )
  3. Running the trial for 25 steps:
  step, results = None, []

  for _ in range(25):
    # Retrieve next parameterizations to evaluate
    step, parameterizations = experiment.next(step, results)
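    # Each parameterization is a dict keyed by the parameter names
    # defined above (here 'C' and 'gamma'), which is why it can be
    # unpacked directly via evaluate(**parameterization) below.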

    # Evaluate parameterizations
    evaluations = [
      evaluate(**parameterization)
      for parameterization in parameterizations
    ]
    
    # Pair each parameterization with its score for the next iteration
    results = list(zip(parameterizations, evaluations))

  # Predicted best
  experiment.best(step)
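
As optional follow-ups, you can score a hand-picked configuration as a baseline and print the predicted best. The values passed to evaluate below are arbitrary picks from within the search bounds, and since this README does not document the structure returned by best, printing it is a simple way to inspect the result:

  # Baseline: score one hand-picked configuration for comparison
  print(evaluate(C=1.0, gamma=1e-3))

  # Print the predicted best without assuming its return structure
  print(experiment.best(step))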

Citing Boax

To cite Boax, please use the following BibTeX entry:

@software{boax2023github,
  author = {Lando L{\"o}per},
  title = {{B}oax: A Bayesian Optimization library for {JAX}},
  url = {https://github.com/Lando-L/boax},
  version = {0.1.4},
  year = {2023},
}

In the above BibTeX entry, the version number is intended to be that from boax/version.py, and the year corresponds to the project's open-source release.