
Full Bayesian approach to hyperparameters #8

@platawiec

Treating the hyperparameters as unknown variables and using the full posterior distribution is known to give better results in Bayesian optimization (see Snoek et al., https://arxiv.org/pdf/1206.2944.pdf).

A user could opt in to this technique by replacing the MAPGPOptimizer with MCMCEstimate, or another appropriate name. GaussianProcesses provides an mcmc function to sample hyperparameters, but my understanding of the source is that it does not marginalize over the hyperparameters and compute an integrated acquisition function (which, I suppose, wouldn't make sense within the scope of GaussianProcesses).
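To make the integrated acquisition function concrete, here is a minimal sketch (in Python rather than Julia, and independent of either package) of the Monte Carlo marginalization described in the Snoek et al. paper linked above: draw hyperparameter samples from the posterior, evaluate the acquisition (here expected improvement) under the GP posterior implied by each sample, and average. The `gp_posteriors` callables are a hypothetical stand-in for whatever the `mcmc` samples would produce.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best):
    # EI for minimization: E[max(best - f(x), 0)] under N(mu, sigma^2)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def integrated_ei(x, gp_posteriors, best):
    """Integrated acquisition: average EI over hyperparameter samples.

    gp_posteriors: one callable x -> (mu, sigma) per hyperparameter
    sample theta_m drawn from p(theta | data); averaging over them is
    the Monte Carlo estimate of the marginalized acquisition.
    """
    return float(np.mean([expected_improvement(*post(x), best)
                          for post in gp_posteriors]))
```

The key design point is that the acquisition is averaged, not the GP predictions themselves: each hyperparameter sample yields its own predictive mean and variance, and the candidate point is scored by the mean acquisition value across samples.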

Thoughts on including something like this? The way I see it, the work would break down as follows:

  • Include benchmarks
  • Decide on an MCMC implementation (introduce additional dependencies, write inline code, etc.)
  • Decide on interface for various acquisition functions under MCMCEstimate
  • Implement prototype
  • Compare against tests/benchmarks
