### Tim Genewein

Information-optimal hierarchies for inference and decision-making

Bosch Center for Artificial Intelligence
Stuttgart - Renningen, Germany


# Paper presented @ NIPS: A Nonparametric Conjugate Prior Distribution for the Maximizing Argument of a Noisy Function

Ortega, P.A., Grau-Moya, J., Genewein, T., Balduzzi, D., Braun, D.A. (2012) A Nonparametric Conjugate Prior Distribution for the Maximizing Argument of a Noisy Function. Neural Information Processing Systems (NIPS) 2012

Our paper on a nonparametric conjugate prior distribution for the maximizing argument of a noisy function was accepted and presented at NIPS 2012 in Lake Tahoe.

Many approaches to finding the maximizing argument of a noisy function use a parametric or nonparametric model of the underlying (noise-free) function. Observations are then used to update the parameters of the model, commonly via Bayesian inference. The model of the underlying function is then used to select new test points that gather more information: the simplest strategy is to take the maximum of the modeled function and evaluate the true (noisy) function at that argument. More sophisticated implementations add some form of exploration, e.g. $\epsilon$-greedy selection, or picking points where uncertainty is high, as in UCB or maximum expected information gain.
In our paper we do not attempt to model the underlying function, but instead model our belief over the location of the maximum (the maximizing argument) directly. This belief is modeled in a nonparametric fashion and updated in light of new observations in a Bayesian way, leading to a nonparametric conjugate prior over the "location" of the maximum of the noisy function.
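To make the standard model-based approach concrete, here is a minimal sketch of the kind of loop described above, under deliberately crude assumptions that are not from the paper: the domain is discretized, each grid point carries an independent Gaussian belief over the noise-free function value (conjugate Gaussian updates with known noise variance), and the next test point is chosen by a UCB acquisition rule. The function `noisy_f` and all parameter values are hypothetical.

```python
import math
import random

random.seed(0)

# Hypothetical noisy function whose maximizer we want to find (true argmax at x = 2).
def noisy_f(x):
    return -(x - 2.0) ** 2 + random.gauss(0.0, 0.5)

# Discretized domain; each grid point gets an independent Gaussian belief
# over the (noise-free) function value -- a deliberately crude surrogate model.
grid = [i * 0.25 for i in range(-8, 25)]        # x in [-2, 6]
prior_var, noise_var = 10.0, 0.25
mean = {x: 0.0 for x in grid}
var = {x: prior_var for x in grid}

def ucb(x, beta=2.0):
    # Upper confidence bound: favours points with high mean or high uncertainty.
    return mean[x] + beta * math.sqrt(var[x])

for _ in range(200):
    x = max(grid, key=ucb)                      # pick the most promising test point
    y = noisy_f(x)
    # Conjugate Gaussian update of the belief at x.
    precision = 1.0 / var[x] + 1.0 / noise_var
    mean[x] = (mean[x] / var[x] + y / noise_var) / precision
    var[x] = 1.0 / precision

best = max(grid, key=lambda x: mean[x])
print(best)  # should land near the true maximizer x = 2
```

Note that this loop spends most of its effort estimating the whole function well, even though only the location of its maximum is of interest, which is exactly the inefficiency the paper's direct approach sidesteps.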
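For intuition about maintaining a belief over the maximizing argument itself, here is a toy discrete analogue. It is emphatically not the paper's nonparametric conjugate prior: it assumes a known parametric shape $f(x) = -(x - x^*)^2$ with Gaussian noise, and performs plain discrete Bayesian updating of a belief over the unknown maximizer $x^*$. All names and parameter values are hypothetical.

```python
import math
import random

random.seed(1)

# Toy "belief over the maximizer" illustration (NOT the paper's nonparametric
# prior): assume f(x) = -(x - x_star)^2 with known noise, unknown x_star,
# and do discrete Bayes over candidate maximizer locations directly.
true_x_star, noise_sd = 2.0, 0.5

def noisy_f(x):
    return -(x - true_x_star) ** 2 + random.gauss(0.0, noise_sd)

candidates = [i * 0.25 for i in range(-8, 25)]   # possible maximizer locations
log_belief = {x: 0.0 for x in candidates}        # uniform prior over x_star

for _ in range(50):
    x = random.choice(candidates)                # query points at random
    y = noisy_f(x)
    for x_star in candidates:
        # Log-likelihood of observing y at x if x_star were the true maximizer.
        pred = -(x - x_star) ** 2
        log_belief[x_star] += -0.5 * ((y - pred) / noise_sd) ** 2

map_x_star = max(candidates, key=lambda x: log_belief[x])
print(map_x_star)  # maximum a posteriori estimate of the maximizer
```

The point of the contrast: here every observation updates the belief over $x^*$ directly, rather than a model of the function, but the rigid parametric shape assumption is what the paper's nonparametric construction removes.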

A very nice and more detailed explanation by Pedro Ortega (including MATLAB code) can be found on adaptiveagents.org.