From Bandits to Monte-Carlo Tree Search covers several aspects of
the "optimism in the face of uncertainty" principle for large-scale
optimization problems under a finite numerical budget. The
monograph's initial motivation came from the empirical success of
the so-called "Monte-Carlo Tree Search" method popularized in
Computer Go and further extended to many other games as well as
optimization and planning problems. It lays out the theoretical
foundations of the field by characterizing the complexity of the
optimization problems and designing efficient algorithms with
performance guarantees. The main direction followed in this
monograph is to decompose a complex decision-making problem
(such as an optimization problem in a large search space) into a
sequence of elementary decisions, where each decision in the
sequence is solved using a stochastic "multi-armed bandit"
(a mathematical model for decision making in stochastic
environments). This defines a hierarchical search with the
appealing feature of starting the exploration with a quasi-uniform
sampling of the space and then focusing, at different scales, on
the most promising areas (using the optimistic principle) until
eventually performing a local search around the global optima of
the function. This monograph considers the problem of function
optimization in general search spaces (such as metric spaces,
structured spaces, trees, and graphs) as well as the problem of
planning in Markov decision processes. Its main contribution is a
class of hierarchical optimistic algorithms with different
algorithmic instantiations depending on whether the evaluations are
noisy or noiseless and whether some measure of the local
"smoothness" of the function around the global maximum is known
or unknown.
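
To make the bandit building block concrete, the sketch below (not taken from the monograph) shows the classical UCB1 rule, which embodies the optimistic principle by adding an exploration bonus to each arm's empirical mean reward. The function and parameter names, the Bernoulli reward model, and the constant c are illustrative assumptions.

```python
import math
import random

def ucb1(n_arms, pull, horizon, c=2.0):
    """Optimism in the face of uncertainty: at each step, play the arm
    with the highest upper confidence bound on its mean reward.
    `pull(arm)` is assumed to return a stochastic reward in [0, 1]."""
    counts = [0] * n_arms            # number of pulls per arm
    sums = [0.0] * n_arms            # cumulative reward per arm
    # Initialize by pulling every arm once.
    for arm in range(n_arms):
        sums[arm] += pull(arm)
        counts[arm] += 1
    for t in range(n_arms, horizon):
        # Upper confidence bound = empirical mean + exploration bonus.
        ucb = [sums[a] / counts[a] + math.sqrt(c * math.log(t) / counts[a])
               for a in range(n_arms)]
        arm = max(range(n_arms), key=lambda a: ucb[a])
        sums[arm] += pull(arm)
        counts[arm] += 1
    return [sums[a] / counts[a] for a in range(n_arms)]

# Toy usage with Bernoulli arms of unknown means (illustrative only).
means = [0.2, 0.5, 0.8]
estimates = ucb1(len(means), lambda a: float(random.random() < means[a]), 2000)
print(estimates)  # the optimistic rule concentrates pulls on the best arm
```

In the hierarchical search described above, a rule of this kind is applied at every node of the tree, so that sampling gradually concentrates on the most promising regions of the search space.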