This comprehensive textbook presents a clean and coherent account of the most fundamental tools and techniques in Parameterized Algorithms, and is a self-contained guide to the area. The book covers many recent developments in the field, including the application of important separators, branching based on linear programming, Cut & Count to obtain faster algorithms on tree decompositions, algorithms based on representative families of matroids, and the use of the Strong Exponential Time Hypothesis. A number of older results are revisited and explained in a modern and didactic way.

The book provides a toolbox of algorithmic techniques. Part I is an overview of basic techniques, each chapter discussing a specific algorithmic paradigm; the material covered in this part can be used for an introductory course on fixed-parameter tractability. Part II discusses more advanced and specialized algorithmic ideas, bringing the reader to the cutting edge of current research. Part III presents complexity results and lower bounds, giving negative evidence by way of W[1]-hardness, the Exponential Time Hypothesis, and kernelization lower bounds. All the results and concepts are introduced at a level accessible to graduate students and advanced undergraduate students. Every chapter is accompanied by exercises, many with hints, while the bibliographic notes point to the original publications and related work.
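As a flavor of the basic paradigms surveyed in Part I, the sketch below (my illustration, not an excerpt from the book) shows the classic bounded search tree algorithm for Vertex Cover: every edge forces one of its endpoints into the cover, so branching on the two endpoints solves the problem in O(2^k · m) time, the textbook example of fixed-parameter tractability.

```python
# Minimal sketch (not from the book): the classic 2^k bounded search tree
# for Vertex Cover. Any edge (u, v) must have an endpoint in the cover,
# so branch on the two choices; the recursion depth is at most k, giving
# O(2^k * m) time -- exponential only in the parameter k.

def vertex_cover(edges, k):
    """Return a vertex cover of size <= k as a set, or None if none exists."""
    if not edges:
        return set()              # nothing left to cover
    if k == 0:
        return None               # edges remain but the budget is spent
    u, v = next(iter(edges))      # pick any uncovered edge
    for w in (u, v):              # branch: put w into the cover
        rest = {e for e in edges if w not in e}
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {w}
    return None

if __name__ == "__main__":
    g = {(1, 2), (1, 3), (2, 3), (3, 4)}
    print(vertex_cover(g, 2))     # a cover such as {1, 3} or {2, 3}
```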
Preprocessing, or data reduction, is a standard technique for simplifying and speeding up computation. Written by a team of experts in the field, this book introduces a rapidly developing area of preprocessing analysis known as kernelization. The authors provide an overview of basic methods and important results, with accessible explanations of the most recent advances in the area, such as meta-kernelization, representative sets, polynomial lower bounds, and lossy kernelization. The text is divided into four parts, which cover the different theoretical aspects of the area: upper bounds, meta-theorems, lower bounds, and beyond kernelization. The methods are demonstrated through extensive examples using a single data set. Written to be self-contained, the book requires only a basic background in algorithmics, and will be of use to professionals, researchers, and graduate students in theoretical computer science, optimization, combinatorics, and related fields.
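To make the idea of kernelization concrete, here is a minimal sketch (my illustration, not material from the book) of the classic Buss reduction for Vertex Cover: any vertex of degree greater than k must belong to every cover of size at most k, and once no such vertex remains, a yes-instance can have at most k² edges, so larger instances can be rejected outright.

```python
# Minimal sketch (not from the book): Buss's kernel for Vertex Cover.
# Rule 1: a vertex of degree > k lies in every cover of size <= k,
#         so put it in the cover and decrease k.
# Rule 2: when all degrees are <= k, a cover of size k touches at most
#         k^2 edges, so any instance with more edges is a no-instance.
from collections import defaultdict

def buss_kernel(edges, k):
    """Shrink (edges, k) to an equivalent instance with at most k^2 edges.

    Returns (kernel_edges, new_k, forced), where `forced` holds vertices
    that must be in any small cover, or None for a provable no-instance.
    """
    edges = set(edges)
    forced = set()
    while True:
        deg = defaultdict(int)
        for u, v in edges:
            deg[u] += 1
            deg[v] += 1
        high = next((v for v, d in deg.items() if d > k), None)
        if high is None:
            break                            # Rule 1 no longer applies
        forced.add(high)
        edges = {e for e in edges if high not in e}
        k -= 1
        if k < 0:
            return None                      # budget exhausted
    if len(edges) > k * k:
        return None                          # Rule 2: too many edges
    return edges, k, forced
```

On a yes-instance the reduced graph has at most k² edges, so any exact algorithm run afterwards pays only in terms of the parameter k, not the size of the original input.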