Convex Optimization with Computational Errors (Paperback, 1st ed. 2020)
Loot Price: R2,698
Series: Springer Optimization and Its Applications, 155
Expected to ship within 10 - 15 working days
The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are important tools for solving optimization problems. The research presented here continues and further develops the author's book Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is to determine, for a known computational error, what approximate solution can be obtained and how many iterations are needed to obtain it.

The main difference between this new book and the 2016 book is that here the discussion takes into consideration the fact that, for every algorithm, an iteration consists of several steps and that the computational errors of different steps are, in general, different. This fact, which was not taken into account in the previous book, is indeed important in practice.
For example, the subgradient projection algorithm consists of two steps. The first step is the calculation of a subgradient of the objective function, while in the second we calculate a projection onto the feasible set. Each of these two steps carries a computational error, and the two errors are different in general. It may happen that the feasible set is simple and the objective function is complicated; as a result, the computational error made when calculating the projection is essentially smaller than the computational error of the subgradient calculation. Clearly, the opposite case is possible too.
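To make this two-step structure concrete, here is a minimal Python sketch, not taken from the book: it runs the subgradient projection method on an assumed objective f(x) = ||x||_1 over a Euclidean ball, injecting a separate error into each of the two steps. The helper with_error and the magnitudes delta_subgrad and delta_proj are illustrative choices, not the book's notation.

    import numpy as np

    rng = np.random.default_rng(0)

    def subgradient(x):
        # A subgradient of the assumed nonsmooth objective f(x) = ||x||_1.
        return np.sign(x)

    def project(x, radius=1.0):
        # Euclidean projection onto the ball {x : ||x||_2 <= radius}.
        norm = np.linalg.norm(x)
        return x if norm <= radius else x * (radius / norm)

    def with_error(v, delta):
        # Model a computational error of magnitude at most delta (illustrative).
        e = rng.normal(size=v.shape)
        return v + delta * e / np.linalg.norm(e)

    def subgradient_projection(x0, steps=200, alpha=0.05,
                               delta_subgrad=1e-2, delta_proj=1e-4):
        # Each iteration consists of two steps, each with its own error:
        #   step 1: approximate subgradient of the objective,
        #   step 2: approximate projection onto the feasible set.
        x = x0
        for _ in range(steps):
            g = with_error(subgradient(x), delta_subgrad)       # step 1
            x = with_error(project(x - alpha * g), delta_proj)  # step 2
        return x

    print(subgradient_projection(np.array([3.0, -2.0])))

With delta_proj much smaller than delta_subgrad, the sketch mimics the case of a simple feasible set and a complicated objective function.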
Another feature of this book is the study of a number of important algorithms that have appeared recently in the literature and are not discussed in the previous book.

This monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for the minimization of convex nonsmooth functions. We generalize the results of [NOCE] and establish results that have no prototype in [NOCE].
In Chapter 3 we analyze the mirror descent algorithm for the minimization of convex nonsmooth functions in the presence of computational errors. For this algorithm, each iteration consists of two steps: the first is the calculation of a subgradient of the objective function, while in the second we solve an auxiliary minimization problem on the set of feasible points. Each of these two steps carries a computational error. We again generalize the results of [NOCE] and establish results that have no prototype in [NOCE].
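As a rough sketch of one such iteration, assuming an entropic setup on the probability simplex, where the auxiliary minimization has the familiar closed form, and with illustrative error levels (none of this is the book's specific setting):

    import numpy as np

    rng = np.random.default_rng(1)

    def with_error(v, delta):
        # Model a computational error of magnitude at most delta (illustrative).
        e = rng.normal(size=v.shape)
        return v + delta * e / np.linalg.norm(e)

    def mirror_descent_step(x, g, alpha=0.2, delta_aux=1e-4):
        # Step 2 of the iteration: the auxiliary minimization over the
        # probability simplex; with the entropy mirror map it reduces to a
        # multiplicative update followed by normalization.
        y = x * np.exp(-alpha * g)
        y = with_error(y / y.sum(), delta_aux)
        y = np.clip(y, 1e-12, None)  # keep the inexact iterate feasible
        return y / y.sum()

    # Example: minimize <c, x> over the simplex; the exact subgradient is c,
    # and step 1 adds its own error before step 2 runs.
    c = np.array([0.3, 0.1, 0.6])
    x = np.ones(3) / 3
    for _ in range(100):
        g = with_error(c, 1e-2)        # step 1: approximate subgradient
        x = mirror_descent_step(x, g)  # step 2: approximate auxiliary problem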
In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an algorithm that is an extension of the projected gradient algorithm, used for solving linear inverse problems arising in signal/image processing.
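A minimal sketch in this spirit, assuming a toy nonnegative least-squares recovery problem; the data, step size, and error magnitude are illustrative assumptions, not the book's setting:

    import numpy as np

    rng = np.random.default_rng(2)

    A = rng.normal(size=(20, 5))         # toy forward (measurement) operator
    x_true = np.abs(rng.normal(size=5))  # nonnegative signal to recover
    b = A @ x_true                       # observed data

    def inexact_projected_gradient(steps=500, delta=1e-3):
        # Minimize (1/2)||Ax - b||^2 over {x >= 0}; every gradient
        # evaluation carries a computational error of magnitude delta.
        alpha = 1.0 / np.linalg.norm(A, 2) ** 2  # step size below 1/L
        x = np.zeros(5)
        for _ in range(steps):
            grad = A.T @ (A @ x - b)
            e = rng.normal(size=grad.shape)
            grad = grad + delta * e / np.linalg.norm(e)  # injected error
            x = np.maximum(x - alpha * grad, 0.0)        # exact projection
        return x

    print(inexact_projected_gradient())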
In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for the minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, in the presence of computational errors. All the results of this chapter have no prototype in [NOCE].

In Chapters 7-12 we analyze, in the presence of computational errors, several algorithms that were not considered in [NOCE]. Again, each step of an iteration has a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A two-player zero-sum game is considered in Chapter 8. A predicted decrease approximation-based method is used in Chapter 9 for constrained convex optimization. Chapter 10 is devoted to the minimization of quasiconvex functions. The minimization of sharp weakly convex functions is discussed in Chapter 11. Chapter 12 is devoted to a generalized projected subgradient method for the minimization of a convex function over a set which is not necessarily convex.
The book is of interest to researchers and engineers working in optimization. It can also be useful in the preparation of courses for graduate students. The main feature of the book that appeals specifically to this audience is the study of the influence of computational errors on several important optimization algorithms. The book will also interest experts in applications of optimization to engineering and economics.