
Convex Optimization with Computational Errors (Paperback, 1st ed. 2020)

Alexander J Zaslavski

Series: Springer Optimization and Its Applications, 155

Loot Price R2,698 Discovery Miles 26 980 | Repayment Terms: R253 pm x 12*


Expected to ship within 10 - 15 working days

The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are known as important tools for solving optimization problems. The research presented here continues and further develops the author's book Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, for a known computational error, to determine what approximate solution can be obtained and how many iterates are needed to obtain it.

The main difference between this new book and the 2016 book is that the present book takes into account the fact that, for every algorithm, an iteration consists of several steps and that the computational errors of different steps are, in general, different. This fact, which was not taken into account in the previous book, is important in practice. For example, the subgradient projection algorithm consists of two steps: the first is the calculation of a subgradient of the objective function, and the second is the calculation of a projection onto the feasible set. Each of these two steps has its own computational error, and the two errors are in general different. It may happen that the feasible set is simple while the objective function is complicated; in that case the computational error made when calculating the projection is essentially smaller than the error made when calculating the subgradient. Clearly, the opposite case is possible too. Another feature of this book is the study of a number of important algorithms which appeared recently in the literature and which are not discussed in the previous book.

This monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for minimization of convex and nonsmooth functions; we generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 3 we analyze the mirror descent algorithm for minimization of convex and nonsmooth functions in the presence of computational errors. For this algorithm each iteration consists of two steps: the first is the calculation of a subgradient of the objective function, and the second is the solution of an auxiliary minimization problem on the set of feasible points; each of these two steps has a computational error. Again we generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an algorithm which is an extension of the projected gradient algorithm used for solving linear inverse problems arising in signal and image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, in the presence of computational errors; all the results of this chapter have no prototype in [NOCE]. In Chapters 7-12 we analyze several algorithms, not considered in [NOCE], in the presence of computational errors.
Again, each step of an iteration has a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A two-player zero-sum game is considered in Chapter 8. A method based on a predicted decrease approximation is used in Chapter 9 for constrained convex optimization. Chapter 10 is devoted to minimization of quasiconvex functions. Minimization of sharp weakly convex functions is discussed in Chapter 11. Chapter 12 is devoted to a generalized projected subgradient method for minimization of a convex function over a set which is not necessarily convex. The book is of interest to researchers and engineers working in optimization. It can also be useful in preparation courses for graduate students. The main feature of the book which appeals specifically to this audience is the study of the influence of computational errors on several important optimization algorithms. The book is also of interest to experts in applications of optimization to engineering and economics.
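To make the two-step structure described above concrete, here is a minimal Python sketch (not from the book) of a subgradient projection iteration in which the subgradient evaluation and the projection onto the feasible set are each perturbed by their own bounded error. The function and parameter names (subgradient_projection, delta_subgrad, delta_proj) and the toy problem are illustrative assumptions only.

import numpy as np


def _random_unit(rng, shape):
    # Direction of a simulated perturbation (unit-norm vector).
    v = rng.standard_normal(shape)
    return v / max(np.linalg.norm(v), 1e-12)


def subgradient_projection(x0, subgradient, project, step_sizes,
                           delta_subgrad=1e-3, delta_proj=1e-6, rng=None):
    # Inexact subgradient projection iteration:
    #   x_{t+1} ~= P_C(x_t - alpha_t * g_t),
    # where g_t is a subgradient computed with error at most delta_subgrad
    # and the projection P_C is computed with error at most delta_proj.
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for alpha in step_sizes:
        # Step 1: subgradient of the objective, perturbed by its own error.
        g = subgradient(x) + delta_subgrad * _random_unit(rng, x.shape)
        # Step 2: projection onto the feasible set, perturbed by a different error.
        x = project(x - alpha * g) + delta_proj * _random_unit(rng, x.shape)
    return x


if __name__ == "__main__":
    # Toy problem: minimize f(x) = ||x||_1 over the Euclidean unit ball.
    subgrad = lambda x: np.sign(x)                    # a subgradient of the l1 norm
    proj = lambda y: y / max(np.linalg.norm(y), 1.0)  # projection onto the unit ball
    x_approx = subgradient_projection(
        x0=np.array([2.0, -3.0, 1.0]),
        subgradient=subgrad,
        project=proj,
        step_sizes=[0.1] * 200,
        delta_subgrad=1e-3,  # larger error in the subgradient step
        delta_proj=1e-6,     # smaller error in the projection step
    )
    print("approximate minimizer:", x_approx)

In such a sketch the two error levels can be set independently, mirroring the book's point that the projection error may be much smaller, or much larger, than the subgradient error.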

General

Imprint: Springer Nature Switzerland AG
Country of origin: Switzerland
Series: Springer Optimization and Its Applications, 155
Release date: February 2021
First published: 2020
Authors: Alexander J Zaslavski
Dimensions: 235 x 155mm (L x W)
Format: Paperback
Pages: 360
Edition: 1st ed. 2020
ISBN-13: 978-3-030-37824-0
Categories: Books > Science & Mathematics > Mathematics > Numerical analysis
Books > Science & Mathematics > Mathematics > Calculus & mathematical analysis > Calculus of variations
Books > Science & Mathematics > Mathematics > Optimization > General
LSN: 3-03-037824-1
Barcode: 9783030378240


You might also like..

BI Statistical Methods - Volume I…
Peter Walley Hardcover R2,756 Discovery Miles 27 560
Transnational Cooperation - An…
Clint Peinhardt, Todd Sandler Hardcover R3,708 Discovery Miles 37 080
Continental Adventures
Charlotte Anne Eaton Paperback R566 Discovery Miles 5 660
Continental Adventures
Charlotte Anne Eaton Paperback R523 Discovery Miles 5 230
The History and Allure of Interactive…
Mark Kretzschmar, Sara Raffel Hardcover R3,115 Discovery Miles 31 150
Comparing Fairness - Relative Criteria…
Roger A. McCain Hardcover R2,952 Discovery Miles 29 520
Handbook of Experimental Game Theory
C. M. Capra, Rachel T. A. Croson, … Hardcover R6,583 Discovery Miles 65 830
Learn Game Theory
Albert Rutherford Hardcover R697 Discovery Miles 6 970
Motion Control: Multi-faceted Movement…
Yoram Baram Hardcover R3,419 Discovery Miles 34 190
Sparse Polynomial Optimization: Theory…
Victor Magron, Jie Wang Hardcover R2,346 Discovery Miles 23 460
Convex Optimization for Machine Learning
Changho Suh Hardcover R3,239 Discovery Miles 32 390
Computational Optimization Techniques…
Muhammad Sarfraz, Samsul Ariffin Abdul Karim Hardcover R3,424 Discovery Miles 34 240
