Optimization in Banach Spaces (Paperback, 1st ed. 2022)
Loot Price: R1,423
Discovery Miles 14 230
Series: SpringerBriefs in Optimization
Expected to ship within 10 - 15 working days
The book is devoted to the study of constrained minimization problems on closed convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in finite-dimensional spaces and in infinite-dimensional Hilbert spaces. When the space is a Hilbert space there are many algorithms for solving optimization problems, including the gradient projection algorithm, which is one of the most important tools in optimization theory, nonlinear analysis and their applications.

An optimization problem is described by an objective function and a set of feasible points. For the gradient projection algorithm, each iteration consists of two steps: the first is the calculation of a gradient of the objective function, while in the second we compute a projection onto the feasible set. Each of these two steps carries a computational error. In our recent research we show that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant.
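To make the two-step iteration concrete, here is a minimal Python sketch of an inexact gradient projection method in a Euclidean (hence Hilbert) setting; the quadratic objective, the unit-ball feasible set, the step size and the random error model are illustrative assumptions, not taken from the book.

import numpy as np

def project_onto_unit_ball(x):
    # Metric projection onto the closed unit ball {y : ||y|| <= 1}.
    norm = np.linalg.norm(x)
    return x if norm <= 1.0 else x / norm

def small_error(shape, delta, rng):
    # A random vector of norm at most delta, standing in for an
    # unspecified computational error in a single step.
    e = rng.standard_normal(shape)
    return delta * rng.uniform() * e / np.linalg.norm(e)

def inexact_gradient_projection(grad, project, x0, step=0.1,
                                n_iters=200, delta=1e-3, seed=0):
    # Each iteration: (1) an approximate gradient evaluation, then
    # (2) an approximate projection onto the feasible set, with the
    # error of each step bounded in norm by delta.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad(x) + small_error(x.shape, delta, rng)                 # step 1
        x = project(x - step * g) + small_error(x.shape, delta, rng)   # step 2
    return x

# Illustration: minimize f(x) = ||x - c||^2 over the unit ball; the exact
# constrained minimizer is c / ||c|| = (1, 0).
c = np.array([2.0, 0.0])
x_approx = inexact_gradient_projection(lambda x: 2.0 * (x - c),
                                       project_onto_unit_ball,
                                       x0=np.zeros(2))
print(x_approx)  # stays close to (1, 0) because delta is small

Setting delta = 0 recovers the exact gradient projection method; the questions studied in the book concern how such iterations behave when the errors are merely small, and in general Banach spaces where the Hilbert-space structure used above is no longer available.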
It should be mentioned that the properties of a Hilbert space play an important role here. When we consider an optimization problem in a general Banach space, the situation becomes more difficult and less well understood. On the other hand, such problems arise in approximation theory.

The book is of interest to mathematicians working in optimization. It can also be useful in preparatory courses for graduate students. The main feature of the book, which appeals specifically to this audience, is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. The book is also of interest to experts in applications of optimization to approximation theory.

The goal of this book is to obtain a good approximate solution of the constrained optimization problem in a general Banach space in the presence of computational errors. It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant.
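In symbols, and only as an illustrative formalization whose notation need not match the book's, the inexact iteration and the kind of guarantee just described can be summarized (here in LaTeX) as

\[ \bigl\| x_{t+1} - P_C\bigl( x_t - a \nabla f(x_t) \bigr) \bigr\| \le \delta, \qquad t = 0, 1, 2, \ldots \]

where P_C is the projection onto the feasible set C, a > 0 is a step size, and delta > 0 bounds the combined computational error of the gradient and projection steps; roughly, the smaller delta is, the better the approximate solution that can be guaranteed.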
The book consists of four chapters. In the first chapter we discuss several algorithms which are studied in the book and prove a convergence result for an unconstrained problem, which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems. Nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems in the presence of
computational errors.