This text provides a detailed and self-contained introduction to
the core topics of optimal control for finite-dimensional
deterministic dynamical systems. Skillfully designed to
guide the student through the development of the subject, the book
provides a rich collection of examples, exercises, illustrations,
and applications, to support comprehension of the material.
Solutions to odd-numbered exercises are included, while a complete
set of solutions is available to instructors who adopt the text for
their class. The book is adaptable to coursework for final year
undergraduates in (applied) mathematics or beginning graduate
students in engineering. Required mathematical background includes
calculus, linear algebra, a basic knowledge of differential
equations, as well as a rudimentary acquaintance with control
systems. The book developed out of lecture notes
that were tested, adapted, and expanded over many years of
teaching. Chapters 1-4 constitute the material for a basic
course on optimal control, covering successively the calculus of
variations, minimum principle, dynamic programming, and linear
quadratic control. The additional Chapter 5 provides brief
introductions to a number of selected topics related to optimal
control, meant to pique the reader’s interest. Some mathematical
background is summarized in Appendix A for easy review. Appendix B
recalls some of the basics of differential equations and also
provides a detailed treatment of Lyapunov stability theory
including LaSalle’s invariance principle, as occasionally used in
Chapters 3 and 4.