Optimal Impulsive Control explores the class of impulsive dynamic optimization problems: problems that arise because many conventional optimal control problems have no solution in the classical setting, a situation highly relevant to engineering applications. The absence of a classical solution naturally invokes the so-called extension, or relaxation, of a problem and leads to the notion of a generalized solution, which encompasses the notions of generalized control and trajectory. In this book, several extensions of optimal control problems are considered within the framework of optimal impulsive control theory. In this framework, the feasible arcs are permitted to have jumps, while the conventional absolutely continuous trajectories may fail to exist. The authors draw together various types of their own results, centered on necessary conditions of optimality in the form of Pontryagin's maximum principle and on existence theorems, which together shape a substantial body of optimal impulsive control theory. At the same time, they present optimal impulsive control theory in a unified framework, introducing the different paradigmatic problems in increasing order of complexity. The rationale underlying the book is to address extensions of increasing complexity, starting from the simplest case of linear control systems and ending with the most general case of a totally nonlinear differential control system with state constraints. Since the mathematical models presented in Optimal Impulsive Control are encountered in various engineering applications, the book will be of interest to both academic researchers and practising engineers.
This book deals with optimization methods as tools for decision making and control in the presence of model uncertainty. It is oriented to the use of these tools in engineering, specifically in automatic control design with all its components: analysis of dynamical systems, identification problems, and feedback control design. Developments in Model-Based Optimization and Control takes advantage of optimization-based formulations for such classical feedback design objectives as stability, performance, and feasibility, afforded by the established body of results and methodologies constituting optimal control theory. It makes particular use of the popular formulation known as predictive control or receding-horizon optimization. The individual contributions in this volume are wide-ranging in subject matter but coordinated within a five-part structure covering material on:
* complexity and structure in model predictive control (MPC);
* collaborative MPC;
* distributed MPC;
* optimization-based analysis and design; and
* applications to bioprocesses, multivehicle systems, and energy management.
The various contributions cover a subject spectrum including inverse optimality and more modern decentralized and cooperative formulations of receding-horizon optimal control. Readers will find fourteen chapters dedicated to optimization-based tools for robustness analysis and decision making in relation to feedback mechanisms (fault detection, for example) and three chapters putting forward applications where model-based optimization brings a novel perspective.
Developments in Model-Based Optimization and Control is a selection
of contributions expanded and updated from the Optimisation-based
Control and Estimation workshops held in November 2013 and November
2014. It forms a useful resource for academic researchers and
graduate students interested in the state of the art in predictive
control. Control engineers working in model-based optimization and control, particularly in its bioprocess applications, will also find this collection instructive.
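Receding-horizon (predictive) control, the formulation emphasized throughout this volume, can be illustrated with a minimal sketch: at each time step a finite-horizon optimal control problem is solved from the current state, only the first control move is applied, and the whole computation repeats at the next state. The scalar linear-quadratic system, weights, and horizon below are hypothetical illustrations, not taken from the book:

```python
# Minimal receding-horizon (MPC) sketch for a scalar linear system
# x[k+1] = a*x[k] + b*u[k], with stage cost q*x^2 + r*u^2 over a horizon N.
# All numbers here are illustrative assumptions, not from the text.

def riccati_gains(a, b, q, r, N):
    """Backward Riccati recursion; returns the feedback gain per horizon step."""
    p = q                                      # terminal cost weight
    gains = []
    for _ in range(N):
        k = (b * p * a) / (r + b * p * b)      # optimal gain at this step
        p = q + a * p * (a - b * k)            # Riccati update
        gains.append(k)
    gains.reverse()                            # gains[0] applies at the current state
    return gains

def mpc_step(x, a, b, q, r, N):
    """Solve the horizon-N problem, but apply only the first control move."""
    k0 = riccati_gains(a, b, q, r, N)[0]
    return -k0 * x

def simulate(x0, a=1.0, b=1.0, q=1.0, r=0.1, N=10, steps=20):
    x, traj = x0, [x0]
    for _ in range(steps):
        u = mpc_step(x, a, b, q, r, N)  # re-solve at every step: receding horizon
        x = a * x + b * u
        traj.append(x)
    return traj

traj = simulate(5.0)  # traj[-1] is driven close to the origin
```

Re-solving the optimization at every step, rather than committing to the whole open-loop plan, is what turns the finite-horizon problem into a feedback law; the decentralized and cooperative formulations discussed in the book distribute this same computation across subsystems.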