Showing 1 - 10 of 10 matches in All Departments

Stochastic Controls - Hamiltonian Systems and HJB Equations (Hardcover, 1999 ed.)
Jiongmin Yong, Xun Yu Zhou
R5,001 Discovery Miles 50 010 Ships in 12 - 17 working days

The maximum principle and dynamic programming are the two most commonly used approaches for solving optimal control problems, yet they were developed independently of each other. The theme of this book is to unify these two approaches and to demonstrate that viscosity solution theory provides the framework for doing so.
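For orientation only, here is a minimal sketch of the kind of problem both approaches address; this is a standard formulation with generic placeholder symbols, not an excerpt from the book.

\[
% Standard-form stochastic optimal control problem (generic data b, \sigma, f, h);
% illustrative only, not quoted from the book.
\begin{aligned}
&\text{minimize} && J(u(\cdot)) = \mathbb{E}\Big[\int_0^T f\big(t, X(t), u(t)\big)\,dt + h\big(X(T)\big)\Big],\\
&\text{subject to} && dX(t) = b\big(t, X(t), u(t)\big)\,dt + \sigma\big(t, X(t), u(t)\big)\,dW(t), \qquad X(0) = x_0.
\end{aligned}
\]

The maximum principle gives necessary conditions along an optimal trajectory, while dynamic programming characterizes the value function of this problem through a Hamilton-Jacobi-Bellman equation; viscosity solutions give the latter a rigorous meaning even when the value function is not differentiable.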

Optimal Control Theory for Infinite Dimensional Systems (Hardcover, 1995 ed.)
Xunjing Li, Jiongmin Yong
R5,963 Discovery Miles 59 630 Ships in 12 - 17 working days

Infinite dimensional systems can be used to describe many phenomena in the real world. As is well known, heat conduction, properties of elastic-plastic materials, fluid dynamics, diffusion-reaction processes, etc., all lie within this area. The object that we are studying (temperature, displacement, concentration, velocity, etc.) is usually referred to as the state. We are interested in the case where the state satisfies proper differential equations that are derived from certain physical laws, such as Newton's law, Fourier's law, etc. The space in which the state exists is called the state space, and the equation that the state satisfies is called the state equation. By an infinite dimensional system we mean one whose corresponding state space is infinite dimensional. In particular, we are interested in the case where the state equation is one of the following types: partial differential equation, functional differential equation, integro-differential equation, or abstract evolution equation. The case in which the state equation is a stochastic differential equation is also an infinite dimensional problem, but we will not discuss such a case in this book.
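As a minimal sketch of what an infinite dimensional state space means here (a textbook example, not an excerpt from this description), controlled heat conduction can be written both as a PDE and as an abstract evolution equation; the operator A and the space L^2(\Omega) below are the usual choices, assumed for illustration.

\[
% Heat conduction with a distributed control u, rewritten as an abstract
% evolution equation on the infinite-dimensional state space L^2(\Omega).
\begin{aligned}
&\partial_t y(t,\xi) = \Delta y(t,\xi) + u(t,\xi) && \text{in } (0,T)\times\Omega,\\
&y(t,\xi) = 0 && \text{on } (0,T)\times\partial\Omega,\\
&y(0,\xi) = y_0(\xi) && \text{in } \Omega,
\end{aligned}
\qquad\Longleftrightarrow\qquad
\dot y(t) = A\,y(t) + u(t), \quad y(0) = y_0 \in L^2(\Omega),
\]

where A is the Dirichlet Laplacian; the temperature field y(t,\cdot) is the state and L^2(\Omega) is the (infinite dimensional) state space.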

Mathematical Analysis: A Concise Introduction (Hardcover)
Jiongmin Yong
R2,038 Discovery Miles 20 380 Ships in 10 - 15 working days

Mathematical analysis serves as a common foundation for many research areas of pure and applied mathematics. It is also an important and powerful tool used in many other fields of science, including physics, chemistry, biology, engineering, finance, and economics. In this book, some basic theories of analysis are presented, including metric spaces and their properties, limits of sequences, continuous functions, differentiation, the Riemann integral, uniform convergence, and series. After going through a sequence of courses on basic calculus and linear algebra, it is desirable to spend a reasonable length of time (ideally, say, one semester) building an advanced base of analysis sufficient for getting into various research fields other than analysis itself, and/or for stepping into more advanced analysis courses (such as real analysis, complex analysis, differential equations, functional analysis, and stochastic analysis, amongst others). This book is written to meet such a demand. Readers will find the treatment of the material as concise as possible while still maintaining all the necessary details.

Optimization Theory: A Concise Introduction (Hardcover)
Jiongmin Yong
R2,039 Discovery Miles 20 390 Ships in 10 - 15 working days

Mathematically, most interesting optimization problems can be formulated as optimizing some objective function subject to equality and/or inequality constraints. This book introduces classical and basic results of optimization theory, including nonlinear programming with the Lagrange multiplier method, the Karush-Kuhn-Tucker (KKT) conditions, the Fritz John method, problems with convex or quasi-convex constraints, and linear programming via the geometric method and the simplex method. A slim book such as this, which touches on the major aspects of optimization theory, will be valuable to most readers. Nonlinear programming, convex programming, and linear programming are presented in a self-contained manner. The book is suitable for a one-semester course for upper-level undergraduate students or first/second-year graduate students, and should also be useful for researchers working in interdisciplinary areas beyond optimization.
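To make the formulation above concrete (standard material, not quoted from the book), a nonlinear program and its first-order KKT conditions can be written as follows; f, g_i, h_j are generic smooth functions used purely for illustration.

\[
% Generic nonlinear program and its KKT conditions (assuming a suitable
% constraint qualification holds); illustrative only.
\begin{aligned}
&\min_{x \in \mathbb{R}^n} f(x) \quad \text{s.t.} \quad g_i(x) \le 0,\ i = 1,\dots,m, \qquad h_j(x) = 0,\ j = 1,\dots,p;\\[2pt]
&\exists\, \lambda_i \ge 0,\ \mu_j:\quad
\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) + \sum_{j=1}^{p} \mu_j \nabla h_j(x^*) = 0,
\qquad \lambda_i\, g_i(x^*) = 0 \ \ \text{for all } i.
\end{aligned}
\]

The Lagrange multiplier method corresponds to the case with equality constraints only, and linear programming is the case where f, g_i, and h_j are all affine.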

Differential Games: A Concise Introduction (Hardcover)
Jiongmin Yong
R2,769 Discovery Miles 27 690 Ships in 12 - 17 working days

This slim volume presents the most basic results on deterministic two-person differential games. The presentation begins with the optimization of a single function, followed by a basic theory of two-person games. For dynamic situations, the author first recalls optimal control theory, treated here as single-person differential games. A systematic theory of two-person differential games is then concisely presented, including evasion and pursuit problems, zero-sum problems, and LQ differential games. The book is intended to be self-contained, assuming only a basic knowledge of calculus, linear algebra, and elementary ordinary differential equations. It is suitable for junior/senior undergraduate and graduate students in applied mathematics and related majors who are interested in differential games; researchers in related areas, such as engineering and the social sciences, will also find the book useful.
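For orientation (a standard formulation, not text from the book), a deterministic two-person differential game of the kind treated here can be sketched as follows; in the zero-sum case one player minimizes and the other maximizes the same payoff.

\[
% Generic deterministic two-person differential game (zero-sum case:
% player 1 chooses u_1 to minimize J, player 2 chooses u_2 to maximize J).
\begin{aligned}
&\dot x(t) = f\big(t, x(t), u_1(t), u_2(t)\big), \qquad x(0) = x_0,\\
&J\big(u_1(\cdot), u_2(\cdot)\big) = \int_0^T g\big(t, x(t), u_1(t), u_2(t)\big)\,dt + h\big(x(T)\big).
\end{aligned}
\]

Dropping the second player recovers an optimal control problem, which is why control theory is recalled first as the single-player case.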

Stochastic Controls - Hamiltonian Systems and HJB Equations (Paperback, Softcover reprint of the original 1st ed. 1999)
Jiongmin Yong, Xun Yu Zhou
R4,997 Discovery Miles 49 970 Ships in 10 - 15 working days

As is well known, Pontryagin's maximum principle and Bellman's dynamic programming are the two principal and most commonly used approaches for solving stochastic optimal control problems. An interesting phenomenon one can observe from the literature is that these two approaches have been developed separately and independently. Since both methods are used to investigate the same problems, a natural question is the following: (Q) What is the relationship between the maximum principle and dynamic programming in stochastic optimal control? There was some research (prior to the 1980s) on the relationship between the two. Nevertheless, the results were usually stated in heuristic terms and proved under rather restrictive assumptions that were not satisfied in most cases. In the statement of a Pontryagin-type maximum principle there is an adjoint equation, which is an ordinary differential equation (ODE) in the (finite-dimensional) deterministic case and a stochastic differential equation (SDE) in the stochastic case. The system consisting of the adjoint equation, the original state equation, and the maximum condition is referred to as an (extended) Hamiltonian system. On the other hand, in Bellman's dynamic programming there is a partial differential equation (PDE), of first order in the (finite-dimensional) deterministic case and of second order in the stochastic case. This is known as the Hamilton-Jacobi-Bellman (HJB) equation.
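One common way to write the HJB equation mentioned above, shown here as a standard form under the usual conventions rather than as a quotation from the book, is the following.

\[
% One standard form of the HJB equation for the value function V of the problem:
% minimize E[\int_0^T f(t,X,u)\,dt + h(X(T))] subject to
% dX = b(t,X,u)\,dt + \sigma(t,X,u)\,dW(t); illustrative only.
V_t(t,x) + \inf_{u \in U}\Big\{ \tfrac12 \operatorname{tr}\!\big(\sigma\sigma^{\!\top}(t,x,u)\, V_{xx}(t,x)\big)
+ b(t,x,u)^{\!\top} V_x(t,x) + f(t,x,u) \Big\} = 0, \qquad V(T,x) = h(x).
\]

Setting \sigma \equiv 0 removes the V_{xx} term and leaves a first-order PDE, which is exactly the deterministic/stochastic distinction drawn in the description above; correspondingly, the adjoint equation of the maximum principle is an ODE in the deterministic case and a backward SDE in the stochastic case.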

Optimal Control Theory for Infinite Dimensional Systems (Paperback, Softcover reprint of the original 1st ed. 1995)
Xunjing Li, Jiongmin Yong
R6,196 Discovery Miles 61 960 Ships in 10 - 15 working days

Infinite dimensional systems can be used to describe many phenomena in the real world. As is well known, heat conduction, properties of elastic-plastic materials, fluid dynamics, diffusion-reaction processes, etc., all lie within this area. The object that we are studying (temperature, displacement, concentration, velocity, etc.) is usually referred to as the state. We are interested in the case where the state satisfies proper differential equations that are derived from certain physical laws, such as Newton's law, Fourier's law, etc. The space in which the state exists is called the state space, and the equation that the state satisfies is called the state equation. By an infinite dimensional system we mean one whose corresponding state space is infinite dimensional. In particular, we are interested in the case where the state equation is one of the following types: partial differential equation, functional differential equation, integro-differential equation, or abstract evolution equation. The case in which the state equation is a stochastic differential equation is also an infinite dimensional problem, but we will not discuss such a case in this book.

Control Theory of Distributed Parameter Systems and Applications - Proceedings of the IFIP WG 7.2 Working Conference, Shanghai, China, May 6-9, 1990 (Paperback)
Xunjing Li, Jiongmin Yong
R1,472 Discovery Miles 14 720 Ships in 10 - 15 working days

The IFIP-TC7, WG 7.2 Conference on Control Theory of Distributed Parameter Systems and Applications was held at Fudan University, Shanghai, China, May 6-9, 1990. The papers presented cover a wide variety of topics, e.g. the theory of identification, optimal control, stabilization, controllability, and stochastic control, as well as applications to heat exchangers, elastic structures, nuclear reactors, meteorology, etc.

Stochastic Linear-Quadratic Optimal Control Theory: Differential Games and Mean-Field Problems (Paperback, 1st ed. 2020)
Jingrui Sun, Jiongmin Yong
R1,829 Discovery Miles 18 290 Ships in 10 - 15 working days

This book gathers the most essential results, including recent ones, on linear-quadratic optimal control problems, which represent an important aspect of stochastic control. It presents results for two-player differential games and mean-field optimal control problems in the context of finite and infinite horizon problems, and discusses a number of new and interesting issues. Further, the book identifies, for the first time, the interconnections between the existence of open-loop and closed-loop Nash equilibria, solvability of the optimality system, and solvability of the associated Riccati equation, and also explores the open-loop solvability of mean-field linear-quadratic optimal control problems. Although the content is largely self-contained, readers should have a basic grasp of linear algebra, functional analysis, and stochastic ordinary differential equations. The book is mainly intended for senior undergraduate and graduate students majoring in applied mathematics who are interested in stochastic control theory. However, it will also appeal to researchers in other related areas, such as engineering, management, finance/economics, and the social sciences.

Stochastic Linear-Quadratic Optimal Control Theory: Open-Loop and Closed-Loop Solutions (Paperback, 1st ed. 2020)
Jingrui Sun, Jiongmin Yong
R1,829 Discovery Miles 18 290 Ships in 10 - 15 working days

This book gathers the most essential results, including recent ones, on linear-quadratic optimal control problems, which represent an important aspect of stochastic control. It presents the results in the context of finite and infinite horizon problems, and discusses a number of new and interesting issues. Further, it precisely identifies, for the first time, the interconnections between three well-known, relevant issues: the existence of optimal controls, solvability of the optimality system, and solvability of the associated Riccati equation. Although the content is largely self-contained, readers should have a basic grasp of linear algebra, functional analysis, and stochastic ordinary differential equations. The book is mainly intended for senior undergraduate and graduate students majoring in applied mathematics who are interested in stochastic control theory. However, it will also appeal to researchers in other related areas, such as engineering, management, finance/economics, and the social sciences.
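As a hedged sketch of the objects named above (a standard special case with constant coefficients and no cross terms, not an excerpt from the book), a finite-horizon stochastic LQ problem and its associated Riccati equation look as follows.

\[
% A standard stochastic LQ special case and its Riccati equation; the
% coefficients A, B, C, D, Q, R, G are constant matrices, illustrative only.
\begin{aligned}
&dX = (AX + Bu)\,dt + (CX + Du)\,dW, \qquad
J(u(\cdot)) = \mathbb{E}\Big[\int_0^T \big(X^{\!\top} Q X + u^{\!\top} R u\big)\,dt + X(T)^{\!\top} G X(T)\Big],\\
&-\dot P = A^{\!\top} P + P A + C^{\!\top} P C + Q
- \big(P B + C^{\!\top} P D\big)\big(R + D^{\!\top} P D\big)^{-1}\big(B^{\!\top} P + D^{\!\top} P C\big),
\qquad P(T) = G.
\end{aligned}
\]

When the Riccati equation is solvable with R + D^{\top} P D positive definite, the closed-loop optimal control is the linear feedback u = -(R + D^{\top} P D)^{-1}(B^{\top} P + D^{\top} P C)\,X; the open-loop versus closed-loop distinction in the title concerns precisely when such representations exist.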
