This volume of the Encyclopaedia is a survey of stochastic calculus, an increasingly important part of probability, authored by well-known experts in the field. The book addresses graduate students and researchers in probability theory and mathematical statistics, as well as physicists and engineers who need to apply stochastic methods.
Stochastic control theory is a relatively young branch of mathematics. The beginning of its intensive development falls in the late 1950s and early 1960s. During that period an extensive literature appeared on optimal stochastic control using the quadratic performance criterion (see references in Wonham [76]). At the same time, Girsanov [25] and Howard [26] took the first steps in constructing a general theory, based on Bellman's technique of dynamic programming, developed by him somewhat earlier [4]. Two types of engineering problems engendered two different parts of stochastic control theory. Problems of the first type are associated with multistep decision making in discrete time, and are treated in the theory of discrete stochastic dynamic programming. For more on this theory, see, in addition to the work of Howard and Bellman mentioned above, the books by Derman [8], Mine and Osaki [55], and Dynkin and Yushkevich [12]. Another class of engineering problems which encouraged the development of the theory of stochastic control involves time-continuous control of a dynamic system in the presence of random noise. The case where the system is described by a differential equation and the noise is modeled as a time-continuous random process is the core of the optimal control theory of diffusion processes. This book deals with this latter theory.
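For orientation, the continuous-time setting described in this blurb is usually formalized as a controlled stochastic differential equation. The display below is a standard textbook formulation in assumed notation (state x_t, control alpha_t, Wiener process w_t, running reward f, discount rate lambda), not an excerpt from the book itself:

\[
dx_t = b(x_t,\alpha_t)\,dt + \sigma(x_t,\alpha_t)\,dw_t,
\qquad
v(x) = \sup_{\alpha}\,\mathbb{E}_x \int_0^{\infty} e^{-\lambda t} f(x_t,\alpha_t)\,dt,
\]

and Bellman's dynamic programming principle leads, at least formally, to the Hamilton-Jacobi-Bellman equation for the value function v:

\[
\sup_{a}\Big[\tfrac{1}{2}\,\mathrm{tr}\big(\sigma\sigma^{\top}(x,a)\,D^{2}v(x)\big) + b(x,a)\cdot\nabla v(x) - \lambda v(x) + f(x,a)\Big] = 0.
\]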
Kolmogorov equations are second-order parabolic equations in a finite or an infinite number of variables. They are deeply connected with stochastic differential equations in finite- or infinite-dimensional spaces, and they arise in many fields, such as mathematical physics, chemistry, and mathematical finance. These equations can be studied by both probabilistic and analytic methods, using such tools as Gaussian measures, Dirichlet forms, and stochastic calculus. The following courses were delivered: N.V. Krylov presented Kolmogorov equations coming from finite-dimensional equations, giving existence, uniqueness, and regularity results. M. Röckner presented an approach to Kolmogorov equations in infinite dimensions, based on an L^p-analysis of the corresponding diffusion operators with respect to suitably chosen measures. J. Zabczyk started from classical results of L. Gross on the heat equation in infinite dimensions and discussed some recent results.
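To make the stated connection concrete, here is a minimal sketch in standard (assumed) notation rather than a quotation from the course notes: for a finite-dimensional SDE with drift b and diffusion coefficient sigma,

\[
dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t,
\]

the function u(t,x) = \mathbb{E}[\varphi(X_t) \mid X_0 = x] solves the (backward) Kolmogorov equation

\[
\frac{\partial u}{\partial t}
= \frac{1}{2}\,\mathrm{tr}\big(\sigma(x)\sigma(x)^{\top} D_x^{2}u\big)
+ b(x)\cdot\nabla_x u,
\qquad u(0,x) = \varphi(x),
\]

which is the second-order parabolic equation the blurb refers to; the infinite-dimensional versions studied in the courses replace x by an element of a Hilbert space.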