The book deals with several closely related topics concerning approximations and perturbations of random processes and their applications to some important and fascinating classes of problems in the analysis and design of stochastic control systems and nonlinear filters. The basic mathematical methods which are used and developed are those of the theory of weak convergence. The techniques are quite powerful for getting weak convergence or functional limit theorems for broad classes of problems, and many of the techniques are new. The original need for some of the techniques which are developed here arose in connection with our study of the particular applications in this book, and related problems of approximation in control theory, but it will be clear that they have numerous applications elsewhere in weak convergence and process approximation theory. The book is a continuation of the author's long-term interest in problems of the approximation of stochastic processes and its applications to problems arising in control and communication theory and related areas. In fact, the techniques used here can be fruitfully applied to many other areas. The basic random processes of interest can be described by solutions to either (multiple time scale) Ito differential equations driven by wide-band or state-dependent wide-band noise or which are singularly perturbed. They might be controlled or not, and their state values might be fully observable or not (e.g., as in the nonlinear filtering problem).
During the Fall of 1991, the Centre de Recerca Matematica, a research institute sponsored by the Institut d'Estudis Catalans, devoted a quarter to the study of stochastic analysis. Prominent workers in this field visited the Center from all over the world for periods ranging from a few days to several weeks. To take advantage of the presence in Barcelona of so many specialists in stochastic analysis, we organized a workshop on the subject in Sant Feliu de Guixols (Girona) that provided an opportunity for them to exchange information and ideas about their current work. Topics discussed included: Analysis on the Wiener space, Anticipating Stochastic Calculus and its Applications, Correlation Inequalities, Stochastic Flows, Reflected Semimartingales, and others. This volume contains a refereed selection of contributions from some of the participants in this workshop. We are deeply indebted to the authors of the articles for these expositions of their valuable research contributions. We also would like to thank all the referees for their helpful advice in making the volume a reflection of the dynamic interchange that characterized the workshop. The success of the Seminar was due essentially to the enthusiasm and stimulating discussions of all the participants in an informal and pleasant atmosphere. To all of them our warm gratitude.
The theory of two-person, zero-sum differential games started at the beginning of the 1960s with the works of R. Isaacs in the United States and L. S. Pontryagin and his school in the former Soviet Union. Isaacs based his work on the Dynamic Programming method. He analyzed many special cases of the partial differential equation now called Hamilton-Jacobi-Isaacs (briefly HJI), trying to solve them explicitly and synthesizing optimal feedbacks from the solution. He began a study of singular surfaces that was continued mainly by J. Breakwell and P. Bernhard and led to the explicit solution of some low-dimensional but highly nontrivial games; a recent survey of this theory can be found in the book by J. Lewin entitled Differential Games (Springer, 1994). Since the early stages of the theory, several authors worked on making the notion of value of a differential game precise and providing a rigorous derivation of the HJI equation, which does not have a classical solution in most cases; we mention here the works of W. Fleming, A. Friedman (see his book, Differential Games, Wiley, 1971), P. P. Varaiya, E. Roxin, R. J. Elliott and N. J. Kalton, N. N. Krasovskii, and A. I. Subbotin (see their book Positional Differential Games, Nauka, 1974, and Springer, 1988), and L. D. Berkovitz. A major breakthrough was the introduction in the 1980s of two new notions of generalized solution for Hamilton-Jacobi equations, namely, viscosity solutions, by M. G. Crandall and P.-L. Lions.
The expected time of impact, also known as the mean first passage time (MFPT) to reach failure, is a critical metric in the management of natural disasters. The complexity of the dynamics governing natural disasters leads to stochastic behaviour. This book shows that the state transitions of many such systems translate into random walks on their respective state spaces, biased and shaped by environmental inhomogeneity. Thus the probabilistic treatment of those random walks gives valuable insights into expected behaviour. A comprehensive case study of predicting cyclone-induced flooding is followed by a discussion of generic methods that predict MFPT while addressing directional bias. This is followed by a discussion of MFPT prediction methods in systems showing network inhomogeneity. All presented methods are illustrated using real datasets of natural disasters. The book ends with a short discussion of possible future research areas, introducing the problem of predicting MFPT for bush-fire propagation.
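The blurb's central notion can be made concrete with a toy example (illustrative only, not taken from the book): for a biased nearest-neighbour walk on a finite state space with an absorbing "failure" state, first-step analysis turns the MFPT into a tridiagonal linear system that is trivial to solve.

```python
def mfpt_to_failure(n, p):
    """Mean first passage time from state 0 to the absorbing 'failure'
    state n of a biased nearest-neighbour walk on {0, ..., n}:
    up with probability p, down with probability q = 1 - p,
    reflecting at 0. First-step analysis gives a tridiagonal linear
    system, solved here with the Thomas algorithm (requires n >= 2)."""
    q = 1.0 - p
    # Unknowns m_0 .. m_{n-1}; m_n = 0 at the absorbing state.
    # Row 0 (reflecting):  m_0 - m_1 = 1
    # Rows i = 1..n-1:    -q*m_{i-1} + m_i - p*m_{i+1} = 1
    lower = [0.0] + [-q] * (n - 1)
    diag = [1.0] * n
    upper = [-1.0] + [-p] * (n - 2) + [0.0]
    rhs = [1.0] * n
    # Forward sweep (Thomas algorithm).
    c, d = [0.0] * n, [0.0] * n
    c[0], d[0] = upper[0] / diag[0], rhs[0] / diag[0]
    for i in range(1, n):
        denom = diag[i] - lower[i] * c[i - 1]
        c[i] = upper[i] / denom
        d[i] = (rhs[i] - lower[i] * d[i - 1]) / denom
    # Back substitution.
    m = [0.0] * n
    m[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        m[i] = d[i] - c[i] * m[i + 1]
    return m[0]

# Unbiased walk: MFPT from 0 to n is exactly n**2.
print(mfpt_to_failure(10, 0.5))   # -> 100.0
# A drift toward failure (environmental bias) shortens the expected time.
print(mfpt_to_failure(10, 0.6))
```

The same first-step bookkeeping carries over to walks on general graphs, where the tridiagonal system becomes a sparse one indexed by the state space.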
Fractional calculus is a rapidly growing field of research, at the interface between probability, differential equations, and mathematical physics. It is used to model anomalous diffusion, in which a cloud of particles spreads in a different manner than traditional diffusion. This monograph develops the basic theory of fractional calculus and anomalous diffusion, from the point of view of probability. In this book, we will see how fractional calculus and anomalous diffusion can be understood at a deep and intuitive level, using ideas from probability. It covers basic limit theorems for random variables and random vectors with heavy tails. This includes regular variation, triangular arrays, infinitely divisible laws, random walks, and stochastic process convergence in the Skorokhod topology. The basic ideas of fractional calculus and anomalous diffusion are closely connected with heavy tail limit theorems. Heavy tails are applied in finance, insurance, physics, geophysics, cell biology, ecology, medicine, and computer engineering. The goal of this book is to prepare graduate students in probability for research in the area of fractional calculus, anomalous diffusion, and heavy tails. Many interesting problems in this area remain open. This book will guide the motivated reader to understand the essential background needed to read and understand current research papers, and to gain the insights and techniques needed to begin making their own contributions to this rapidly growing field.
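As a minimal sketch of the regular variation the blurb mentions (not code from the book), the snippet below samples a Pareto law by inverse transform and checks that its empirical survival function tracks the power law x**(-alpha), the signature of a heavy tail:

```python
import random

def pareto_sample(alpha, u):
    """Inverse-transform sample from P(X > x) = x**(-alpha), x >= 1."""
    return u ** (-1.0 / alpha)

random.seed(42)
alpha = 1.5          # finite mean, infinite variance: the anomalous regime
n = 100_000
samples = [pareto_sample(alpha, random.random()) for _ in range(n)]

# Regular variation: the empirical survival function follows x**(-alpha),
# so extreme values occur far more often than any Gaussian model allows.
for x in (2.0, 5.0, 10.0):
    empirical = sum(s > x for s in samples) / n
    print(x, empirical, x ** (-alpha))
```

Sums of such variables, suitably normalized, converge to stable laws rather than to the normal distribution, which is the probabilistic doorway to fractional diffusion equations.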
'Et moi, ..., si j'avais su comment en revenir, je n'y serais point alle.' (Jules Verne) 'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' (Eric T. Bell) 'The series is divergent; therefore we may be able to do something with it.' (O. Heaviside) Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quote above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'etre of this series.
Recent years have seen an explosion of new mathematical results on
learning and processing in neural networks. This body of results
rests on a breadth of mathematical background that few
specialists possess. In a format intermediate between a textbook
and a collection of research articles, this book has been assembled
to present a sample of these results, and to fill in the necessary
background, in such areas as computability theory, computational
complexity theory, the theory of analog computation, stochastic
processes, dynamical systems, control theory, time-series analysis,
Bayesian analysis, regularization theory, information theory,
computational learning theory, and mathematical statistics.
In a family study of breast cancer, epidemiologists in Southern California increase the power for detecting a gene-environment interaction. In Gambia, a study helps a vaccination program reduce the incidence of Hepatitis B carriage. Archaeologists in Austria place a Bronze Age site in its true temporal location on the calendar scale. And in France, researchers map a rare disease with relatively little variation.
This book presents, in five independent parts, recent major developments in stochastic analysis: the Gross-Stroock Sobolev space over a Gaussian probability space; quasi-sure analysis; anticipating stochastic integrals as divergence operators; the principle of transfer from ordinary differential equations to stochastic differential equations; Malliavin calculus and elliptic estimates; and stochastic analysis in infinite dimensions.
This book is based on research that, to a large extent, started around 1990, when a research project on fluid flow in stochastic reservoirs was initiated by a group including some of us with the support of VISTA, a research cooperation between the Norwegian Academy of Science and Letters and Den norske stats oljeselskap A.S. (Statoil). The purpose of the project was to use stochastic partial differential equations (SPDEs) to describe the flow of fluid in a medium where some of the parameters, e.g., the permeability, were stochastic or "noisy." We soon realized that the theory of SPDEs at the time was insufficient to handle such equations. Therefore it became our aim to develop a new mathematically rigorous theory that satisfied the following conditions: 1) The theory should be physically meaningful and realistic, and the corresponding solutions should make sense physically and should be useful in applications. 2) The theory should be general enough to handle many of the interesting SPDEs that occur in reservoir theory and related areas. 3) The theory should be strong and efficient enough to allow us to solve these SPDEs explicitly, or at least provide algorithms or approximations for the solutions.
Stochastic elasticity is a fast developing field that combines nonlinear elasticity and stochastic theories in order to significantly improve model predictions by accounting for uncertainties in the mechanical responses of materials. However, in contrast to the tremendous development of computational methods for large-scale problems, which have been proposed and implemented extensively in recent years, at the fundamental level, there is very little understanding of the uncertainties in the behaviour of elastic materials under large strains. Based on the idea that every large-scale problem starts as a small-scale data problem, this book combines fundamental aspects of finite (large-strain) elasticity and probability theories, which are prerequisites for the quantification of uncertainties in the elastic responses of soft materials. The problems treated in this book are drawn from the analytical continuum mechanics literature and incorporate random variables as basic concepts along with mechanical stresses and strains. Such problems are interesting in their own right but they are also meant to inspire further thinking about how stochastic extensions can be formulated before they can be applied to more complex physical systems.
The theory of probability began in the seventeenth century with attempts to calculate the odds of winning in certain games of chance. However, it was not until the middle of the twentieth century that mathematicians developed general techniques for maximizing the chances of beating a casino or winning against an intelligent opponent. These methods of finding optimal strategies for a player are at the heart of the modern theories of stochastic control and stochastic games. There are numerous applications to engineering and the social sciences, but the liveliest intuition still comes from gambling. The now classic work How to Gamble If You Must: Inequalities for Stochastic Processes by Dubins and Savage (1965) uses gambling terminology and examples to develop an elegant, deep, and quite general theory of discrete-time stochastic control. A gambler "controls" the stochastic process of his or her successive fortunes by choosing which games to play and what bets to make.
This book focuses on the class of large-scale stochastic systems, which has dominated the attention of many academic and research groups. It discusses distributed-sensor networks, decentralized detection theory, and econometric models with integrated and decentralized policymakers.
The 1991 Seminar on Stochastic Processes was held at the University of California, Los Angeles, from March 23 through March 25, 1991. This was the eleventh in a series of annual meetings which provide researchers with the opportunity to discuss current work on stochastic processes in an informal and enjoyable atmosphere. Previous seminars were held at Northwestern University, Princeton University, the University of Florida, the University of Virginia, the University of California, San Diego, and the University of British Columbia. Following the successful format of previous years there were five invited lectures, given by M. Barlow, G. Lawler, P. March, D. Stroock, and M. Talagrand. The enthusiasm and interest of the participants created a lively and stimulating atmosphere for the seminar. Some of the topics discussed are represented by the articles in this volume. P. J. Fitzsimmons, T. M. Liggett, S. C. Port; Los Angeles, 1991. In Memory of Steven Orey (M. Cranston): The mathematical community has lost a cherished colleague with the passing of Steven Orey. This unique and thoughtful man has left those who knew him with many pleasant memories. He has also left us with important contributions in the development of the theory of Markov processes. As a friend and former student, I wish to take this chance to recall to those who know, and introduce to those who do not, a portion of his lifework.
Claims reserving is central to the insurance industry. Insurance
liabilities depend on a number of different risk factors which need
to be predicted accurately. This prediction of risk factors and
outstanding loss liabilities is at the core of pricing insurance
products, determining the profitability of an insurance company,
and assessing its financial strength (solvency).
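As a hedged illustration of predicting outstanding loss liabilities (the classical chain-ladder method on made-up numbers; the blurb does not say this is the book's own approach):

```python
def chain_ladder_reserve(triangle):
    """Classical chain-ladder sketch: estimate development factors from a
    run-off triangle of cumulative claims (None = not yet observed),
    project each accident year to ultimate, and return the total
    outstanding reserve (ultimate minus latest observed)."""
    n = len(triangle)
    # Pooled development factor from period j to j+1, using only the
    # accident years for which both periods have been observed.
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in triangle if row[j + 1] is not None)
        den = sum(row[j] for row in triangle if row[j + 1] is not None)
        factors.append(num / den)
    reserve = 0.0
    for row in triangle:
        known = [c for c in row if c is not None]
        ultimate = known[-1]
        for j in range(len(known) - 1, n - 1):
            ultimate *= factors[j]
        reserve += ultimate - known[-1]
    return reserve

# Made-up cumulative claims by accident year (rows) and development period.
triangle = [
    [100.0, 150.0, 165.0],   # oldest year, fully developed
    [110.0, 160.0, None],
    [120.0, None,  None],
]
print(round(chain_ladder_reserve(triangle), 2))   # -> 90.86
```

The deterministic projection above is the baseline; stochastic reserving methods of the kind the book treats put a probability model around these factors so that the prediction error can be quantified as well.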
The maximum principle and dynamic programming are the two most commonly used approaches to solving optimal control problems, and they have been developed independently. The theme of this book is to unify these two approaches and to demonstrate that viscosity solution theory provides the framework in which this unification can be carried out.
Stochastic Processes: General Theory starts with the fundamental existence theorem of Kolmogorov, together with several of its extensions to stochastic processes. It treats the function theoretical aspects of processes and includes an extended account of martingales and their generalizations. Various compositions of (quasi- or semi-)martingales and their integrals are given. Here the Bochner boundedness principle plays a unifying role: a unique feature of the book. Applications to higher order stochastic differential equations and their special features are presented in detail. Stochastic processes in a manifold and multiparameter stochastic analysis are also discussed. Each of the seven chapters includes complements, exercises and extensive references: many avenues of research are suggested. The book is a completely revised and enlarged version of the author's Stochastic Processes and Integration (Noordhoff, 1979). The new title reflects the content and generality of the extensive amount of new material. Audience: Suitable as a text/reference for second year graduate classes and seminars. A knowledge of real analysis, including Lebesgue integration, is a prerequisite.
The fundamental question of characterizing continuity and boundedness of Gaussian processes goes back to Kolmogorov. After contributions by R. Dudley and X. Fernique, it was solved by the author. This book provides an overview of "generic chaining," a completely natural variation on the ideas of Kolmogorov. It takes the reader from the first principles to the edge of current knowledge and to the open problems that remain in this domain.
This monograph on fast stochastic simulation deals with methods of adaptive importance sampling (IS). The concept of IS is introduced and described in detail with several numerical examples in the context of rare event simulation. Adaptive simulation and system parameter optimization to achieve specified performance criteria are described. The techniques are applied to the analysis and design of radar CFAR (constant false alarm rate) detectors. The development of robust detection algorithms using ensemble (E-CFAR) processing is described. A second application treats the performance evaluation and parameter optimization of digital communication systems that cannot be handled analytically or even by using standard numerical techniques.
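The core IS idea can be sketched in a few lines (a minimal mean-shift example for a Gaussian tail probability, illustrative only; the adaptive schemes and radar applications in the monograph are far more elaborate):

```python
import math
import random

def rare_event_prob_is(threshold, n, seed=0):
    """Importance-sampling estimate of p = P(Z > threshold), Z ~ N(0, 1).
    Samples come from the shifted proposal N(threshold, 1), so the rare
    event is hit about half the time; each hit is reweighted by the
    likelihood ratio phi(y) / phi(y - threshold) = exp(-t*y + t*t/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(threshold, 1.0)            # proposal sample
        if y > threshold:
            total += math.exp(-threshold * y + 0.5 * threshold * threshold)
    return total / n

exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))    # P(Z > 4), about 3.17e-5
est = rare_event_prob_is(4.0, 50_000)
print(est, exact)
# A naive Monte Carlo run of the same size would typically see the
# event only once or twice, giving a uselessly noisy estimate.
```

Adaptive IS, the subject of the monograph, automates the choice of the proposal (here the fixed mean shift) by tuning it from the simulation output itself.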
In the last five years or so there has been an important renaissance in the area of (mathematical) modeling, identification and (stochastic) control. It was the purpose of the Advanced Study Institute of which the present volume constitutes the proceedings to review recent developments in this area with particular emphasis on identification and filtering, and to do so in such a manner that the material is accessible to a wide variety of both embryo scientists and the various breeds of established researchers to whom identification, filtering, etc. are important (such as control engineers, time series analysts, econometricians, probabilists, mathematical geologists, and various kinds of pure and applied mathematicians; all of these were represented at the ASI). For these proceedings we have taken particular care to see to it that the material presented will be understandable for a quite diverse audience. To that end we have added a fifth tutorial section (besides the four presented at the meeting) and have also included an extensive introduction which explains in detail the main problem areas and themes of these proceedings and which outlines how the various contributions fit together to form a coherent, integrated whole. The prerequisites needed to understand the material in this volume are modest and most graduate students in e.g. mathematical systems theory, applied mathematics, econometrics or control engineering will qualify.
The articles in this volume present the state of the art in a variety of areas of discrete probability, including random walks on finite and infinite graphs, random trees, renewal sequences, Stein's method for normal approximation and Kohonen-type self-organizing maps. This volume also focuses on discrete probability and its connections with the theory of algorithms. Classical topics in discrete mathematics are represented as are expositions that condense and make readable some recent work on Markov chains, potential theory and the second moment method. This volume is suitable for mathematicians and students.
This is a companion book to Asymptotic Analysis of Random Walks: Heavy-Tailed Distributions by A.A. Borovkov and K.A. Borovkov. Its self-contained systematic exposition provides a highly useful resource for academic researchers and professionals interested in applications of probability in statistics, ruin theory, and queuing theory. The large deviation principle for random walks was first established by the author in 1967, under the restrictive condition that the distribution tails decay faster than exponentially. (A close assertion was proved by S.R.S. Varadhan in 1966, but only in a rather special case.) Since then, the principle has always been treated in the literature only under this condition. Recently, the author jointly with A.A. Mogul'skii removed this restriction, finding a natural metric for which the large deviation principle for random walks holds without any conditions. This new version is presented in the book, as well as a new approach to studying large deviations in boundary crossing problems. Many results presented in the book, obtained by the author himself or jointly with co-authors, are appearing in a monograph for the first time.
This book presents in thirteen refereed survey articles an overview of modern activity in stochastic analysis, written by leading international experts. The topics addressed include stochastic fluid dynamics and regularization by noise of deterministic dynamical systems; stochastic partial differential equations driven by Gaussian or Lévy noise, including the relationship between parabolic equations and particle systems, and wave equations in a geometric framework; Malliavin calculus and applications to stochastic numerics; stochastic integration in Banach spaces; porous media-type equations; stochastic deformations of classical mechanics and Feynman integrals and stochastic differential equations with reflection. The articles are based on short courses given at the Centre Interfacultaire Bernoulli of the École Polytechnique Fédérale de Lausanne, Switzerland, from January to June 2012. They offer a valuable resource not only for specialists, but also for other researchers and Ph.D. students in the fields of stochastic analysis and mathematical physics. Contributors: S. Albeverio M. Arnaudon V. Bally V. Barbu H. Bessaih Z. Brzezniak K. Burdzy A.B. Cruzeiro F. Flandoli A. Kohatsu-Higa S. Mazzucchi C. Mueller J. van Neerven M. Ondrejat S. Peszat M. Veraar L. Weis J.-C. Zambrini
Traditionally, randomness and determinism have been viewed as
diametrically opposed, based on the idea that causality and
determinism are merely complicated by "noise." Although recent
research has suggested that noise can play a productive role, it
still treats noise as a separate entity. This work suggests that
this need not be so. In an informal presentation, the problem is
instead traced to traditional assumptions regarding dynamical
equations and their need for unique solutions. If this
requirement is relaxed, the equations admit instability and
stochasticity evolving from the dynamics itself. This allows a
decoupling from the "burden" of the past and provides insights
into concepts such as predictability, irreversibility,
adaptability, creativity and multi-choice behaviour. The
reformulation is especially relevant for the biological and
social sciences, whose need for flexibility in the face of
environmental demands is important to understand: it suggests
that many system models are based on randomness and
nondeterminism, complicated with a little determinism, to
ultimately achieve concurrent flexibility and stability. As a
result, the statistical perception of reality is seen as a more
productive tool than classical determinism. The book addresses
scientists of all disciplines, with special emphasis on making
the ideas accessible to scientists and students not
traditionally involved in the formal mathematics of the physical
sciences. The implications may also be of interest to
specialists in the philosophy of science.