Principles and Methods for Data Science, Volume 43 in the Handbook of Statistics series, highlights new advances in the field. This updated volume presents interesting and timely topics, including: competing risks, aims and methods; data analysis and mining of microbial community dynamics; support vector machines, a robust prediction method with applications in bioinformatics; Bayesian model selection for data with high dimension; high-dimensional statistical inference, from theoretical development to data analytics; big data challenges in genomics; analysis of microarray gene expression data using information theory and a stochastic algorithm; hybrid models; Markov chain Monte Carlo methods, theory and practice; and more.
Highlighting modern computational methods, Applied Stochastic Modelling, Second Edition provides students with the practical experience of scientific computing in applied statistics through a range of interesting real-world applications. It also successfully revises standard probability and statistical theory. Along with an updated bibliography and improved figures, this edition offers numerous updates throughout. New to the Second Edition: an extended discussion of Bayesian methods, a large number of new exercises, and a new appendix on computational methods. The book covers both contemporary and classical aspects of statistics, including survival analysis, kernel density estimation, Markov chain Monte Carlo, hypothesis testing, regression, the bootstrap, and generalised linear models. Although the book can be used without reference to computational programs, the author provides the option of using powerful computational tools for stochastic modelling. All of the data sets and MATLAB and R programs found in the text, as well as lecture slides and other ancillary material, are available for download at www.crcpress.com. Continuing in the bestselling tradition of its predecessor, this textbook remains an excellent resource for teaching students how to fit stochastic models to data.
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models. This Element provides a unified framework for handling these approaches via Markov chains. The authors consider stochastic normalizing flows as a pair of Markov chains fulfilling some properties, and show how many state-of-the-art models for data generation fit into this framework. Indeed, numerical simulations show that including stochastic layers improves the expressivity of the network and allows for generating multimodal distributions from unimodal ones. The Markov chain point of view enables the coupling of deterministic layers, such as invertible neural networks, with stochastic layers, such as Metropolis-Hastings layers, Langevin layers, variational autoencoders and diffusion normalizing flows, in a mathematically sound way. The authors' framework establishes a useful mathematical tool for combining the various approaches.
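As a toy illustration of what a stochastic layer does, the sketch below appends a few Metropolis-Hastings steps to a simple deterministic affine map. It is a minimal sketch in the spirit of the framework described above, not an excerpt from the Element; the target density, step size and sample counts are invented for illustration.

```python
# Hedged sketch: a Metropolis-Hastings "stochastic layer" appended to a
# deterministic affine layer, in the spirit of stochastic normalizing flows.
# Target density, step size and sample sizes are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalised log-density of a bimodal target: mixture of N(-3,1) and N(3,1)."""
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

# Deterministic layer: an invertible affine map applied to latent noise.
z = rng.standard_normal(10_000)
x = 1.5 * z + 0.5            # unimodal output; cannot become bimodal by itself

# Stochastic layer: a few Metropolis-Hastings steps towards the target.
step = 1.0
for _ in range(50):
    proposal = x + step * rng.standard_normal(x.shape)
    log_accept = log_target(proposal) - log_target(x)
    accept = np.log(rng.uniform(size=x.shape)) < log_accept
    x = np.where(accept, proposal, x)

# The pushed-forward samples now populate both modes of the target.
print("fraction of samples in each mode:", np.mean(x < 0), np.mean(x > 0))
```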
Products of Random Variables explores the theory of products of random variables, from distributions and limit theorems, through characterizations, to applications in physics, order statistics, and number theory. It uses entirely probabilistic arguments to realize the potential of the asymptotic theory of products of independent random variables and to obtain results for dependent variables using a new Bonferroni-type argument. It systematically and comprehensively tracks the progression of research completed in the area over the last twenty years. Well-indexed and well-referenced, Products of Random Variables:
- Clarifies foundational concepts such as symmetric and limiting distributions of products
- Examines various limit theorems, from logarithmically Poisson distributions to triangular arrays
- Explores characterization theorems, detailing normal, Cauchy, and bivariate distributions
- Describes models of interactive particles
- Elucidates dual systems of interactive particles, dual systems of increasing size, and random walks
- Covers the Kubilius-Turan inequality and distributions for multiplicative functions
- Probes sequences of prime divisors and prime numbers
- Discusses Markov chains, Hilbert spaces, and quotients of random variables
- Presents income growth models and numerous other applied models tapping products of random variables
Authored by eminent scholars in the field, this volume is an important research reference for applied mathematicians, statisticians, physicists, and graduate students in these disciplines.
This book studies the large deviations for empirical measures and vector-valued additive functionals of Markov chains with general state space. Under suitable recurrence conditions, the ergodic theorem for additive functionals of a Markov chain asserts the almost sure convergence of the averages of a real or vector-valued function of the chain to the mean of the function with respect to the invariant distribution. In the case of empirical measures, the ergodic theorem states the almost sure convergence in a suitable sense to the invariant distribution. The large deviation theorems provide precise asymptotic estimates at logarithmic level of the probabilities of deviating from the preponderant behavior asserted by the ergodic theorems.
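For orientation, the two statements alluded to above can be written, in standard notation that may differ from the book's own, as the ergodic limit and the corresponding logarithmic-level deviation estimate:

```latex
% Hedged sketch of the standard statements (notation not necessarily the book's).
\[
  \frac{1}{n}\sum_{k=1}^{n} f(X_k) \xrightarrow[n\to\infty]{\text{a.s.}} \int f \, d\pi ,
\]
\[
  \mathbb{P}\!\left( \frac{1}{n}\sum_{k=1}^{n} f(X_k) \in A \right)
  \approx \exp\!\Big( -n \inf_{x \in A} I(x) \Big)
  \quad\text{at logarithmic level, for a suitable rate function } I .
\]
```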
This book presents the probabilistic methods around Hardy martingales for an audience interested in their applications to complex, harmonic, and functional analysis. Building on work of Bourgain, Garling, Jones, Maurey, Pisier, and Varopoulos, it discusses in detail those martingale spaces that reflect characteristic qualities of complex analytic functions. Its particular themes are holomorphic random variables on Wiener space and Hardy martingales on the infinite torus product, together with numerous deep applications to the geometry and classification of complex Banach spaces, e.g., the SL estimates for Doob's projection operator, the embedding of L1 into L1/H1, the isomorphic classification theorem for the polydisk algebras, and the real-variables characterization of Banach spaces with the analytic Radon-Nikodym property. Due to the inclusion of key background material on stochastic analysis and Banach space theory, it is suitable for a wide spectrum of researchers and graduate students working in classical and functional analysis.
Compound renewal processes (CRPs) are among the most ubiquitous models arising in applications of probability. At the same time, they are a natural generalization of random walks, the most well-studied classical objects in probability theory. This monograph, written for researchers and graduate students, presents the general asymptotic theory and generalizes many well-known results concerning random walks. The book contains the key limit theorems for CRPs, functional limit theorems, integro-local limit theorems, large and moderately large deviation principles for CRPs in the state space and in the space of trajectories, including large deviation principles in boundary crossing problems for CRPs, with an explicit form of the rate functionals, and an extension of the invariance principle for CRPs to the domain of moderately large and small deviations. Applications establish the key limit laws for Markov additive processes, including limit theorems in the domains of normal and large deviations.
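For orientation, a compound renewal process is usually written as a random sum; the display below uses standard notation, which may differ from the monograph's:

```latex
% Standard definition of a compound renewal process (notation may differ from the book's).
\[
  Z(t) \;=\; \sum_{i=1}^{\nu(t)} \xi_i ,
  \qquad
  \nu(t) \;=\; \max\{ k \ge 0 : \tau_1 + \cdots + \tau_k \le t \},
\]
% where (\tau_i, \xi_i) are i.i.d. pairs of inter-renewal times \tau_i > 0 and jumps \xi_i;
% a random walk is recovered when all the \tau_i are deterministic and equal to 1.
```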
Statistical inference carries great significance in model building from both theoretical and applied points of view. Its applications to engineering and economic systems, financial economics, and the biological and medical sciences have made statistical inference for stochastic processes a well-recognized and important branch of statistics and probability.
This book provides a unified approach for the study of constrained Markov decision processes with a finite state space and unbounded costs. Unlike the single controller case considered in many other books, the author considers a single controller with several objectives, such as minimizing delays and loss probabilities while maximizing throughputs. It is desirable to design a controller that minimizes one cost objective, subject to inequality constraints on the other cost objectives. This framework describes dynamic decision problems that arise frequently in many engineering fields. A thorough overview of these applications is presented in the introduction.
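In generic notation (a sketch of the standard constrained formulation, not necessarily the author's), the problem takes the following form:

```latex
% Generic constrained MDP (illustrative notation, not necessarily the author's):
% minimise one expected cost subject to bounds on the others, over policies u.
\[
  \min_{u} \; C_0(u)
  \quad\text{subject to}\quad
  C_k(u) \le V_k , \qquad k = 1, \dots, K ,
\]
% where C_k(u) is, e.g., the expected discounted or long-run average of the
% k-th immediate cost under policy u, and the V_k are prescribed bounds
% (say, caps on delay or loss probability, while C_0 penalises lost throughput).
```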
Optimal Control and Optimization of Stochastic Supply Chain Systems examines its subject in the context of a variety of uncertainties. Numerous examples with intuitive illustrations and tables are provided to demonstrate the structural characteristics of the optimal control policies in various stochastic supply chains and to show how to make use of these characteristics to construct easy-to-operate sub-optimal policies. In Part I, a general introduction to stochastic supply chain systems is provided. Analytical models for various stochastic supply chain systems are formulated and analysed in Part II. In Part III, the structural knowledge of the optimal control policies obtained in Part II is utilized to construct easy-to-operate sub-optimal control policies for various stochastic supply chain systems. Finally, Part IV discusses the optimisation of threshold-type control policies and their robustness. A key feature of the book is its tying together of the complex analytical models produced by the requirements of operational practice and the simple solutions needed for implementation. The analytical models and theoretical analysis propounded in this monograph will be of benefit to academic researchers and graduate students looking at logistics and supply chain management from standpoints in operations research or industrial, manufacturing, or control engineering. The practical tools, solutions, and qualitative insights into the ideas underlying functional supply chain systems will be of similar use to readers from more industrially based backgrounds.
The seminar on Stochastic Analysis and Mathematical Physics started in 1984 at the Catholic University of Chile in Santiago and has been an ongoing research activity. Since 1995, the group has organized international workshops as a way of promoting a broader dialogue among experts in the areas of classical and quantum stochastic analysis, mathematical physics and physics. This volume, consisting primarily of contributions to the Third International Workshop on Stochastic Analysis and Mathematical Physics (in Spanish, ANESTOC), held in Santiago, Chile, in October 1998, focuses on an analysis of quantum dynamics and related problems in probability theory. Various articles investigate quantum dynamical semigroups and new results on q-deformed oscillator algebras, while others examine the application of classical stochastic processes in quantum modeling. As in previous workshops, the topic of quantum flows and semigroups occupied an important place. In her paper, R. Carbone uses a spectral-type analysis to obtain exponential rates of convergence towards the equilibrium of a quantum dynamical semigroup in the L2 sense. The method is illustrated with a quantum extension of a classical birth and death process. Quantum extensions of classical Markov processes lead to subtle problems of domains. This is in particular illustrated by F. Fagnola, who presents a pathological example of a semigroup for which the largest *-subalgebra (of the von Neumann algebra of bounded linear operators on L2(R+, C)) contained in the domain of its infinitesimal generator is not σ-weakly dense.
Presenting statistical and stochastic methods for the analysis and design of technological systems in engineering and applied areas, this work documents developments in statistical modelling, identification, estimation and signal processing. The book covers such topics as subspace methods, stochastic realization, state space modelling, and identification and parameter estimation.
This book deals with certain important problems in Classical and Quantum Information Theory: Quantum Information Theory, A Selection of Matrix Inequalities; Stochastic Filtering Theory Applied to Electromagnetic Fields and Strings; Wigner Distributions in Quantum Mechanics; Quantization of Classical Field Theories; Statistical Signal Processing; Quantum Field Theory, Quantum Statistics, Gravity, Stochastic Fields and Information; and Problems in Information Theory. It will be very helpful for students of undergraduate and postgraduate courses in electronics, communication and signal processing. Print edition not for sale in South Asia (India, Sri Lanka, Nepal, Bangladesh, Pakistan or Bhutan).
This book presents a radically new approach to problems of evaluating and optimizing the performance of continuous-time stochastic systems. This approach is based on the use of a family of Markov processes called Piecewise-Deterministic Processes (PDPs) as a general class of stochastic system models. A PDP is a Markov process that follows deterministic trajectories between random jumps, the latter occurring either spontaneously, in a Poisson-like fashion, or when the process hits the boundary of its state space. This formulation includes an enormous variety of applied problems in engineering, operations research, management science and economics as special cases; examples include queueing systems, stochastic scheduling, inventory control, resource allocation problems, optimal planning of production or exploitation of renewable or non-renewable resources, insurance analysis, fault detection in process systems, and tracking of maneuvering targets, among many others. The first part of the book shows how these applications lead to the PDP as a system model, and the main properties of PDPs are derived. There is particular emphasis on the so-called extended generator of the process, which gives a general method for calculating expectations and distributions of system performance functions. The second half of the book is devoted to control theory for PDPs, with a view to controlling PDP models for optimal performance: characterizations are obtained of optimal strategies both for continuously-acting controllers and for control by intervention (impulse control). Throughout the book, modern methods of stochastic analysis are used, but all the necessary theory is developed from scratch and presented in a self-contained way. The book will be useful to engineers and scientists in the application areas as well as to mathematicians interested in applications of stochastic analysis.
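As a hedged illustration of the PDP mechanics described above (a toy model, not one of the book's examples), the sketch below simulates a one-dimensional process that grows deterministically, jumps at Poisson-like random times, and is also forced to jump whenever it hits an upper boundary; the rates, boundary and jump law are invented for illustration.

```python
# Hedged sketch of a toy piecewise-deterministic process (PDP):
# deterministic linear growth between jumps; jumps occur either spontaneously
# (exponential waiting times, Poisson-like) or when the state hits a boundary.
# Rates, boundary and jump law are illustrative choices, not from the book.
import numpy as np

rng = np.random.default_rng(1)

growth_rate = 1.0      # deterministic drift dx/dt between jumps
jump_rate = 0.5        # intensity of spontaneous jumps
boundary = 5.0         # hitting this level forces a jump
horizon = 100.0

t, x = 0.0, 0.0
path = [(t, x)]
while t < horizon:
    wait = rng.exponential(1.0 / jump_rate)          # time to next spontaneous jump
    time_to_boundary = (boundary - x) / growth_rate  # time to forced (boundary) jump
    dt = min(wait, time_to_boundary, horizon - t)

    t += dt
    x += growth_rate * dt          # deterministic flow between jumps
    path.append((t, x))

    if t >= horizon:
        break
    x *= rng.uniform(0.0, 1.0)     # jump: lose a random fraction of the state
    path.append((t, x))

# Long-run average of the state: a typical performance functional that the
# PDP framework computes via the extended generator.
times = np.array([p[0] for p in path])
states = np.array([p[1] for p in path])
avg = np.sum(0.5 * (states[1:] + states[:-1]) * np.diff(times)) / times[-1]
print(f"time-average of the state over [0, {horizon}]: {avg:.3f}")
```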
In pioneering work in the 1950s, S. Karlin and J. McGregor showed that probabilistic aspects of certain Markov processes can be studied by analyzing orthogonal eigenfunctions of associated operators. In the decades since, many authors have extended and deepened this surprising connection between orthogonal polynomials and stochastic processes. This book gives a comprehensive analysis of the spectral representation of the most important one-dimensional Markov processes, namely discrete-time birth-death chains, birth-death processes and diffusion processes. It brings together the main results from the extensive literature on the topic with detailed examples and applications. Also featuring an introduction to the basic theory of orthogonal polynomials and a selection of exercises at the end of each chapter, it is suitable for graduate students with a solid background in stochastic processes as well as researchers in orthogonal polynomials and special functions who want to learn about applications of their work to probability.
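For orientation, the spectral representation the book is built around takes, for a birth-death process, the familiar Karlin-McGregor form below (standard notation; hypotheses and details are in the book, and its notation may differ):

```latex
% Karlin-McGregor spectral representation for a birth-death process
% (standard form; notation may differ from the book's).
\[
  P_{ij}(t) \;=\; \pi_j \int_0^{\infty} e^{-xt}\, Q_i(x)\, Q_j(x)\, d\psi(x) ,
\]
% where the Q_i are the orthogonal polynomials associated with the process,
% \psi is the spectral measure, and the \pi_j are the potential coefficients.
% For a discrete-time birth-death chain, e^{-xt} is replaced by x^n in the
% n-step transition probabilities.
```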
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality and efficiency driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
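For reference, the Halfin-Whitt (quality-and-efficiency-driven) regime mentioned here is the many-server scaling in which staffing follows the square-root rule; the display below is the standard statement, not the book's own notation:

```latex
% Square-root staffing characterising the Halfin-Whitt regime
% (standard formulation; notation may differ from the book's).
\[
  n \;\approx\; R + \beta \sqrt{R}, \qquad R = \frac{\lambda}{\mu}, \quad \beta > 0 ,
\]
% equivalently (1 - \rho_n)\sqrt{n} \to \beta with \rho_n = \lambda/(n\mu);
% the probability of delay then converges to a limit strictly between 0 and 1.
```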
This book is intended as a text covering the central concepts and techniques of Competitive Markov Decision Processes. It is an attempt to present a rigorous treatment that combines two significant research topics: Stochastic Games and Markov Decision Processes, which have been studied extensively, and at times quite independently, by mathematicians, operations researchers, engineers, and economists. Since Markov decision processes can be viewed as a special noncompetitive case of stochastic games, we introduce the new terminology Competitive Markov Decision Processes that emphasizes the importance of the link between these two topics and of the properties of the underlying Markov processes. The book is designed to be used either in a classroom or for self-study by a mathematically mature reader. In the Introduction (Chapter 1) we outline a number of advanced undergraduate and graduate courses for which this book could usefully serve as a text. A characteristic feature of competitive Markov decision processes - and one that inspired our long-standing interest - is that they can serve as an "orchestra" containing the "instruments" of much of modern applied (and at times even pure) mathematics. They constitute a topic where the instruments of linear algebra, applied probability, mathematical programming, analysis, and even algebraic geometry can be "played" sometimes solo and sometimes in harmony to produce either beautifully simple or equally beautiful, but baroque, melodies, that is, theorems.
The main subject of this introductory book is simple random walk on the integer lattice, with special attention to the two-dimensional case. This fascinating mathematical object is the point of departure for an intuitive and richly illustrated tour of related topics at the active edge of research. It starts with three different proofs of the recurrence of the two-dimensional walk, via direct combinatorial arguments, electrical networks, and Lyapunov functions. After reviewing some relevant potential-theoretic tools, the reader is guided toward the relatively new topic of random interlacements - which can be viewed as a 'canonical soup' of nearest-neighbour loops through infinity - again with emphasis on two dimensions. On the way, readers will visit conditioned simple random walks - which are the 'noodles' in the soup - and also discover how Poisson processes of infinite objects are constructed and review the recently introduced method of soft local times. Each chapter ends with many exercises, making it suitable for courses and independent study.
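As a small empirical companion to the recurrence discussion (a sketch under invented parameters, not an exercise from the book), the following simulation counts returns of the two-dimensional simple random walk to the origin; returns keep accumulating, though only very slowly, consistent with recurrence in dimension two.

```python
# Hedged sketch: empirical returns to the origin of the 2-D simple random walk.
# Walk length and the random seed are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(2)
steps = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

n_steps = 200_000
moves = steps[rng.integers(0, 4, size=n_steps)]
positions = np.cumsum(moves, axis=0)

# Count visits to the origin after time 0.
returns = np.sum((positions[:, 0] == 0) & (positions[:, 1] == 0))
print(f"returns to the origin in {n_steps} steps: {returns}")
# Recurrence means the count grows without bound, but only logarithmically
# in the number of steps, so it stays small for any fixed walk length.
```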
This book aims to position itself between the level of elementary probability texts and advanced works on stochastic processes. The prerequisites for consulting this book are a course on elementary probability theory and statistics, and a course on advanced calculus. Numerous examples based on the theories discussed have been given, and a large number of problems, along with their answers, have also been provided. This revised edition further updates the material and references, and some new chapters have been introduced. The text has been designed particularly for advanced undergraduate, postgraduate and research-level courses in applied mathematics, statistics, operations research, computer science, different branches of engineering, telecommunications, business and management, economics and life sciences.
This unique two-volume set presents the subjects of stochastic processes, information theory, and Lie groups in a unified setting, thereby building bridges between fields that are rarely studied by the same individuals. Unlike the many excellent formal treatments available for each of these subjects individually, the emphasis in both of these volumes is on the use of stochastic, geometric, and group-theoretic concepts in the modeling of physical phenomena. Stochastic Models, Information Theory, and Lie Groups will be of interest to advanced undergraduate and graduate students, researchers, and practitioners working in applied mathematics, the physical sciences, and engineering. Extensive exercises and motivating examples make the work suitable as a textbook for use in courses that emphasize applied stochastic processes or differential geometry.
Probability theory has diverse applications in a plethora of fields, including physics, engineering, computer science, chemistry, biology and economics. This book will familiarize students with various applications of probability theory, stochastic modeling and random processes, using examples from all these disciplines and more. The reader learns via case studies and begins to recognize the sort of problems that are best tackled probabilistically. The emphasis is on conceptual understanding, the development of intuition and gaining insight, keeping technicalities to a minimum. Nevertheless, a glimpse into the depth of the topics is provided, preparing students for more specialized texts while assuming only an undergraduate-level background in mathematics. The wide range of areas covered - never before discussed together in a unified fashion - includes Markov processes and random walks, Langevin and Fokker-Planck equations, noise, the generalized central limit theorem and extreme value statistics, random matrix theory, and percolation theory.
This book provides an essential introduction to Stochastic Programming, especially intended for graduate students. The book begins by exploring a linear programming problem with random parameters, representing a decision problem under uncertainty. Several models for this problem are presented, including the main ones used in Stochastic Programming: recourse models and chance constraint models. The book not only discusses the theoretical properties of these models and algorithms for solving them, but also explains the intrinsic differences between the models. In the book's closing section, several case studies are presented, helping students apply the theory covered to practical problems. The book is based on lecture notes developed for an Econometrics and Operations Research course for master students at the University of Groningen, the Netherlands - the longest-standing Stochastic Programming course worldwide.
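In compact notation, the two model classes named above are usually written as follows; this is a generic sketch, not the lecture notes' own formulation:

```latex
% Generic forms of the two model classes mentioned above
% (illustrative notation, not the book's own).
% Two-stage recourse model: pay c^T x now, plus the expected cost of the best
% second-stage correction y once the random data \xi is revealed.
\[
  \min_{x \in X} \; c^{\top} x
  + \mathbb{E}_{\xi}\Big[ \min_{y \ge 0} \{\, q^{\top} y : W y \ge h(\xi) - T(\xi)\, x \,\} \Big]
\]
% Chance-constraint model: require the random constraints to hold with
% probability at least a prescribed level \alpha.
\[
  \min_{x \in X} \; c^{\top} x
  \quad\text{subject to}\quad
  \mathbb{P}\big( T(\xi)\, x \ge h(\xi) \big) \;\ge\; \alpha .
\]
```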
Elementary treatments of Markov chains, especially those devoted to discrete-time and finite state-space theory, leave the impression that everything is smooth and easy to understand. This exposition of the works of Kolmogorov, Feller, Chung, Kato, and other mathematical luminaries, which focuses on time-continuous chains but is not so far from being elementary itself, reminds us again that the impression is false: an infinite, but denumerable, state-space is where the fun begins. If you have not heard of Blackwell's example (in which all states are instantaneous), do not understand what the minimal process is, or do not know what happens after explosion, dive right in. But beware lest you are enchanted: 'There are more spells than your commonplace magicians ever dreamed of.'
You may like...
Stochastic Analysis of Mixed Fractional… by Yuliya Mishura, Mounir Zili (Hardcover)
Deep Learning, Volume 48 by Arni S.R. Srinivasa Rao, Venu Govindaraju, … (Hardcover), R6,171
Control and Filtering for Semi-Markovian… by Fanbiao Li, Peng Shi, … (Hardcover), R3,316
Ruin Probabilities - Smoothness, Bounds… by Yuliya Mishura, Olena Ragulina (Hardcover), R3,086
Stochastic Processes and Their… by Christo Ananth, N. Anbazhagan, … (Hardcover), R6,687
Stochastic Processes - Estimation… by Kaddour Najim, Enso Ikonen, … (Hardcover), R4,310
Information Geometry, Volume 45 by Arni S.R. Srinivasa Rao, C.R. Rao, … (Hardcover), R6,201