Welcome to Loot.co.za!
The book is aimed at graduate students and researchers with basic knowledge of Probability and Integration Theory. It introduces classical inequalities in vector and functional spaces with applications to probability. It also develops new extensions of the analytical inequalities, with sharper bounds and generalizations to the sum or the supremum of random variables, to martingales and to transformed Brownian motions. The proofs of many new results are presented in great detail. Original tools are developed for spatial point processes and stochastic integration with respect to local martingales in the plane. This second edition covers properties of random variables and time continuous local martingales with a discontinuous predictable compensator, with exponential inequalities and new inequalities for their maximum variable and their p-variations. A chapter on stochastic calculus presents the exponential sub-martingales developed for stationary processes and their properties. Another chapter is devoted to the renewal theory of processes and to semi-Markovian processes, branching processes and shock processes. The Chapman-Kolmogorov equations for strong semi-Markovian processes provide equations for their hitting times in a functional setting which extends the exponential properties of the Markovian processes.
This introductory book enables researchers and students of all backgrounds to compute interrater agreements for nominal data. It presents an overview of available indices, requirements, and steps to be taken in a research project with regard to reliability, preceded by agreement. The book explains the importance of computing the interrater agreement and how to calculate the corresponding indices. Furthermore, it discusses current views on chance expected agreement and problems related to different research situations, so as to help the reader consider what must be taken into account in order to achieve a proper use of the indices. The book offers a practical guide for researchers, Ph.D. and master students, including those without any previous training in statistics (such as in sociology, psychology or medicine), as well as policymakers who have to make decisions based on research outcomes in which these types of indices are used.
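Cohen's kappa is one of the best-known chance-corrected agreement indices for nominal data of the kind the blurb describes. The following sketch is purely illustrative (the function name, the two-rater model, and the sample labels are not taken from the book):

```python
# Illustrative sketch: Cohen's kappa for two raters on a nominal scale.
# Kappa corrects observed agreement for the agreement expected by chance.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters (nominal data)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(round(cohens_kappa(a, b), 3))  # → 0.5
```

Here the raters agree on 6 of 8 items (p_o = 0.75), while chance alone would produce p_e = 0.5, giving kappa = 0.5, which is why raw percent agreement alone can overstate reliability.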
This is a comprehensive survey of the research on the parabolic Anderson model - the heat equation with random potential or the random walk in random potential - over the years 1990-2015. The investigation of this model requires a combination of tools from probability (e.g. large deviations, extreme-value theory) and analysis (e.g. spectral theory for the Laplace operator with potential, variational analysis). We explain the background, the applications, the questions and the connections with other models and formulate the most relevant results on the long-time behavior of the solution, like quenched and annealed asymptotics for the total mass, intermittency, confinement and concentration properties and mass flow. Furthermore, we explain the most successful proof methods and give a list of open research problems. Proofs are not detailed, but concisely outlined and commented; the formulations of some theorems are slightly simplified for better comprehension.
This book presents Markov and quantum processes as two sides of a coin called generated stochastic processes. It deals with quantum processes as reversible stochastic processes generated by one-step unitary operators, while Markov processes are irreversible stochastic processes generated by one-step stochastic operators. The characteristic features of quantum processes are oscillations, interference, many stationary states in bounded systems and possible asymptotic stationary scattering states in open systems, while the characteristic feature of Markov processes is relaxation to a single stationary state. Quantum processes apply to systems where all variables that control reversibility are taken as relevant variables, while Markov processes emerge when some of those variables cannot be followed and are thus irrelevant for the dynamic description. Their absence renders the dynamics irreversible. A further aim is to demonstrate that almost any subdiscipline of theoretical physics can conceptually be put into the context of generated stochastic processes. Classical mechanics and classical field theory are deterministic processes which emerge when fluctuations in relevant variables are negligible. Quantum mechanics and quantum field theory consider genuine quantum processes. Equilibrium and non-equilibrium statistics apply to the regime where relaxing Markov processes emerge from quantum processes by omission of a large number of uncontrollable variables. Systems with many variables often self-organize in such a way that only a few slow variables can serve as relevant variables. Symmetries and topological classes are essential in identifying such relevant variables. The third aim of this book is to provide conceptually general methods of solution which can serve as starting points to find relevant variables and to apply best-practice approximation methods. Such methods are available through generating functionals.
The potential reader is a graduate student who has already taken a course in quantum theory and equilibrium statistical physics, including the mathematics of spectral analysis (eigenvalues, eigenvectors, Fourier and Laplace transformation). The reader should be open to a unifying look at several topics.
Probabilistic models have much to offer to philosophy. We continually receive information from a variety of sources: from our senses, from witnesses, from scientific instruments. When considering whether we should believe this information, we assess whether the sources are independent, how reliable they are, and how plausible and coherent the information is. Bovens and Hartmann provide a systematic Bayesian account of these features of reasoning. Simple Bayesian networks allow us to model alternative assumptions about the nature of the information sources. Measurement of the coherence of information is a controversial matter: arguably, the more coherent a set of information is, the more confident we may be that its content is true, other things being equal. The authors offer a new treatment of coherence which respects this claim and shows its relevance to scientific theory choice. Bovens and Hartmann apply this methodology to a wide range of much-discussed issues regarding evidence, testimony, scientific theories and voting. "Bayesian Epistemology" is for anyone working on probabilistic methods in philosophy, and has broad implications for many other disciplines.
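The core mechanism behind such Bayesian accounts of testimony can be shown with a minimal sketch of Bayes' rule applied to reports from partially reliable, independent witnesses. The reliability model below (truthful report with probability `reliability`, misleading otherwise) and all numbers are illustrative assumptions, not taken from Bovens and Hartmann:

```python
# Illustrative sketch: Bayesian updating on independent witness reports.
def update(prior, reliability):
    """P(H | witness reports H), under a toy model where the witness
    reports truthfully with probability `reliability` and misleadingly
    otherwise (an assumption for illustration, not the book's model)."""
    numer = reliability * prior
    denom = numer + (1 - reliability) * (1 - prior)
    return numer / denom

belief = 0.5                # neutral prior in hypothesis H
for r in (0.8, 0.8):        # two independent witnesses, each 80% reliable
    belief = update(belief, r)
print(round(belief, 3))     # → 0.941
```

Two moderately reliable but independent sources push the posterior well above either source alone (a single report yields 0.8), which is the formal counterpart of the intuition that coherent, independent testimony is confirmatory.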
This thesis develops a systematic, data-based dynamic modeling framework for industrial processes in keeping with the slowness principle. Using said framework as a point of departure, it then proposes novel strategies for dealing with control monitoring and quality prediction problems in industrial production contexts. The thesis reveals the slowly varying nature of industrial production processes under feedback control, and integrates it with process data analytics to offer powerful prior knowledge that gives rise to statistical methods tailored to industrial data. It addresses several issues of immediate interest in industrial practice, including process monitoring, control performance assessment and diagnosis, monitoring system design, and product quality prediction. In particular, it proposes a holistic and pragmatic design framework for industrial monitoring systems, which delivers effective elimination of false alarms, as well as intelligent self-running by fully utilizing the information underlying the data. One of the strengths of this thesis is its integration of insights from statistics, machine learning, control theory and engineering to provide a new scheme for industrial process modeling in the era of big data.
This book provides a general discussion beneficial to librarians and library school students, and demonstrates the steps of the research process, decisions made in the selection of a statistical technique, how to program a computer to perform number crunching, how to compute those statistical techniques appearing most frequently in the literature of library and information science, and examples from the literature of the uses of different statistical techniques. The book accomplishes the following objectives: to provide an overview of the research process and to show where statistics fit in; to identify journals in library and information science most likely to publish research articles; to identify reference tools that provide access to the research literature; to show how microcomputers can be programmed to engage in number crunching; to introduce basic statistical concepts and terminology; to present basic statistical procedures that appear most frequently in the literature of library and information science and that have application to library decision making; to discuss library decision support systems and show the types of statistical techniques they can perform; and to summarize the major decisions that researchers must address in deciding which statistical techniques to employ.
This contributed volume applies spatial and space-time econometric methods to spatial interaction modeling. The first part of the book addresses general cutting-edge methodological questions in spatial econometric interaction modeling, which concern aspects such as coefficient interpretation, constrained estimation, and scale effects. The second part deals with technical solutions to particular estimation issues, such as intraregional flows, Bayesian PPML and VAR estimation. The final part presents a number of empirical applications, ranging from interregional tourism competition and domestic trade to space-time migration modeling and residential relocation.
Statistical Techniques for Transportation Engineering is written with a systematic approach in mind and covers a full range of data analysis topics, from the introductory level (basic probability, measures of dispersion, random variable, discrete and continuous distributions) through more generally used techniques (common statistical distributions, hypothesis testing), to advanced analysis and statistical modeling techniques (regression, AnoVa, and time series). The book also provides worked out examples and solved problems for a wide variety of transportation engineering challenges.
This book shows how to develop efficient quantitative methods to characterize neural data and extra information that reveals underlying dynamics and neurophysiological mechanisms. Written by active experts in the field, it contains an exchange of innovative ideas among researchers at both computational and experimental ends, as well as those at the interface. Authors discuss research challenges and new directions in emerging areas with two goals in mind: to collect recent advances in statistics, signal processing, modeling, and control methods in neuroscience; and to welcome and foster innovative or cross-disciplinary ideas along this line of research and discuss important research issues in neural data analysis. Making use of both tutorial and review materials, this book is written for neural, electrical, and biomedical engineers; computational neuroscientists; statisticians; computer scientists; and clinical engineers.
This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).
With a foreword by Major-General Nico Geerts, Commander Netherlands Defence Academy, Breda, The Netherlands. International conflict resolution increasingly involves the use of non-military power and non-kinetic capabilities alongside military capabilities in the face of hybrid threats. In this book, counter-measures to those threats are addressed by academics with both practical and theoretical experience and knowledge, providing strategic and operational insights into non-kinetic conflict resolution and on the use of power to influence, affect, deter or coerce states and non-state actors. This volume in the NL ARMS series deals with the non-kinetic capabilities to address international crises and conflicts and as always views matters from a global perspective. Included are chapters on the promise, practice and challenges of non-kinetic instruments of power, the instrumentality of soft power, information as a power instrument and manoeuvring in the information environment, Russia's use of deception and misinformation in conflict, applying counter-marketing techniques to fight ISIL, using statistics to profile terrorists, and employing tools such as Actor and Audience Analysis. Such diverse subjects as lawfare, the Law of Armed Conflict rules for non-kinetic cyber attacks, navigation warfare, GPS-spoofing, maritime interception operations, and finally, as a prerequisite, innovative ways for intelligence collection in UN Peacekeeping in Mali come up for discussion. The book will provide professionals such as (foreign) policy makers and those active in the military services, academics at master's level, and those with an interest in military law and the law of armed conflict with useful and up-to-date insights into the wide range of subjects contained within it. Paul A.L. Ducheine and Frans P.B. Osinga are General Officers and full professors at the Faculty of Military Sciences of the Netherlands Defence Academy in Breda, The Netherlands.
Specific to this volume in the series: * Written by academics with both practical and theoretical experience * Addresses countermeasures to hybrid crises * Offers both strategic and operational insights into non-kinetic conflict resolution
Markov process theory is basically an extension of ordinary calculus to accommodate functions whose time evolutions are not entirely deterministic. It is a subject that is becoming increasingly important for many fields of science. This book develops the single-variable theory of both continuous and jump Markov processes in a way that should appeal especially to physicists and chemists at the senior and graduate level.
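The defining property of such processes - the next state depends only on the current one - is easy to illustrate with a minimal simulation. The two-state chain and its transition probabilities below are invented for illustration and are not from the book:

```python
# Illustrative sketch: simulating a two-state Markov chain (made-up example).
import random

P = {  # transition probabilities: state -> {next_state: probability}
    "A": {"A": 0.9, "B": 0.1},
    "B": {"A": 0.5, "B": 0.5},
}

def step(state, rng):
    """Draw the next state from the current state's transition row."""
    r, cum = rng.random(), 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate("A", 10_000)
# The long-run fraction of time in "A" approaches the stationary
# distribution pi = (5/6, 1/6), the solution of pi P = pi.
print(path.count("A") / len(path))
```

Solving pi P = pi by hand gives pi_A = 5/6 ≈ 0.833, and the empirical occupancy of state "A" over 10,000 steps lands close to that value, illustrating relaxation to a single stationary state.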
The monograph compares two approaches that describe the statistical stability phenomenon - one proposed by probability theory, which ignores violations of statistical stability, and another proposed by the theory of hyper-random phenomena, which takes these violations into account. There are five parts. The first describes the phenomenon of statistical stability. The second outlines the mathematical foundations of probability theory. The third develops methods for detecting violations of statistical stability and presents the results of experimental research on actual processes of different physical nature that demonstrate the violations of statistical stability over broad observation intervals. The fourth part outlines the mathematical foundations of the theory of hyper-random phenomena. The fifth part discusses the problem of how to provide an adequate description of the world. The monograph should be of interest to a wide readership: from university students on a first course majoring in physics, engineering, and mathematics to engineers, post-graduate students, and scientists carrying out research on the statistical laws of natural physical phenomena, developing and using statistical methods for high-precision measurement, prediction, and signal processing over broad observation intervals. To read the book, it is sufficient to be familiar with a standard first university course on mathematics.
This book presents new efficient methods for optimization in realistic large-scale, multi-agent systems. These methods do not require the agents to have the full information about the system, but instead allow them to make their local decisions based only on the local information, possibly obtained during communication with their local neighbors. The book, primarily aimed at researchers in optimization and control, considers three different information settings in multi-agent systems: oracle-based, communication-based, and payoff-based. For each of these information types, an efficient optimization algorithm is developed, which leads the system to an optimal state. The optimization problems are set without such restrictive assumptions as convexity of the objective functions, complicated communication topologies, closed-form expressions for costs and utilities, and finiteness of the system's state space.
This book deals with the number-theoretic properties of almost all real numbers. It brings together many different types of result never covered within the same volume before, thus showing interactions and common ideas between different branches of the subject. It provides an indispensable compendium of basic results, important theorems and open problems. Starting from the classical results of Borel, Khintchine and Weyl, normal numbers, Diophantine approximation and uniform distribution are all discussed. Questions are generalized to higher dimensions and various non-periodic problems are also considered (for example restricting approximation to fractions with prime numerator and denominator). Finally, the dimensions of some of the exceptional sets of measure zero are considered.
This volume presents recent advances in the field of matrix analysis based on contributions at the MAT-TRIAD 2015 conference. Topics covered include interval linear algebra and computational complexity, Birkhoff polynomial basis, tensors, graphs, linear pencils, K-theory and statistical inference, showing the ubiquity of matrices in different mathematical areas. With a particular focus on matrix and operator theory, statistical models and computation, the International Conference on Matrix Analysis and its Applications 2015, held in Coimbra, Portugal, was the sixth in a series of conferences. Applied and Computational Matrix Analysis will appeal to graduate students and researchers in theoretical and applied mathematics, physics and engineering who are seeking an overview of recent problems and methods in matrix analysis.
Simulation has now become an integral part of research and development across many fields of study. Despite the large amount of literature in the field of simulation and modeling, one recurring problem is the issue of accuracy and confidence level of constructed models. By outlining new approaches and modern methods of simulation of stochastic processes, this book provides methods and tools for measuring accuracy and reliability in functional spaces. The authors explore the theory of sub-Gaussian (including Gaussian) and square-Gaussian random variables and processes and Cox processes. Methods of simulation of stochastic processes and fields with given accuracy and reliability in some Banach spaces are also considered.
This book was written to serve as a graduate-level textbook for special topics classes in mathematics, statistics, and economics, to introduce these topics to other researchers, and for use in short courses. It is an introduction to the theory of majorization and related notions, and contains detailed material on economic applications of majorization and the Lorenz order, investigating the theoretical aspects of these two interrelated orderings. Revising and expanding on an earlier monograph, Majorization and the Lorenz Order: A Brief Introduction, the authors provide a straightforward development and explanation of majorization concepts, addressing historical development of the topics, and providing up-to-date coverage of families of Lorenz curves. The exposition of multivariate Lorenz orderings sets it apart from existing treatments of these topics. Mathematicians, theoretical statisticians, economists, and other social scientists who already recognize the utility of the Lorenz order in income inequality contexts and arenas will find the book useful for its sound development of relevant concepts rigorously linked to both the majorization literature and the even more extensive body of research on economic applications. Barry C. Arnold, PhD, is Distinguished Professor in the Statistics Department at the University of California, Riverside. He is a Fellow of the American Statistical Association, the American Association for the Advancement of Science, and the Institute of Mathematical Statistics, and is an elected member of the International Statistical Institute. He is the author of more than two hundred publications and eight books. Jose Maria Sarabia, PhD, is Professor of Statistics and Quantitative Methods in Business and Economics in the Department of Economics at the University of Cantabria, Spain.
He is author of more than one hundred and fifty publications and ten books and is an associate editor of several journals including TEST, Communications in Statistics, and Journal of Statistical Distributions and Applications.
This book presents data privacy protection, which has been extensively applied in our current era of big data. However, research into big data privacy is still in its infancy. Given the fact that existing protection methods can result in low data utility and unbalanced trade-offs, personalized privacy protection has become a rapidly expanding research topic. In this book, the authors explore emerging threats and existing privacy protection methods, and discuss in detail both the advantages and disadvantages of personalized privacy protection. Traditional methods, such as differential privacy and cryptography, are discussed using a comparative and intersectional approach, and are contrasted with emerging methods like federated learning and generative adversarial nets. The advances discussed cover various applications, e.g. cyber-physical systems, social networks, and location-based services. Given its scope, the book is of interest to scientists, policy-makers, researchers, and postgraduates alike.
This book presents state-of-the-art probabilistic methods for the reliability analysis and design of engineering products and processes. It seeks to facilitate practical application of probabilistic analysis and design by providing an authoritative, in-depth, and practical description of what probabilistic analysis and design is and how it can be implemented. The text is packed with many practical engineering examples (e.g., electric power transmission systems, aircraft power generating systems, and mechanical transmission systems) and exercise problems. It is an up-to-date, fully illustrated reference suitable for both undergraduate and graduate engineering students, researchers, and professional engineers who are interested in exploring the fundamentals, implementation, and applications of probabilistic analysis and design methods.
Provides the necessary skills to solve problems in mathematical statistics through theory, concrete examples, and exercises. With a clear and detailed approach to the fundamentals of statistical theory, Examples and Problems in Mathematical Statistics uniquely bridges the gap between theory and application and presents numerous problem-solving examples that illustrate the related notations and proven results. Written by an established authority in probability and mathematical statistics, each chapter begins with a theoretical presentation to introduce both the topic and the important results in an effort to aid in overall comprehension. Examples are then provided, followed by problems, and finally, solutions to some of the earlier problems. In addition, Examples and Problems in Mathematical Statistics features: * Over 160 practical and interesting real-world examples from a variety of fields including engineering, mathematics, and statistics to help readers become proficient in theoretical problem solving * More than 430 unique exercises with select solutions * Key statistical inference topics, such as probability theory, statistical distributions, sufficient statistics, information in samples, testing statistical hypotheses, statistical estimation, confidence and tolerance intervals, large sample theory, and Bayesian analysis. Recommended for graduate-level courses in probability and statistical inference, Examples and Problems in Mathematical Statistics is also an ideal reference for applied statisticians and researchers.
John E. Freund's Mathematical Statistics with Applications, Eighth Edition, provides a calculus-based introduction to the theory and application of statistics, based on comprehensive coverage that reflects the latest in statistical thinking, the teaching of statistics, and current practices.