This book presents Markov and quantum processes as two sides of a coin called generated stochastic processes. It deals with quantum processes as reversible stochastic processes generated by one-step unitary operators, while Markov processes are irreversible stochastic processes generated by one-step stochastic operators. The characteristic features of quantum processes are oscillations, interference, many stationary states in bounded systems, and possible asymptotic stationary scattering states in open systems, while the characteristic feature of Markov processes is relaxation to a single stationary state. Quantum processes apply to systems where all variables that control reversibility are taken as relevant variables, while Markov processes emerge when some of those variables cannot be followed and are thus irrelevant to the dynamic description. Their absence renders the dynamics irreversible. A further aim is to demonstrate that almost any subdiscipline of theoretical physics can conceptually be put into the context of generated stochastic processes. Classical mechanics and classical field theory are deterministic processes which emerge when fluctuations in relevant variables are negligible. Quantum mechanics and quantum field theory consider genuine quantum processes. Equilibrium and non-equilibrium statistics apply to the regime where relaxing Markov processes emerge from quantum processes by omission of a large number of uncontrollable variables. Systems with many variables often self-organize in such a way that only a few slow variables can serve as relevant variables. Symmetries and topological classes are essential in identifying such relevant variables. The third aim of this book is to provide conceptually general methods of solution which can serve as starting points to find relevant variables and to apply best-practice approximation methods. Such methods are available through generating functionals.
The potential reader is a graduate student who has already taken a course in quantum theory and equilibrium statistical physics, including the mathematics of spectral analysis (eigenvalues, eigenvectors, Fourier and Laplace transformation). The reader should be open to a unifying view of several topics.
This book was written to serve as a graduate-level textbook for special topics classes in mathematics, statistics, and economics, to introduce these topics to other researchers, and for use in short courses. It is an introduction to the theory of majorization and related notions, and contains detailed material on economic applications of majorization and the Lorenz order, investigating the theoretical aspects of these two interrelated orderings. Revising and expanding on an earlier monograph, Majorization and the Lorenz Order: A Brief Introduction, the authors provide a straightforward development and explanation of majorization concepts, address the historical development of the topics, and provide up-to-date coverage of families of Lorenz curves. The exposition of multivariate Lorenz orderings sets it apart from existing treatments of these topics. Mathematicians, theoretical statisticians, economists, and other social scientists who already recognize the utility of the Lorenz order in income inequality contexts will find the book useful for its sound development of relevant concepts rigorously linked to both the majorization literature and the even more extensive body of research on economic applications. Barry C. Arnold, PhD, is Distinguished Professor in the Statistics Department at the University of California, Riverside. He is a Fellow of the American Statistical Association, the American Association for the Advancement of Science, and the Institute of Mathematical Statistics, and is an elected member of the International Statistical Institute. He is the author of more than two hundred publications and eight books. Jose Maria Sarabia, PhD, is Professor of Statistics and Quantitative Methods in Business and Economics in the Department of Economics at the University of Cantabria, Spain.
He is author of more than one hundred and fifty publications and ten books and is an associate editor of several journals including TEST, Communications in Statistics, and Journal of Statistical Distributions and Applications.
This book includes the texts of the survey lectures given by plenary speakers at the 11th International ISAAC Congress held in Växjö, Sweden, from 14 to 18 August 2017. It is the purpose of ISAAC to promote analysis, its applications, and its interaction with computation. Analysis is understood here in the broad sense of the word, including differential equations, integral equations, functional analysis, and function theory. With this objective, ISAAC organizes international congresses for the presentation and discussion of research on analysis. The plenary lectures in the present volume, authored by eminent specialists, are devoted to some exciting recent developments, on topics including: local solvability for subprincipal type operators; fractional-order Laplacians; degenerate complex vector fields in the plane; lower bounds for pseudo-differential operators; a survey on Morrey spaces; and localization operators in signal theory and quantum mechanics. Thanks to the accessible style used, readers only need a basic command of calculus. This book will appeal to scientists, teachers, and graduate students in mathematics, in particular mathematical analysis, probability and statistics, numerical analysis and mathematical physics.
Probabilistic models have much to offer to philosophy. We continually receive information from a variety of sources: from our senses, from witnesses, from scientific instruments. When considering whether we should believe this information, we assess whether the sources are independent, how reliable they are, and how plausible and coherent the information is. Bovens and Hartmann provide a systematic Bayesian account of these features of reasoning. Simple Bayesian networks allow us to model alternative assumptions about the nature of the information sources. Measurement of the coherence of information is a controversial matter: arguably, the more coherent a set of information is, the more confident we may be that its content is true, other things being equal. The authors offer a new treatment of coherence which respects this claim and shows its relevance to scientific theory choice. Bovens and Hartmann apply this methodology to a wide range of much-discussed issues regarding evidence, testimony, scientific theories and voting. "Bayesian Epistemology" is for anyone working on probabilistic methods in philosophy, and has broad implications for many other disciplines.
This book provides a general discussion beneficial to librarians and library school students, and demonstrates the steps of the research process, decisions made in the selection of a statistical technique, how to program a computer to perform number crunching, how to compute those statistical techniques appearing most frequently in the literature of library and information science, and examples from the literature of the uses of different statistical techniques. The book accomplishes the following objectives: to provide an overview of the research process and to show where statistics fit in; to identify journals in library and information science most likely to publish research articles; to identify reference tools that provide access to the research literature; to show how microcomputers can be programmed to engage in number crunching; to introduce basic statistical concepts and terminology; to present basic statistical procedures that appear most frequently in the literature of library and information science and that have application to library decision making; to discuss library decision support systems and show the types of statistical techniques they can perform; and to summarize the major decisions that researchers must address in deciding which statistical techniques to employ.
This thesis develops a systematic, data-based dynamic modeling framework for industrial processes in keeping with the slowness principle. Using said framework as a point of departure, it then proposes novel strategies for dealing with control monitoring and quality prediction problems in industrial production contexts. The thesis reveals the slowly varying nature of industrial production processes under feedback control, and integrates it with process data analytics to offer powerful prior knowledge that gives rise to statistical methods tailored to industrial data. It addresses several issues of immediate interest in industrial practice, including process monitoring, control performance assessment and diagnosis, monitoring system design, and product quality prediction. In particular, it proposes a holistic and pragmatic design framework for industrial monitoring systems, which delivers effective elimination of false alarms, as well as intelligent self-running by fully utilizing the information underlying the data. One of the strengths of this thesis is its integration of insights from statistics, machine learning, control theory and engineering to provide a new scheme for industrial process modeling in the era of big data.
Simulation has now become an integral part of research and development across many fields of study. Despite the large body of literature in the field of simulation and modeling, one recurring problem is the issue of accuracy and confidence level of constructed models. By outlining new approaches and modern methods for the simulation of stochastic processes, this book provides methods and tools for measuring accuracy and reliability in functional spaces. The authors explore the theory of sub-Gaussian (including Gaussian) and square-Gaussian random variables and processes, as well as Cox processes. Methods of simulation of stochastic processes and fields with given accuracy and reliability in some Banach spaces are also considered.
This book presents state-of-the-art probabilistic methods for the reliability analysis and design of engineering products and processes. It seeks to facilitate practical application of probabilistic analysis and design by providing an authoritative, in-depth, and practical description of what probabilistic analysis and design is and how it can be implemented. The text is packed with many practical engineering examples (e.g., electric power transmission systems, aircraft power generating systems, and mechanical transmission systems) and exercise problems. It is an up-to-date, fully illustrated reference suitable for both undergraduate and graduate engineering students, researchers, and professional engineers who are interested in exploring the fundamentals, implementation, and applications of probabilistic analysis and design methods.
Modeling Uncertainty: An Examination of Stochastic Theory, Methods, and Applications is a volume undertaken by the friends and colleagues of Sid Yakowitz in his honor. Fifty internationally known scholars have collectively contributed 30 papers on modeling uncertainty to this volume. Each of these papers was carefully reviewed, and in the majority of cases the original submission was revised before being accepted for publication in the book. The papers cover a great variety of topics in probability, statistics, economics, stochastic optimization, control theory, regression analysis, simulation, stochastic programming, Markov decision processes, applications in the HIV context, and others. There are papers with a theoretical emphasis and others that focus on applications. A number of papers survey the work in a particular area, and in a few papers the authors present their personal view of a topic. It is a book with a considerable number of expository articles, which are accessible to a nonexpert - a graduate student in mathematics, statistics, engineering, or economics, or anyone with some mathematical background who is interested in a preliminary exposition of a particular topic. Many of the papers present the state of the art of a specific area or represent original contributions which advance the present state of knowledge. In sum, it is a book of considerable interest to a broad range of academic researchers and students of stochastic systems.
Markov process theory is basically an extension of ordinary calculus to accommodate functions whose time evolutions are not entirely deterministic. It is a subject that is becoming increasingly important for many fields of science. This book develops the single-variable theory of both continuous and jump Markov processes in a way that should appeal especially to physicists and chemists at the senior and graduate level.
The monograph compares two approaches that describe the statistical stability phenomenon - one proposed by probability theory, which ignores violations of statistical stability, and another proposed by the theory of hyper-random phenomena, which takes these violations into account. There are five parts. The first describes the phenomenon of statistical stability. The second outlines the mathematical foundations of probability theory. The third develops methods for detecting violations of statistical stability and presents the results of experimental research on actual processes of different physical nature that demonstrate violations of statistical stability over broad observation intervals. The fourth part outlines the mathematical foundations of the theory of hyper-random phenomena. The fifth part discusses the problem of how to provide an adequate description of the world. The monograph should be of interest to a wide readership: from first-year university students majoring in physics, engineering, and mathematics to engineers, post-graduate students, and scientists carrying out research on the statistical laws of natural physical phenomena, developing and using statistical methods for high-precision measurement, prediction, and signal processing over broad observation intervals. To read the book, it is sufficient to be familiar with a standard first university course on mathematics.
This is a comprehensive survey of the research on the parabolic Anderson model - the heat equation with random potential, or the random walk in random potential - from the years 1990-2015. The investigation of this model requires a combination of tools from probability (e.g., large deviations and extreme-value theory) and analysis (e.g., spectral theory for the Laplace operator with potential and variational analysis). We explain the background, the applications, the questions and the connections with other models, and formulate the most relevant results on the long-time behavior of the solution, such as quenched and annealed asymptotics for the total mass, intermittency, confinement and concentration properties, and mass flow. Furthermore, we explain the most successful proof methods and give a list of open research problems. Proofs are not detailed, but concisely outlined and commented; the formulations of some theorems are slightly simplified for better comprehension.
This book collects research papers on the philosophical foundations of probability, causality, spacetime and quantum theory. The papers are related to talks presented in six successive workshops organized by The Budapest-Krakow Research Group on Probability, Causality and Determinism. Coverage consists of three parts. Part I focuses on the notion of probability from a general philosophical and formal epistemological perspective. Part II applies probabilistic considerations to address causal questions in the foundations of quantum mechanics. Part III investigates the question of indeterminism in spacetime theories. It also explores some related questions, such as decidability and observation. The contributing authors are all philosophers of science with a strong background in mathematics or physics. They believe that paying attention to the finer formal details often helps avoid pitfalls that exacerbate the philosophical problems at the center of contemporary research. The papers presented here help make explicit the mathematical-structural assumptions that underlie key philosophical argumentations. This formally rigorous and conceptually precise approach will appeal to researchers and philosophers as well as mathematicians and statisticians.
John E. Freund's Mathematical Statistics with Applications, Eighth Edition, provides a calculus-based introduction to the theory and application of statistics, based on comprehensive coverage that reflects the latest in statistical thinking, the teaching of statistics, and current practices.
This book deals with the number-theoretic properties of almost all real numbers. It brings together many different types of result never covered within the same volume before, thus showing interactions and common ideas between different branches of the subject. It provides an indispensable compendium of basic results, important theorems and open problems. Starting from the classical results of Borel, Khintchine and Weyl, normal numbers, Diophantine approximation and uniform distribution are all discussed. Questions are generalized to higher dimensions and various non-periodic problems are also considered (for example restricting approximation to fractions with prime numerator and denominator). Finally, the dimensions of some of the exceptional sets of measure zero are considered.
This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and government organizations.
This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).
This book shows how to develop efficient quantitative methods to characterize neural data and extra information that reveals underlying dynamics and neurophysiological mechanisms. Written by active experts in the field, it contains an exchange of innovative ideas among researchers at both computational and experimental ends, as well as those at the interface. Authors discuss research challenges and new directions in emerging areas with two goals in mind: to collect recent advances in statistics, signal processing, modeling, and control methods in neuroscience; and to welcome and foster innovative or cross-disciplinary ideas along this line of research and discuss important research issues in neural data analysis. Making use of both tutorial and review materials, this book is written for neural, electrical, and biomedical engineers; computational neuroscientists; statisticians; computer scientists; and clinical engineers.
Provides the necessary skills to solve problems in mathematical statistics through theory, concrete examples, and exercises. With a clear and detailed approach to the fundamentals of statistical theory, Examples and Problems in Mathematical Statistics uniquely bridges the gap between theory and application and presents numerous problem-solving examples that illustrate the related notations and proven results. Written by an established authority in probability and mathematical statistics, each chapter begins with a theoretical presentation to introduce both the topic and the important results in an effort to aid in overall comprehension. Examples are then provided, followed by problems, and finally, solutions to some of the earlier problems. In addition, Examples and Problems in Mathematical Statistics features:
* Over 160 practical and interesting real-world examples from a variety of fields including engineering, mathematics, and statistics to help readers become proficient in theoretical problem solving
* More than 430 unique exercises with select solutions
* Key statistical inference topics, such as probability theory, statistical distributions, sufficient statistics, information in samples, testing statistical hypotheses, statistical estimation, confidence and tolerance intervals, large sample theory, and Bayesian analysis
Recommended for graduate-level courses in probability and statistical inference, Examples and Problems in Mathematical Statistics is also an ideal reference for applied statisticians and researchers.
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations - the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers and scientists in general who study the phenomenon of statistical stability and use statistical methods for high-precision measurements, prediction, and signal processing over long observation intervals.
This is the second of a two-part guide to quantitative analysis using the IBM SPSS Statistics software package; this volume focuses on multivariate statistical methods and advanced forecasting techniques. More often than not, regression models involve more than one independent variable. For example, forecasting methods are commonly applied to aggregates such as inflation rates, unemployment, and exchange rates, which have complex relationships with determining variables. This book introduces multivariate regression models and provides examples to help readers understand the theory underpinning the models. The book presents the fundamentals of multivariate regression and then moves on to examine several related techniques that have application in business-orientated fields, such as logistic and multinomial regression. Forecasting tools such as the Box-Jenkins approach to time series modeling are introduced, as well as exponential smoothing and naive techniques. This part also covers topics of current interest such as factor analysis, discriminant analysis and multidimensional scaling (MDS).
The Conference on Traffic and Granular Flow brings together international researchers from different fields ranging from physics to computer science and engineering to discuss the latest developments in traffic-related systems. Originally conceived to facilitate new ideas by considering the similarities of traffic and granular flow, TGF'15, organised by Delft University of Technology, now covers a broad range of topics related to driven particle and transport systems. Besides the classical topics of granular flow and highway traffic, its scope includes data transport (Internet traffic), pedestrian and evacuation dynamics, intercellular transport, swarm behaviour and the collective dynamics of other biological systems. Recent advances in modelling, computer simulation and phenomenology are presented, and prospects for applications, for example to traffic control, are discussed. The conference explores the interrelations between the above-mentioned fields and offers the opportunity to stimulate interdisciplinary research, exchange ideas, and meet many experts in these areas of research.
This book presents new efficient methods for optimization in realistic large-scale, multi-agent systems. These methods do not require the agents to have the full information about the system, but instead allow them to make their local decisions based only on the local information, possibly obtained during communication with their local neighbors. The book, primarily aimed at researchers in optimization and control, considers three different information settings in multi-agent systems: oracle-based, communication-based, and payoff-based. For each of these information types, an efficient optimization algorithm is developed, which leads the system to an optimal state. The optimization problems are set without such restrictive assumptions as convexity of the objective functions, complicated communication topologies, closed-form expressions for costs and utilities, and finiteness of the system's state space.
This book is intended for use in advanced graduate courses in statistics and machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. In this way, computational models in neuroscience are not only explanatory frameworks, but become powerful, quantitative data-analytical tools in themselves that enable researchers to look beyond the data surface and unravel underlying mechanisms. Interactive examples of most methods are provided through a package of MATLAB routines, encouraging a playful approach to the subject, and providing readers with a better feel for the practical aspects of the methods covered. "Computational neuroscience is essential for integrating and providing a basis for understanding the myriads of remarkable laboratory data on nervous system functions. Daniel Durstewitz has excellently covered the breadth of computational neuroscience from statistical interpretations of data to biophysically based modeling of the neurobiological sources of those data. His presentation is clear, pedagogically sound, and readily useable by experts and beginners alike.
It is a pleasure to recommend this very well crafted discussion to experimental neuroscientists as well as mathematically well-versed physicists. The book acts as a window to the issues, to the questions, and to the tools for finding the answers to interesting inquiries about brains and how they function." Henry D. I. Abarbanel, Physics and Scripps Institution of Oceanography, University of California, San Diego. "This book delivers a clear and thorough introduction to sophisticated analysis approaches useful in computational neuroscience. The models described and the examples provided will help readers develop critical intuitions into what the methods reveal about data. The overall approach of the book reflects the extensive experience Prof. Durstewitz has developed as a leading practitioner of computational neuroscience." Bruno B. Averbeck
This volume presents recent advances in the field of matrix analysis based on contributions at the MAT-TRIAD 2015 conference. Topics covered include interval linear algebra and computational complexity, Birkhoff polynomial basis, tensors, graphs, linear pencils, K-theory and statistical inference, showing the ubiquity of matrices in different mathematical areas. With a particular focus on matrix and operator theory, statistical models and computation, the International Conference on Matrix Analysis and its Applications 2015, held in Coimbra, Portugal, was the sixth in a series of conferences. Applied and Computational Matrix Analysis will appeal to graduate students and researchers in theoretical and applied mathematics, physics and engineering who are seeking an overview of recent problems and methods in matrix analysis.