Maximum entropy and Bayesian methods have fundamental, central roles in scientific inference and, with the growing availability of computer power, are being successfully applied in an increasing number of applications in many disciplines. This volume contains selected papers presented at the Thirteenth International Workshop on Maximum Entropy and Bayesian Methods. It includes an extensive tutorial section and a variety of contributions detailing applications in the physical sciences, engineering, law, and economics. Audience: Researchers and other professionals whose work requires the application of practical statistical inference.
Point processes and random measures find wide applicability in telecommunications, earthquakes, image analysis, spatial point patterns, and stereology, to name but a few areas. The authors have substantially reshaped the material of their first edition of 1988 and now present their Introduction to the Theory of Point Processes in two volumes with subtitles Elementary Theory and Models and General Theory and Structure. Volume One contains the introductory chapters from the first edition, together with an informal treatment of some of the later material intended to make it more accessible to readers primarily interested in models and applications. The main new material in this volume relates to marked point processes and to processes evolving in time, where the conditional intensity methodology provides a basis for model building, inference, and prediction. There are abundant examples whose purpose is both didactic and illustrative of further applications of the ideas and models that are the main substance of the text. Volume Two returns to the general theory, with additional material on marked and spatial processes. The necessary mathematical background is reviewed in appendices located in Volume One. Daryl Daley is a Senior Fellow in the Centre for Mathematics and Applications at the Australian National University, with research publications in a diverse range of applied probability models and their analysis; he is co-author with Joe Gani of an introductory text in epidemic modelling. David Vere-Jones is an Emeritus Professor at Victoria University of Wellington, widely known for his contributions to Markov chains, point processes, applications in seismology, and statistical education. He is a fellow and Gold Medallist of the Royal Society of New Zealand, and a director of the consulting group "Statistical Research Associates."
This book explains how to analyze independent data from factorial designs without having to make restrictive assumptions, such as normality of the data, or equal variances. The general approach also allows for ordinal and even dichotomous data. The underlying effect size is the nonparametric relative effect, which has a simple and intuitive probability interpretation. The data analysis is presented as comprehensively as possible, including appropriate descriptive statistics which follow a nonparametric paradigm, as well as corresponding inferential methods using hypothesis tests and confidence intervals based on pseudo-ranks. Offering clear explanations, an overview of the modern rank- and pseudo-rank-based inference methodology and numerous illustrations with real data examples, as well as the necessary R/SAS code to run the statistical analyses, this book is a valuable resource for statisticians and practitioners alike.
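The nonparametric relative effect mentioned above has a concrete probability interpretation: for two independent observations X and Y from different groups, it is p = P(X < Y) + 0.5 P(X = Y), estimable from midranks. A minimal sketch of that estimator follows; the function name and approach are illustrative, not the book's R/SAS code.

```python
# Minimal sketch (not from the book): estimating the nonparametric
# relative effect p = P(X < Y) + 0.5 * P(X = Y) from two samples,
# using midranks to handle ties. Names here are illustrative.
def relative_effect(x, y):
    combined = sorted(list(x) + list(y))

    def midrank(v):
        # Average rank of value v in the combined sample (ties get midranks).
        below = sum(1 for c in combined if c < v)
        equal = sum(1 for c in combined if c == v)
        return below + (equal + 1) / 2

    # Classical rank-based estimator of the relative effect.
    mean_rank_y = sum(midrank(v) for v in y) / len(y)
    return (mean_rank_y - (len(y) + 1) / 2) / len(x)

print(relative_effect([1, 2, 3], [4, 5, 6]))  # y stochastically larger -> 1.0
print(relative_effect([1, 2], [1, 2]))        # identical samples -> 0.5
```

A value of 0.5 means neither group tends to produce larger observations; values near 0 or 1 indicate a strong stochastic ordering.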
This monograph is an attempt to unify existing works in the field of random sets, random variables, and linguistic random variables with respect to statistical analysis. It is intended to be a tutorial research compendium. The material of the work is mainly based on the postdoctoral thesis (Habilitationsschrift) of the first author and on several papers recently published by both authors. The methods form the basis of a user-friendly software tool which supports statistical inference in the presence of vague data. Parts of the manuscript have been used in courses for graduate level students of mathematics and computer sciences held by the first author at the Technical University of Braunschweig. The textbook is designed for readers with an advanced knowledge of mathematics. The idea of writing this book came from Professor Dr. H. Skala. Several of our students have significantly contributed to its preparation. We would like to express our gratitude to Reinhard Elsner for his support in typesetting the book, Jörg Gebhardt and Jörg Knop for preparing the drawings, Michael Eike and Jürgen Freckmann for implementing the programming system and Günter Lehmann and Winfried Boer for proofreading the manuscript. This work was partially supported by the Fraunhofer-Gesellschaft. We are indebted to D. Reidel Publishing Company for making the publication of this book possible and would especially like to acknowledge the support which we received from our families on this project.
This book originated from our interest in sea surface temperature variability. Our initial, though entirely pragmatic, goal was to derive adequate mathematical tools for handling certain oceanographic problems. Eventually, however, these considerations went far beyond oceanographic applications, partly because one of the authors is a mathematician. We found that many theoretical issues of turbulent transport problems had been repeatedly discussed in fields of hydrodynamics, plasma and solid matter physics, and mathematics itself. There are few monographs concerned with turbulent diffusion in the ocean (Csanady 1973, Okubo 1980, Monin and Ozmidov 1988). While selecting material for this book we focused, first, on theoretical issues that could be helpful for understanding mixture processes in the ocean, and, second, on our own contribution to the problem. Mathematically all of the issues addressed in this book are concentrated around a single linear equation: the stochastic advection-diffusion equation. There is no attempt to derive universal statistics for turbulent flow. Instead, the focus is on a statistical description of a passive scalar (tracer) under given velocity statistics. As for applications, this book addresses only one phenomenon: transport of sea surface temperature anomalies. Hopefully, however, our two main approaches are applicable to other subjects.
Small noise is a good noise. In this work, we are interested in the problems of estimation theory concerned with observations of the diffusion-type process

dX_t = S_t(X) dt + ε dW_t,  X_0 = x_0,  0 ≤ t ≤ T,   (0.1)

where W is a standard Wiener process and S_t(·) is some nonanticipative smooth function. By the observations X = {X_t, 0 ≤ t ≤ T} of this process, we will solve some of the problems of identification, both parametric and nonparametric. If the trend S(·) is known up to the value of some finite-dimensional parameter, S_t(X) = S_t(θ, X), where θ ∈ Θ ⊂ R^d, then we have a parametric case. The nonparametric problems arise if we know only the degree of smoothness of the function S_t(X), 0 ≤ t ≤ T, with respect to time t. It is supposed that the diffusion coefficient ε is always known. In the parametric case, we describe the asymptotic properties of the maximum likelihood (MLE), Bayes (BE) and minimum distance (MDE) estimators as ε → 0, and in the nonparametric situation, we investigate some kernel-type estimators of the unknown functions (say, S_t(·), 0 ≤ t ≤ T). The asymptotics in such problems of estimation for this scheme of observations was usually considered as T → ∞, because this limit is a direct analog of the traditional limit (n → ∞) in the classical mathematical statistics of i.i.d. observations. The limit ε → 0 in (0.1) is interesting for the following reasons.
For surveys involving sensitive questions, randomized response techniques (RRTs) and other indirect questions are helpful in obtaining survey responses while maintaining the privacy of the respondents. Written by one of the world's leading experts on randomized response (RR), Randomized Response and Indirect Questioning Techniques in Surveys describes the current state of RR as well as emerging developments in the field. The author also explains how to extend RR to situations employing unequal probability sampling. While the theory of RR has grown phenomenally, the area has not kept pace in practice. Covering both theory and practice, the book first discusses replacing a direct response (DR) with an RR in a simple random sample with replacement (SRSWR). It then emphasizes how the application of RRTs in the estimation of attribute or quantitative features is valid for selecting respondents in a general manner. The author examines different ways to treat maximum likelihood estimation; covers optional RR devices, which provide alternatives to compulsory randomized response theory; and presents RR techniques that encompass quantitative variables, including those related to stigmatizing characteristics. He also gives his viewpoint on alternative RR techniques, including the item count technique, nominative technique, and three-card method.
The book is a collection of research level surveys on certain topics in probability theory, which will be of interest to graduate students and researchers.
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatic approaches. This of course raises the problem of how the abstract calculus of probability should be connected with the actual world of experiments. It is suggested that this link should be established, not by a definition of probability, but by an application of Popper's concept of falsifiability. In addition to formulating his own interesting theory, Dr Gillies gives a detailed criticism of the generally accepted Neyman-Pearson theory of testing, as well as of alternative philosophical approaches to probability theory. The reissue will be of interest both to philosophers with no previous knowledge of probability theory and to mathematicians interested in the foundations of probability theory and statistics.
Principal component analysis is central to the study of multivariate data. Although one of the earliest multivariate techniques, it continues to be the subject of much research, ranging from new model-based approaches to algorithmic ideas from neural networks. It is extremely versatile, with applications in many disciplines. The first edition of this book was the first comprehensive text written solely on principal component analysis. The second edition updates and substantially expands the original version, and is once again the definitive text on the subject. It includes core material, current research and a wide range of applications. Its length is nearly double that of the first edition. Researchers in statistics, or in other fields that use principal component analysis, will find that the book gives an authoritative yet accessible account of the subject. It is also a valuable resource for graduate courses in multivariate analysis. The book requires some knowledge of matrix algebra. Ian Jolliffe is Professor of Statistics at the University of Aberdeen. He is author or co-author of over 60 research papers and three other books. His research interests are broad, but aspects of principal component analysis have fascinated him and kept him busy for over 30 years.
The volume presents extensive research devoted to a broad spectrum of mathematical analysis and probability theory. Subjects discussed in this work are those treated in the so-called Strasbourg-Zurich Meetings. These meetings occur twice yearly in each of the cities, Strasbourg and Zurich, venues of vibrant mathematical communication and worldwide gatherings. The topical scope of the book includes the study of monochromatic random waves defined for general Riemannian manifolds, notions of entropy related to a compact manifold of negative curvature, interacting electrons in a random background, ℓp-cohomology (in degree one) of a graph and its connections with other topics, limit operators for circular ensembles, polyharmonic functions for finite graphs and Markov chains, the ETH approach to quantum mechanics, 2-dimensional quantum Yang-Mills theory, Gibbs measures of nonlinear Schrödinger equations, and interfaces in spectral asymptotics and nodal sets. Contributions in this work are written by experts from the international community, who present state-of-the-art research on the corresponding problems. This volume is expected to be a valuable resource to both graduate students and research mathematicians working in analysis and probability, as well as their interconnections and applications.
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of the state space, where the invariant ergodic probability measures play a significant role. Other topics covered include: characterizations of the supports of various types of invariant probability measures and the use of these to obtain criteria for unique ergodicity, and the proofs of two mean ergodic theorems for a certain type of transition functions. The book will be of interest to mathematicians working in ergodic theory, dynamical systems, or the theory of Markov processes. Biologists, physicists and economists interested in interacting particle systems and rigorous mathematics will also find this book a valuable resource. Parts of it are suitable for advanced graduate courses. Prerequisites are basic notions and results on functional analysis, general topology, measure theory, the Bochner integral and some of its applications.
This book has a dual purpose. One of these is to present material which selectively will be appropriate for a quarter or semester course in time series analysis and which will cover both the finite parameter and spectral approach. The second object is the presentation of topics of current research interest and some open questions. I mention these now. In particular, there is a discussion in Chapter III of the types of limit theorems that will imply asymptotic normality for covariance estimates and smoothings of the periodogram. This discussion allows one to get results on the asymptotic distribution of finite parameter estimates that are broader than those usually given in the literature in Chapter IV. A derivation of the asymptotic distribution for spectral (second order) estimates is given under an assumption of strong mixing in Chapter V. A discussion of higher order cumulant spectra and their large sample properties under appropriate moment conditions follows in Chapter VI. Probability density, conditional probability density and regression estimates are considered in Chapter VII under conditions of short range dependence. Chapter VIII deals with a number of topics. At first, estimates for the structure function of a large class of non-Gaussian linear processes are constructed. One can determine much more about this structure or transfer function in the non-Gaussian case than one can for Gaussian processes. In particular, one can determine almost all the phase information.
This book is an English translation of the last French edition of Bourbaki's Fonctions d'une Variable Réelle. The first chapter is devoted to derivatives, Taylor expansions, the finite increments theorem, and convex functions. In the second chapter, primitives and integrals (on arbitrary intervals) are studied, as well as their dependence on parameters. Classical functions (exponential, logarithmic, circular and inverse circular) are investigated in the third chapter. The fourth chapter gives a thorough treatment of differential equations (existence and uniqueness properties of solutions, approximate solutions, dependence on parameters) and of systems of linear differential equations. The local study of functions (comparison relations, asymptotic expansions) is treated in chapter V, with an appendix on Hardy fields. The theory of generalized Taylor expansions and the Euler-Maclaurin formula are presented in the sixth chapter, and applied in the last one to the study of the Gamma function on the real line as well as on the complex plane. Although the topics of the book are mainly of an advanced undergraduate level, they are presented in the generality needed for more advanced purposes: functions are allowed to take values in topological vector spaces, asymptotic expansions are treated on a filtered set equipped with a comparison scale, theorems on the dependence on parameters of differential equations are directly applicable to the study of flows of vector fields on differential manifolds, etc.
Scientists and engineers often have to deal with systems that exhibit random or unpredictable elements and must effectively evaluate probabilities in each situation. Computer simulations, while the traditional tool used to solve such problems, are limited in the scale and complexity of the problems they can solve. Formalized Probability Theory and Applications Using Theorem Proving discusses some of the limitations inherent in computer systems when applied to problems of probabilistic analysis, and presents a novel solution to these limitations, combining higher-order logic with computer-based theorem proving. Combining practical application with theoretical discussion, this book is an important reference tool for mathematicians, scientists, engineers, and researchers in all STEM fields.
This book introduces the basic concepts and methods that are useful in the statistical analysis and modeling of the DNA-based marker and phenotypic data that arise in agriculture, forestry, experimental biology, and other fields. It concentrates on the linkage analysis of markers, map construction and quantitative trait locus (QTL) mapping, and assumes a background in regression analysis and maximum likelihood approaches. The strength of this book lies in the construction of general models and algorithms for linkage analysis, as well as in QTL mapping in any kind of crossed pedigrees initiated with inbred lines of crops.
A self-contained treatment of stochastic processes arising from models for queues, insurance risk, and dams and data communication, using their sample function properties. The approach is based on the fluctuation theory of random walks, Lévy processes, and Markov-additive processes, in which Wiener-Hopf factorisation plays a central role. This second edition includes results for the virtual waiting time and queue length in single server queues, while the treatment of continuous time storage processes is thoroughly revised and simplified. With its prerequisite of a graduate-level course in probability and stochastic processes, this book can be used as a text for an advanced course on applied probability models.
This volume is devoted to stochastic and chaotic oscillations in dissipative systems. It first deals with mathematical models of deterministic, discrete and distributed dynamical systems. It then considers the two basic trends of order and chaos, and describes stochasticity transformers, amplifiers and generators, turbulence and phase portraits of steady-state motions and their bifurcations. The book also treats the topics of stochastic and chaotic attractors, as well as the routes to chaos and the quantitative characteristics of stochastic and chaotic motions. Finally, in a chapter which comprises more than one-third of the book, examples are presented of systems having chaotic and stochastic motions drawn from mechanical, physical, chemical and biological systems.
Clustered survival data are encountered in many scientific disciplines including human and veterinary medicine, biology, epidemiology, public health, and demography. Frailty models provide a powerful tool to analyze clustered survival data. In contrast to the large number of research publications on frailty models, relatively few statistical software packages contain frailty models, and it is difficult for statistical practitioners and graduate students to understand frailty models from the existing literature. This book provides an in-depth discussion and explanation of the basics of frailty model methodology for such readers, including accelerated failure time models. Common techniques to fit frailty models include the EM algorithm, penalized likelihood techniques, Laplacian integration and Bayesian techniques. More advanced frailty models for hierarchical data are also included. Real-life examples are used to demonstrate how particular frailty models can be fitted and how the results should be interpreted; most of the accompanying programs, developed in the freeware packages R and WinBUGS, are available from the Springer website. The book starts with a brief overview of some basic concepts in classical survival analysis, collecting what is needed for the reading of the more complex frailty models.
This volume has its origin in the Seventeenth International Workshop on Maximum Entropy and Bayesian Methods, MAXENT 97. The workshop was held at Boise State University in Boise, Idaho, on August 4-8, 1997. As in the past, the purpose of the workshop was to bring together researchers in different fields to present papers on applications of Bayesian methods (these include maximum entropy) in science, engineering, medicine, economics, and many other disciplines. Thanks to significant theoretical advances and the personal computer, much progress has been made since our first Workshop in 1981. As indicated by several papers in these proceedings, the subject has matured to a stage in which computational algorithms are the objects of interest, the thrust being on feasibility, efficiency and innovation. Though applications are proliferating at a staggering rate, some in areas that hardly existed a decade ago, it is pleasing that due attention is still being paid to foundations of the subject. The following list of descriptors, applicable to papers in this volume, gives a sense of its contents: deconvolution, inverse problems, instrument (point-spread) function, model comparison, multisensor data fusion, image processing, tomography, reconstruction, deformable models, pattern recognition, classification and group analysis, segmentation/edge detection, brain shape, marginalization, algorithms, complexity, Ockham's razor as an inference tool, foundations of probability theory, symmetry, history of probability theory and computability. MAXENT 97 and these proceedings could not have been brought to final form without the support and help of a number of people.
This book comprises the presentations delivered at the 25th ICSA Applied Statistics Symposium held at the Hyatt Regency Atlanta, on June 12-15, 2016. The symposium attracted more than 700 statisticians and data scientists working in academia, government, and industry from all over the world. The theme of the conference was the "Challenge of Big Data and Applications of Statistics," in recognition of the advent of the big data era, and the symposium offered opportunities for learning, for drawing inspiration from old research ideas while developing new ones, and for promoting further research collaborations in the data sciences. The invited contributions addressed rich topics closely related to big data analysis in the data sciences, reflecting recent advances and major challenges in statistics, business statistics, and biostatistics. Subsequently, the six editors selected 19 high-quality presentations and invited the speakers to prepare full chapters for this book, which showcases new methods in statistics and data sciences, emerging theories, and case applications from statistics, data science and interdisciplinary fields. The topics covered in the book are timely and have great impact on data sciences, identifying important directions for future research, promoting advanced statistical methods in big data science, and facilitating future collaborations across disciplines and between theory and practice.
Testing for a Unit Root is now an essential part of time series analysis but the literature on the topic is so large that knowing where to start is difficult even for the specialist. This book provides a way into the techniques of unit root testing, explaining the pitfalls and nonstandard cases, using practical examples and simulation analysis.
By assuming it is possible to understand regression analysis without fully comprehending all its underlying proofs and theories, this introduction to the widely used statistical technique is accessible to readers who may have only a rudimentary knowledge of mathematics. Chapters discuss: descriptive statistics using vector notation and the components of a simple regression model;
'Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé.' ('And I, ..., had I known how to come back, I would never have gone there.') Jules Verne

'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' Eric T. Bell

'The series is divergent; therefore we may be able to do something with it.' O. Heaviside

Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and non-linearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quotes above, one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.