This new edition is a concise introduction to the basic methods of computational physics. Readers will discover the benefits of numerical methods for solving complex mathematical problems and for the direct simulation of physical processes. The book is divided into two main parts: deterministic and stochastic methods in computational physics. Based on concrete problems, the first part discusses numerical differentiation and integration as well as the treatment of ordinary differential equations. This is extended by a brief introduction to the numerics of partial differential equations. The second part deals with the generation of random numbers, summarizes the basics of stochastics, and subsequently introduces Monte Carlo (MC) methods. Specific emphasis is placed on Markov chain MC algorithms. The final two chapters discuss data analysis and stochastic optimization. All this is again motivated and augmented by applications from physics. In addition, the book offers a number of appendices to provide the reader with information on topics not discussed in the main text. Numerous problems with worked-out solutions, chapter introductions and summaries, together with a clear and application-oriented style, support the reader. Ready-to-use C++ codes are provided online.
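As a taste of the Markov chain MC methods the description highlights, here is a minimal random-walk Metropolis sampler sketched in Python. It is illustrative only (the book's ready-to-use codes are in C++), and all names and parameters here are invented.

```python
import math
import random

def metropolis(log_p, x0, steps=10_000, scale=1.0):
    """Random-walk Metropolis: propose x' = x + N(0, scale), accept w.p. min(1, p'/p)."""
    x, lp = x0, log_p(x0)
    samples = []
    for _ in range(steps):
        y = x + random.gauss(0.0, scale)              # symmetric proposal
        lq = log_p(y)
        if random.random() < math.exp(min(0.0, lq - lp)):
            x, lp = y, lq                             # accept the move
        samples.append(x)                             # on rejection, keep the current state
    return samples

# Toy usage: sample a standard normal (log-density up to a constant)
# and check that the empirical variance is near 1.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
print(sum(d * d for d in draws) / len(draws))
```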
This book presents state-of-the-art solution methods and applications of stochastic optimal control. It is a collection of extended papers discussed at the traditional Liverpool workshop on controlled stochastic processes, with participants from both the East and the West. New problems are formulated, and progress in ongoing research is reported. Topics covered in this book include theoretical results and numerical methods for Markov and semi-Markov decision processes, optimal stopping of Markov processes, stochastic games, problems with partial information, optimal filtering, robust control, Q-learning, and self-organizing algorithms. Real-life case studies and applications, e.g., queueing systems, forest management, control of water resources, marketing science, and healthcare, are presented. Scientific researchers and postgraduate students interested in stochastic optimal control, as well as practitioners, will find this book appealing and a valuable reference.
Since the early eighties, Ali Suleyman Ustunel has been one of the main contributors to the field of Malliavin calculus. At a workshop held in Paris in June 2010, several prominent researchers gave exciting talks in honor of his 60th birthday. The present volume includes scientific contributions from this workshop.
This book compiles and presents new developments in statistical causal inference. The accompanying data and computer programs are publicly available so readers may replicate the model development and data analysis presented in each chapter. In this way, methodology is taught so that readers may implement it directly. The book brings together experts engaged in causal inference research to present and discuss recent issues in causal inference methodological development. This is also a timely look at causal inference applied to scenarios that range from clinical trials to mediation and public health research more broadly. In an academic setting, this book will serve as a reference and guide to a course in causal inference at the graduate level (Master's or Doctorate). It is particularly relevant for students pursuing degrees in statistics, biostatistics, and computational biology. Researchers and data analysts in public health and biomedical research will also find this book to be an important reference.
A ground-breaking and practical treatment of probability and stochastic processes, "A Modern Theory of Random Variation" is a new and radical re-formulation of the mathematical underpinnings of subjects as diverse as investment, communication engineering, and quantum mechanics. Setting aside the classical theory of probability measure spaces, the book utilizes a mathematically rigorous version of the theory of random variation that bases itself exclusively on finitely additive probability distribution functions. In place of twentieth-century Lebesgue integration and measure theory, the author uses the simpler concept of Riemann sums and the non-absolute Riemann-type integration of Henstock. Readers are supplied with an accessible approach to standard elements of probability theory such as the central limit theorem and Brownian motion, as well as remarkable new results on Feynman diagrams and stochastic integrals. Throughout the book, detailed numerical demonstrations accompany the discussions of abstract mathematical theory, from the simplest elements of the subject to the most complex. In addition, an array of numerical examples and vivid illustrations showcase how the presented methods and applications can be undertaken at various levels of complexity. "A Modern Theory of Random Variation" is a suitable book for courses on mathematical analysis, probability theory, and mathematical finance at the upper-undergraduate and graduate levels. The book is also an indispensable resource for researchers and practitioners who are seeking new concepts, techniques and methodologies in data analysis, numerical calculation, and financial asset valuation. Patrick Muldowney, PhD, served as lecturer at the Magee Business School of the University of Ulster for over twenty years. Dr. Muldowney has published extensively in his areas of research, including integration theory, financial mathematics, and random variation.
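To make the Riemann-sum idea concrete, here is the generic shape of such a definition (a standard formulation, not an excerpt from the book): over a tagged partition of the range of X, the expectation of f(X) is approximated using only increments of the distribution function F.

```latex
\[
  \mathrm{E}\bigl[f(X)\bigr] \;\approx\; \sum_{j=1}^{n} f(x_j)\,\bigl(F(u_j) - F(u_{j-1})\bigr),
  \qquad a = u_0 < u_1 < \dots < u_n = b,\quad x_j \in [u_{j-1}, u_j],
\]
```

with the Henstock-type integral obtained as the limit of such sums over suitably fine tagged partitions; only the finitely additive increments F(u_j) - F(u_{j-1}) are needed, with no reference to a measure.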
This book presents a unique study of Integrative Problem-Solving (IPS). The consideration of 'Decadence' is essential in the scientific study of environmental and other problems and their rigorous solution, because the broad context within which the problems emerge can affect their solution. Stochastic reasoning underlies the conceptual and methodological framework of IPS, and its formulation has a mathematical life of its own that accounts for the multidisciplinarity of real-world problems, the multisourced uncertainties characterizing their solution, and the different thinking modes of the people involved. Only by interpolating between the full range of disciplines (including stochastic mathematics, physical science, neuropsychology, philosophy, and sociology) and the associated thinking modes can scientists arrive at a satisfactory account of problem-solving, and be able to distinguish between a technically complete problem-solution and a solution that has social impact.
This book focuses on the application and development of information geometric methods in the analysis, classification and retrieval of images and signals. It provides introductory chapters to help those new to information geometry and applies the theory to several applications. This area has developed rapidly over recent years, propelled by the major theoretical developments in information geometry, efficient data and image acquisition and the desire to process and interpret large databases of digital information. The book addresses both the transfer of methodology to practitioners involved in database analysis and in its efficient computational implementation.
This book presents various recently developed and traditional statistical techniques, which are increasingly being applied in social science research. The social sciences cover diverse phenomena arising in society, the economy and the environment, some of which are too complex to allow concrete statements; some cannot be defined by direct observations or measurements; some are culture- (or region-) specific, while others are generic and common. Statistics, being a scientific method - as distinct from a 'science' related to any one type of phenomenon - is used to make inductive inferences regarding various phenomena. The book addresses both qualitative and quantitative research (a combination of which is essential in social science research) and offers valuable supplementary reading at an advanced level for researchers.
This proceedings book highlights the latest research and developments in psychometrics and statistics. Featuring contributions presented at the 82nd Annual Meeting of the Psychometric Society (IMPS), organized by the University of Zurich and held in Zurich, Switzerland from July 17 to 21, 2017, its 34 chapters address a diverse range of psychometric topics including item response theory, factor analysis, causal inference, Bayesian statistics, test equating, cognitive diagnostic models and multistage adaptive testing. The IMPS is one of the largest international meetings on quantitative measurement in psychology, education and the social sciences, attracting over 500 participants and 250 paper presentations from around the world every year. This book gathers the contributions of selected presenters, which were subsequently expanded and peer-reviewed.
This book provides a general framework for learning sparse graphical models with conditional independence tests. It includes complete treatments for Gaussian, Poisson, multinomial, and mixed data; unified treatments for covariate adjustments, data integration, and network comparison; unified treatments for missing data and heterogeneous data; efficient methods for joint estimation of multiple graphical models; effective methods of high-dimensional variable selection; and effective methods of high-dimensional inference. The methods possess an embarrassingly parallel structure in performing conditional independence tests, and the computation can be significantly accelerated by running in parallel on a multi-core computer or a parallel architecture. This book is intended to serve researchers and scientists interested in high-dimensional statistics, and graduate students in broad data science disciplines. Key features:
- A general framework for learning sparse graphical models with conditional independence tests
- Complete treatments for different types of data: Gaussian, Poisson, multinomial, and mixed
- Unified treatments for data integration, network comparison, and covariate adjustment
- Unified treatments for missing data and heterogeneous data
- Efficient methods for joint estimation of multiple graphical models
- Effective methods of high-dimensional variable selection
- Effective methods of high-dimensional inference
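The "embarrassingly parallel" claim is easy to picture: each candidate edge gets its own independence test, so the tests can be farmed out to worker processes. The Python sketch below illustrates the pattern with a simple Fisher-z correlation test as a stand-in; it is not the book's method, and all names, data, and thresholds here are invented.

```python
import math
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

import numpy as np

def fisher_z_pvalue(args):
    """Two-sided p-value for H0: corr(X_i, X_j) = 0 via Fisher's z-transform."""
    data, i, j = args
    n = data.shape[0]
    r = np.corrcoef(data[:, i], data[:, j])[0, 1]
    z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n - 3)
    return (i, j), math.erfc(abs(z) / math.sqrt(2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))           # toy data: 10 variables
    edges = list(combinations(range(10), 2))
    with ProcessPoolExecutor() as pool:          # one test per edge, run in parallel
        results = dict(pool.map(fisher_z_pvalue, [(X, i, j) for i, j in edges]))
    graph = [e for e, p in results.items() if p < 0.01]
    print(graph)                                 # edges surviving the tests
```

Because no test depends on another's outcome, the speedup scales with the number of cores, which is the point the description makes.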
In real-life decision-making situations it is necessary to make decisions with incomplete information and oftentimes uncertain results. In "Decision-Making Under Uncertainty," Dr. Chacko applies his years of statistical research and experience to the analysis of twenty-four real-life decision-making situations, both those with few data points (e.g., the Cuban Missile Crisis) and those with many data points (e.g., aspirin for heart attack prevention). These situations encompass decision-making in a variety of business, social and political, physical and biological, and military environments. Though different, all of them have one characteristic in common: their outcomes are uncertain, unknown, and unknowable. Chacko demonstrates how the decision-maker can reduce uncertainty by choosing probable outcomes using the statistical methods he introduces. This detailed volume develops standard statistical concepts (t, chi-squared, the normal distribution, ANOVA) and less familiar ones (logical probability, subjective probability, Bayesian inference, Penalty for Non-Fulfillment, the Bluff-Threats Matrix, etc.). Chacko also offers a thorough discussion of the underlying theoretical principles. Each chapter ends with a set of questions, three quarters of which focus on concepts, formulation, conclusions, resource commitments, and caveats, and only one quarter on computations. Ideal for the practitioner, the work is also designed to serve as the primary text for graduate or advanced undergraduate courses in statistics and decision science.
This book presents the proceedings of the international conference Particle Systems and Partial Differential Equations I, which took place at the Centre of Mathematics of the University of Minho, Braga, Portugal, from the 5th to the 7th of December, 2012. The purpose of the conference was to bring together world leaders to discuss their topics of expertise and to present some of their latest research developments in those fields. Among the participants were researchers in probability, partial differential equations and kinetic theory. The aim of the meeting was to present to a varied public the subject of interacting particle systems, its motivation from the viewpoint of physics and its relation with partial differential equations or kinetic theory, and to stimulate discussions and possibly new collaborations among researchers with different backgrounds. The book contains lecture notes written by Francois Golse on the derivation of hydrodynamic equations (compressible and incompressible Euler and Navier-Stokes) from the Boltzmann equation, and several short papers written by some of the participants in the conference. Among the topics covered by the short papers are hydrodynamic limits; fluctuations; phase transitions; motions of shocks and antishocks in exclusion processes; large number asymptotics for systems with self-consistent coupling; quasi-variational inequalities; unique continuation properties for PDEs; and others. The book will benefit probabilists, analysts and mathematicians who are interested in statistical physics, stochastic processes, partial differential equations and kinetic theory, along with physicists.
This book focuses on three core knowledge requirements for effective and thorough data analysis for solving business problems. These are a foundational understanding of: 1. statistical, econometric, and machine learning techniques; 2. data handling capabilities; and 3. at least one programming language. Practical in orientation, the volume offers illustrative case studies throughout and examples using Python in the context of Jupyter notebooks. Covered topics include demand measurement and forecasting, predictive modeling, pricing analytics, customer satisfaction assessment, market and advertising research, and new product development and research. This volume will be useful to business data analysts, data scientists, and market research professionals, as well as aspiring practitioners in business data analytics. It can also be used in colleges and universities offering courses and certifications in business data analytics, data science, and market research.
Statistics is one of the most practical and essential courses that you will take, and a primary goal of this popular text is to make the task of learning statistics as simple as possible. Straightforward instruction, built-in learning aids, and real-world examples have made STATISTICS FOR THE BEHAVIORAL SCIENCES, 10th Edition, the text selected most often by instructors for their students in the behavioral and social sciences. The authors provide a conceptual context that makes it easier to learn formulas and procedures, explaining why procedures were developed and when they should be used. This text will also instill the basic principles of objectivity and logic that are essential for science and valuable in everyday life, making it a useful reference long after you complete the course.
Various general techniques have been developed for control and systems problems, many of which involve indirect methods. Because these indirect methods are not always effective, alternative approaches using direct methods are of particular interest and relevance given the advances of computing in recent years. The focus of this book, unique in the literature, is on direct methods, which are concerned with finding actual solutions to problems in control and systems, often algorithmic in nature. Throughout the work, deterministic and stochastic problems are examined from a unified perspective and with considerable rigor. Emphasis is placed on the theoretical basis of the methods and their potential utility in a broad range of control and systems problems. The book is an excellent reference for graduate students, researchers, applied mathematicians, and control engineers and may be used as a textbook for a graduate course or seminar on direct methods in control.
Aside from distribution theory, projections and the singular value decomposition (SVD) are the two most important concepts for understanding the basic mechanism of multivariate analysis. The former underlies the least squares estimation in regression analysis, which is essentially a projection of one subspace onto another, and the latter underlies principal component analysis, which seeks to find a subspace that captures the largest variability in the original space. This book is about projections and SVD. A thorough discussion of generalized inverse (g-inverse) matrices is also given because they are closely related to projections. The book provides systematic and in-depth accounts of these concepts from a unified viewpoint of linear transformations on finite-dimensional vector spaces. More specifically, it shows that projection matrices (projectors) and g-inverse matrices can be defined in various ways so that a vector space is decomposed into a direct sum of (disjoint) subspaces. Projection Matrices, Generalized Inverse Matrices, and Singular Value Decomposition will be useful for researchers, practitioners, and students in applied mathematics, statistics, engineering, behaviormetrics, and other fields.
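A minimal numpy sketch (illustrative, not taken from the book) of the two mechanisms the description pairs: least squares as an orthogonal projection onto a column space, and the SVD underlying principal component analysis. All data and names here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))             # design matrix, full column rank
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(50)

# Projection matrix onto the column space of X: P = X (X'X)^{-1} X'.
P = X @ np.linalg.inv(X.T @ X) @ X.T
y_hat = P @ y                                 # fitted values = projection of y
assert np.allclose(P @ P, P)                  # idempotent, as a projector must be

# SVD: X = U S V'. Columns of V are the principal directions of X;
# the singular values measure the variability each direction captures.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt.T                             # principal component scores
print(S)
```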
In this thesis, the author develops numerical techniques for tracking and characterising the convoluted nodal lines in three-dimensional space, analysing their geometry on the small scale as well as their global fractality and topological complexity, including knotting, on the large scale. The work is highly visual, and illustrated with many beautiful diagrams revealing this unanticipated aspect of the physics of waves. Linear superpositions of waves create interference patterns, which means that in some places they strengthen one another, while in others they completely cancel each other out. This latter phenomenon occurs on 'vortex lines' in three dimensions. In general wave superpositions modelling, e.g., chaotic cavity modes, these vortex lines form dense tangles that have never been visualised on the large scale before and cannot be analysed mathematically by any known techniques.
This book offers a practical guide to agent-based economic modeling, adopting a "learning by doing" approach to help the reader master the fundamental tools needed to create and analyze agent-based models. After providing the reader with a basic "toolkit" for agent-based modeling, it presents and discusses didactic models of real financial and economic systems in detail. While stressing the main features and advantages of the bottom-up perspective inherent to this approach, the book also highlights the logic and practical steps that characterize the model-building procedure. A detailed description of the underlying codes, developed using R and C, is also provided. In addition, each didactic model is accompanied by exercises and applications designed to promote active learning on the part of the reader. Following the same approach, the book also presents several complementary tools required for the analysis and validation of the models, such as sensitivity experiments, calibration exercises, and economic network and statistical distribution analysis. By the end of the book, the reader will have gained a deeper understanding of the agent-based methodology and be prepared to use the fundamental techniques required to start developing their own economic models. Accordingly, "Economics with Heterogeneous Interacting Agents" will be of particular interest to graduate and postgraduate students, as well as to academic institutions and lecturers interested in including an overview of the AB approach to economic modeling in their courses.
This book presents selected peer-reviewed contributions from the International Work-Conference on Time Series, ITISE 2017, held in Granada, Spain, September 18-20, 2017. It discusses topics in time series analysis and forecasting, including advanced mathematical methodology, computational intelligence methods for time series, dimensionality reduction and similarity measures, econometric models, energy time series forecasting, forecasting in real problems, online learning in time series as well as high-dimensional and complex/big data time series. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
Hereditary systems (systems with delay or after-effects) are widely used to model processes in physics, mechanics, control, economics, and biology. An important element in their study is their stability. Stability conditions for difference equations with delay can be obtained using a Lyapunov functional.
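As an illustration of this last point, a standard textbook construction (not necessarily the one used in the book) for a scalar difference equation with delay k runs as follows:

```latex
\[
  x_{n+1} = a\,x_n + b\,x_{n-k},
  \qquad
  V_n = x_n^2 + |b| \sum_{j=n-k}^{n-1} x_j^2 ,
\]
\[
  \Delta V_n = V_{n+1} - V_n
  \;\le\; \bigl(|a| + |b| - 1\bigr)\Bigl[(1 + |a|)\,x_n^2 + |b|\,x_{n-k}^2\Bigr],
\]
```

so when |a| + |b| < 1 the functional decreases along solutions, and the zero solution is asymptotically stable.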
Modern Actuarial Risk Theory contains what every actuary needs to know about non-life insurance mathematics. It starts with standard material like utility theory, the individual and collective model, and basic ruin theory. Other topics are risk measures and premium principles, bonus-malus systems, ordering of risks, and credibility theory. It also contains some chapters about Generalized Linear Models, applied to rating and IBNR problems. As to the level of the mathematics, the book would fit in a bachelor's or master's program in quantitative economics or mathematical statistics. This second and much expanded edition emphasizes the implementation of these techniques through the use of R. This free but incredibly powerful software is rapidly developing into the de facto standard for statistical computation, not just in academic circles but also in practice. With R, one can do simulations, find maximum likelihood estimators, compute distributions by inverting transforms, and much more.
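The book's computations are done in R; purely to give the flavor of the collective model it mentions, here is the same kind of Monte Carlo exercise sketched in Python, with all parameters invented.

```python
import math
import random

def poisson(lam):
    """Knuth's method: multiply uniforms until the product falls below e^(-lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def aggregate_claims(lam=3.0, mean_claim=100.0):
    """One draw of S = X_1 + ... + X_N with N ~ Poisson(lam), X_i ~ Exp(mean_claim)."""
    return sum(random.expovariate(1.0 / mean_claim) for _ in range(poisson(lam)))

# Estimate P(S > premium) for a premium of expected claims plus a 20% loading.
draws = [aggregate_claims() for _ in range(100_000)]
premium = 1.2 * 3.0 * 100.0
print(sum(s > premium for s in draws) / len(draws))
```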
This book is a useful overview of results in multivariate probability distributions and multivariate analysis as well as a reference to harmonic analysis on symmetric cones adapted to the needs of researchers in analysis and probability theory.
This book is a tribute to Professor Pedro Gil, who created the Department of Statistics, OR and TM at the University of Oviedo and was a former President of the Spanish Society of Statistics and OR (SEIO). In more than eighty original contributions, it illustrates the extent to which Mathematics can help manage uncertainty, a factor that is inherent to real life. Today it goes without saying that, in order to model experiments and systems and to analyze related outcomes and data, it is necessary to consider formal ideas and develop scientific approaches and techniques for dealing with uncertainty. Mathematics is crucial in this endeavor, as this book demonstrates. As Professor Pedro Gil highlighted twenty years ago, there are several well-known mathematical branches for this purpose, including the Mathematics of chance (Probability and Statistics), the Mathematics of communication (Information Theory), and the Mathematics of imprecision (Fuzzy Sets Theory and others). These branches often intertwine, since different sources of uncertainty can coexist, and they are not exhaustive. While most of the papers presented here address the three aforementioned fields, some hail from other Mathematical disciplines such as Operations Research; others, in turn, put the spotlight on real-world studies and applications. The intended audience of this book is mainly statisticians, mathematicians and computer scientists, but practitioners in these areas will certainly also find the book a very interesting read.
This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists, and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website, and will become an essential grounding in this approach for students and research ecologists.
Optimization theory is an active area of research with numerous applications; many books on the subject are designed for engineering classes and thus emphasize problems from such fields. Covering much of the same material, this text places less emphasis on coding and detailed applications, as the intended audience is more mathematical. Several important problems are still discussed (especially scheduling problems), but there is more emphasis on theory and less on the nuts and bolts of coding. A constant theme of the text is the "why" and the "how" of the subject. Why are we able to do a calculation efficiently? How should we look at a problem? Extensive effort is made to motivate the mathematics and isolate how one can apply ideas and perspectives to a variety of problems. As many of the key algorithms in the subject require too much time or detail to analyze in a first course (such as the run time of the Simplex Algorithm), there are numerous comparisons to simpler algorithms which students have either seen or can quickly learn (such as the Euclidean algorithm) to motivate the type of results on run-time savings.
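The Euclidean algorithm mentioned above is a good example of the kind of run-time reasoning involved: every two remainder steps at least halve the smaller argument, so the loop below finishes in O(log min(a, b)) iterations, which is exactly the "why is this fast?" argument the text builds on.

```python
def gcd(a, b):
    """Greatest common divisor via the Euclidean algorithm (repeated remainders)."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21, found in a handful of steps
```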