Probability Theory, the Theory of Random Processes and Mathematical Statistics are important areas of modern mathematics and its applications. They develop rigorous models for the proper treatment of various 'random' phenomena which we encounter in the real world, and they provide numerous tools for the analysis, prediction and, ultimately, control of random phenomena. Statistics itself helps with the choice of a proper mathematical model (e.g., by estimation of unknown parameters) on the basis of statistical data collected by observations. This volume is intended to be a concise textbook for a graduate-level course, with carefully selected topics representing the most important areas of modern Probability, Random Processes and Statistics. The first part (Ch. 1-3) can serve as a self-contained, elementary introduction to Probability, Random Processes and Statistics. It contains a number of relatively simple and typical examples of random phenomena which allow a natural introduction of general structures and methods. Only knowledge of elements of real/complex analysis, linear algebra and ordinary differential equations is required here. The second part (Ch. 4-6) provides a foundation of Stochastic Analysis, gives information on basic models of random processes and tools to study them. Here a familiarity with elements of functional analysis is necessary. Our intention to make this course fast-moving made it necessary to present important material in the form of examples.
On various examples ranging from geosciences to environmental sciences, this book explains how to generate an adequate description of uncertainty, how to justify semiheuristic algorithms for processing uncertainty, and how to make these algorithms more computationally efficient. It explains in what sense the existing approach to uncertainty as a combination of random and systematic components is only an approximation, presents a more adequate three-component model with an additional periodic error component, and explains how uncertainty propagation techniques can be extended to this model. The book provides a justification for a practically efficient heuristic technique (based on fuzzy decision-making). It explains how the computational complexity of uncertainty processing can be reduced. The book also shows how to take into account that in real life, the information about uncertainty is often only partially known, and, on several practical examples, explains how to extract the missing information about uncertainty from the available data.
The linear mixed model has become the main parametric tool for the analysis of continuous longitudinal data, as the authors discussed in their 2000 book. Without putting too much emphasis on software, the book shows how the different approaches can be implemented within the SAS software package. The authors received the American Statistical Association's Excellence in Continuing Education Award based on short courses on longitudinal and incomplete data at the Joint Statistical Meetings of 2002 and 2004.
Many of the concepts and terminology surrounding modern causal inference can be quite intimidating to the novice. Judea Pearl presents a book ideal for beginners in statistics, providing a comprehensive introduction to the field of causality. Examples from classical statistics are presented throughout to demonstrate the need for causality in resolving decision-making dilemmas posed by data. Causal methods are also compared to traditional statistical methods, whilst questions are provided at the end of each section to aid student learning.
Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why one should consider diving into more detailed treatments of computer-modeling and latent-variable techniques, such as non-recursive path analysis, confirmatory factor analysis, and hierarchical linear modeling.
The dynamics of population systems cannot be understood within the
framework of ordinary differential equations, which assume that the
number of interacting agents is infinite. With recent advances in
ecology, biochemistry and genetics it is becoming increasingly
clear that real systems are in fact subject to a great deal of
noise. Relevant examples include social insects competing for
resources, molecules undergoing chemical reactions in a cell and a
pool of genomes subject to evolution. When the population size is
small, novel macroscopic phenomena can arise, which can be analyzed
using the theory of stochastic processes. This thesis is centered
on two unsolved problems in population dynamics: the symmetry
breaking observed in foraging populations and the robustness of
spatial patterns. We argue that these problems can be resolved with
the help of two novel concepts: noise-induced bistable states and
stochastic patterns.
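Individual-based population models of this kind are typically simulated with Gillespie's stochastic simulation algorithm rather than ordinary differential equations. As a minimal illustrative sketch (the function name and the rates are invented here, not taken from the thesis), here is a birth-death process in Python:

```python
import random

def gillespie_birth_death(b, d, n0, t_max, seed=0):
    """Simulate a birth-death process X -> X+1 (rate b*n) and
    X -> X-1 (rate d*n) with the Gillespie algorithm."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    path = [(t, n)]
    while t < t_max and n > 0:
        birth, death = b * n, d * n
        total = birth + death
        t += rng.expovariate(total)       # waiting time to the next event
        if rng.random() < birth / total:  # which event fires?
            n += 1
        else:
            n -= 1
        path.append((t, n))
    return path
```

For small populations the sample paths fluctuate strongly around the deterministic prediction, which is exactly the regime where the noise-induced phenomena discussed above appear.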
This book provides an introduction to dynamical systems with multiple time scales. The approach it takes is to provide an overview of key areas, particularly topics that are less available in introductory form. The broad range of topics included makes it accessible for students and researchers new to the field to gain a quick and thorough overview. The first of its kind, this book merges a wide variety of different mathematical techniques into a more unified framework. The book is highly illustrated with many examples and exercises and an extensive bibliography. It is aimed at senior undergraduates and graduate students as well as researchers interested in using multiple time scale dynamics in nonlinear science, either from a theoretical or a mathematical modeling perspective.
Algorithmic Principles of Mathematical Programming investigates the
mathematical structures and principles underlying the design of
efficient algorithms for optimization problems. Recent advances in
algorithmic theory have shown that the traditionally separate areas
of discrete optimization, linear programming, and nonlinear
optimization are closely linked. This book offers a comprehensive
introduction to the whole subject and leads the reader to the
frontiers of current research. The prerequisites to use the book
are very elementary. All the tools from numerical linear algebra
and calculus are fully reviewed and developed. Rather than
attempting to be encyclopedic, the book illustrates the important
basic techniques with typical problems. The focus is on efficient
algorithms with respect to practical usefulness. Algorithmic
complexity theory is presented with the goal of helping the reader
understand the concepts without having to become a theoretical
specialist. Further theory is outlined and supplemented with
pointers to the relevant literature.
Advances in Stochastic Modelling and Data Analysis presents the most recent developments in the field, together with their applications, mainly in the areas of insurance, finance, forecasting and marketing. In addition, the possible interactions between data analysis, artificial intelligence, decision support systems and multicriteria analysis are examined by top researchers. Audience: A wide readership drawn from theoretical and applied mathematicians, such as operations researchers, management scientists, statisticians, computer scientists, bankers, marketing managers, forecasters, and scientific societies such as EURO and TIMS.
Stochastic processes are mathematical models of random phenomena that evolve according to prescribed dynamics. Processes commonly used in applications are Markov chains in discrete and continuous time, renewal and regenerative processes, Poisson processes, and Brownian motion. This volume gives an in-depth description of the structure and basic properties of these stochastic processes. A main focus is on equilibrium distributions, strong laws of large numbers, and ordinary and functional central limit theorems for cost and performance parameters. Although these results differ for various processes, they have a common trait of being limit theorems for processes with regenerative increments. Extensive examples and exercises show how to formulate stochastic models of systems as functions of a system's data and dynamics, and how to represent and analyze cost and performance measures. Topics include stochastic networks, spatial and space-time Poisson processes, queueing, reversible processes, simulation, Brownian approximations, and varied Markovian models. The technical level of the volume is between that of introductory texts that focus on highlights of applied stochastic processes, and advanced texts that focus on theoretical aspects of processes.
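To illustrate one of the topics listed above, the equilibrium distribution of a finite Markov chain is the vector pi satisfying pi = pi P, and it can be approximated by power iteration. A minimal sketch (the transition matrix here is invented purely for illustration):

```python
import numpy as np

# Transition matrix of a 3-state Markov chain (each row sums to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.7, 0.2],
              [0.2, 0.3, 0.5]])

def stationary(P, iters=500):
    """Approximate the equilibrium distribution pi = pi P by power iteration."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])  # start from the uniform distribution
    for _ in range(iters):
        pi = pi @ P
    return pi

pi = stationary(P)
```

Because the chain is irreducible and aperiodic, the iteration converges to the unique stationary distribution regardless of the starting vector.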
This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.
The focus of this book is on bilevel programming which combines elements of hierarchical optimization and game theory. The basic model addresses the problem where two decision-makers, each with their individual objectives, act and react in a noncooperative manner. The actions of one affect the choices and payoffs available to the other but neither player can completely dominate the other in the traditional sense. Over the last 20 years there has been a steady growth in research related to theory and solution methodologies for bilevel programming. This interest stems from the inherent complexity and consequent challenge of the underlying mathematics, as well as the applicability of the bilevel model to many real-world situations. The primary aim of this book is to provide a historical perspective on algorithmic development and to highlight those implementations that have proved to be the most efficient in their class. A corollary aim is to provide a sampling of applications in order to demonstrate the versatility of the basic model and the limitations of current technology. What is unique about this book is its comprehensive and integrated treatment of theory, algorithms and implementation issues. It is the first text that offers researchers and practitioners an elementary understanding of how to solve bilevel programs and a perspective on what success has been achieved in the field. Audience: Includes management scientists, operations researchers, industrial engineers, mathematicians and economists.
It appears that we live in an age of disasters: the mighty Mississippi and Missouri flood millions of acres, earthquakes hit Tokyo and California, airplanes crash due to mechanical failure and the seemingly ever increasing wind speeds make the storms more and more frightening. While all these may seem to be unexpected phenomena to the man on the street, they are actually happening according to well defined rules of science known as extreme value theory. We know that records must be broken in the future, so if a flood design is based on the worst case of the past then we are not really prepared against floods. Materials will fail due to fatigue, so if the body of an aircraft looks fine to the naked eye, it might still suddenly fail if the aircraft has been in operation over an extended period of time. Our theory has by now penetrated the social sciences, the medical profession, economics and even astronomy. We believe that our field has come of age. In order to fully utilize the great progress in the theory of extremes and its ever increasing acceptance in practice, an international conference was organized in which equal weight was given to theory and practice. This book is Volume I of the Proceedings of this conference. In selecting the papers for Volume I our guide was to have authoritative works with a large variety of coverage of both theory and practice.
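Flood-design reasoning of the kind described above is usually phrased in terms of block maxima and return levels: fit an extreme value distribution to the yearly maxima and read off the level exceeded, on average, once per N years. A hedged sketch using synthetic data and SciPy's Gumbel distribution (the "daily readings" scenario is invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic "flood levels": 100 years of 365 daily exponential readings.
daily = rng.exponential(scale=1.0, size=(100, 365))
annual_max = daily.max(axis=1)  # block maxima, one per year

# Fit a Gumbel distribution to the block maxima and estimate the
# 100-year return level (the quantile exceeded once per 100 years).
loc, scale = stats.gumbel_r.fit(annual_max)
return_level_100 = stats.gumbel_r.ppf(1 - 1 / 100, loc=loc, scale=scale)
```

The point made in the blurb shows up directly: the estimated 100-year return level lies well above every maximum seen in a typical past record, so designing only for the historical worst case underprotects.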
The book is a collection of essays on various issues in philosophy of science, with special emphasis on the foundations of probability and statistics, and quantum mechanics. The main topics, addressed by some of the most outstanding researchers in the field, are subjective probability, Bayesian statistics, probability kinematics, causal decision making, probability and realism in quantum mechanics.
Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. This book is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. This book serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to statistical methods and a theoretical linear models course. This book emphasizes the concepts and the analysis of data sets. It provides a review of the key concepts in simple linear regression, matrix operations, and multiple regression. Methods and criteria for selecting regression variables and geometric interpretations are discussed. Polynomial, trigonometric, analysis of variance, nonlinear, time series, logistic, random effects, and mixed effects models are also discussed. Detailed case studies and exercises based on real data sets are used to reinforce the concepts. John O. Rawlings, Professor Emeritus in the Department of Statistics at North Carolina State University, retired after 34 years of teaching, consulting, and research in statistical methods. He was instrumental in developing, and for many years taught, the course on which this text is based. He is a Fellow of the American Statistical Association and the Crop Science Society of America. Sastry G. Pantula is Professor and Director of Graduate Programs in the Department of Statistics at North Carolina State University. He is a member of the Academy of Outstanding Teachers at North Carolina State University. David A. Dickey is Professor of Statistics at North Carolina State University. He is a member of the Academy of Outstanding Teachers at North Carolina State University.
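The simple linear regression reviewed in books like this reduces to solving a least squares problem for an intercept and a slope. A minimal sketch on synthetic data (the data and parameter values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3, n)  # true intercept 2.0, slope 0.5

# Design matrix with an intercept column; np.linalg.lstsq solves the
# least squares problem in a numerically stable way.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta
```

With a few hundred observations and modest noise, the estimates land close to the true coefficients, which is the basic consistency property the text builds on.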
This prospective book discusses conceptual and pragmatic issues in the assessment of statistical knowledge and reasoning skills and the use of assessments to improve instruction among students at college and pre-college levels. It is designed primarily for academic audiences involved in teaching statistics and mathematics, and in teacher education and training. The book is divided into four sections: (1) Assessment goals and frameworks, (2) Assessing conceptual understanding of statistical ideas, (3) Innovative models for classroom assessments, and (4) Assessing understanding of probability. Both editors are involved in assessment issues in statistics. The book is written by leading researchers, statistics and mathematics educators, and curriculum developers.
* E-statistics provides powerful methods to deal with problems in multivariate inference and analysis
* Methods are implemented in R, and readers can immediately apply them using the freely available energy package for R
* Provides an overview of the existing state of the art in the development of energy statistics and an overview of applications
* Background and literature review is valuable for anyone considering further research or application in energy statistics
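The book's methods are implemented in the R energy package; purely as a language-neutral illustration of the central quantity, here is a sketch of the sample energy distance between two one-dimensional samples (the helper name is ours, not the package's API):

```python
import numpy as np

def energy_distance(x, y):
    """Sample energy distance between two 1-D samples:
    2 E|X-Y| - E|X-X'| - E|Y-Y'|; non-negative, and zero
    if and only if the two distributions coincide."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    d_xy = np.abs(x[:, None] - y[None, :]).mean()
    d_xx = np.abs(x[:, None] - x[None, :]).mean()
    d_yy = np.abs(y[:, None] - y[None, :]).mean()
    return 2 * d_xy - d_xx - d_yy
```

Samples drawn from the same distribution give a value near zero, while shifting one sample's mean drives the statistic up, which is what makes it usable as a test statistic for equality of distributions.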
A treatment of estimating unknown parameters, testing hypotheses and estimating confidence intervals in linear models. Readers will find here presentations of the Gauss-Markoff model, the analysis of variance, the multivariate model, the model with unknown variance and covariance components and the regression model as well as the mixed model for estimating random parameters. A chapter on the robust estimation of parameters and several examples have been added to this second edition. The necessary theorems of vector and matrix algebra and the probability distributions of test statistics are derived so as to make this book self-contained. Geodesy students as well as those in the natural sciences and engineering will find the emphasis on the geodetic application of statistical models extremely useful.
This book is an introduction to optimal stochastic control for continuous time Markov processes and the theory of viscosity solutions. It covers dynamic programming for deterministic optimal control problems, as well as the corresponding theory of viscosity solutions. New chapters in this second edition introduce the role of stochastic optimal control in portfolio optimization and in pricing derivatives in incomplete markets and two-controller, zero-sum differential games.
* Illustrate the usefulness and abuse of statistics in automotive safety arguments that are hidden from public view
* Show the importance of data and statistics in making safety-related changes that ultimately save or cost lives
* Discuss the impact of electric vehicles and autonomous vehicles on car safety
The year 2000 is the centenary year of the publication of Bachelier's thesis which - together with Harry Markowitz's Ph.D. dissertation on portfolio selection in 1952 and Fischer Black's and Myron Scholes' solution of an option pricing problem in 1973 - is considered the starting point of modern finance as a mathematical discipline. On this remarkable anniversary the workshop on mathematical finance held at the University of Konstanz brought together practitioners, economists and mathematicians to discuss the state of the art. Apart from contributions to the known discrete, Brownian, and Lévy process models, first attempts to describe a market in a reasonable way by a fractional Brownian motion model are presented, opening many new aspects for practitioners and new problems for mathematicians. As most dynamical financial problems are stochastic filtering or control problems, many talks presented adaptations of control methods and techniques to the classical financial problems of:
* portfolio selection
* irreversible investment
* risk sensitive asset allocation
* capital asset pricing
* hedging contingent claims
* option pricing
* interest rate theory
The contributions of practitioners link the theoretical results to the steadily increasing flow of real world problems from financial institutions into mathematical laboratories. The present volume reflects this exchange of theoretical and applied results, methods and techniques that made the workshop a fruitful contribution to the interdisciplinary work in mathematical finance.
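The 1973 option pricing problem of Black and Scholes mentioned above has a well-known closed-form solution for a European call; a minimal sketch under the standard assumptions (no dividends, constant volatility and interest rate):

```python
from math import log, sqrt, exp
from statistics import NormalDist

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call option.
    S: spot price, K: strike, r: risk-free rate,
    sigma: volatility, T: time to maturity in years."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)
```

For example, an at-the-money call with S = K = 100, r = 5%, sigma = 20% and one year to maturity prices at about 10.45.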
The aim of the book is to give a thorough account of the basic theory of extreme value distributions. The book covers a wide range of material available to date and presents the central ideas and results of extreme value distributions. It will be useful to applied statisticians as well as statisticians interested in working in the area of extreme value distributions. The monograph gives a self-contained account of the theory and applications of extreme value distributions.
This text contains 300 problems in mathematical statistics, together with detailed solutions.
This volume is devoted to the development of an algebraic model of databases. The first chapter presents a general introduction. The following sixteen chapters are divided into three main parts. Part I deals with various aspects of universal algebra. The chapters of Part I discuss topics such as sets, algebras and models, fundamental structures, categories, the category of sets, topoi, fuzzy sets, varieties of algebras, axiomatic classes, category algebra and algebraic theories. Part II deals with different approaches to the algebraization of predicate calculus. This material is intended to be applied chiefly to databases, although some discussion of pure algebraic applications is also given. Discussed here are topics such as Boolean algebras and propositional calculus, Halmos algebras and predicate calculus, connections with model theory, and the categorial approach to algebraic logic. Part III is concerned specifically with the algebraic model of databases, which considers the database as an algebraic structure. Topics dealt with in this part are the algebraic aspects of databases, their equivalence and restructuring, symmetries and the Galois theory of databases, and constructions in database theory. The volume closes with a discussion and conclusions, and an extensive bibliography. For mathematicians, computer scientists and database engineers, with an interest in applications of algebra and logic.