The Structural Theory of Probability addresses the interpretation of probability, often debated in the scientific community. This problem has been examined for centuries; perhaps no other mathematical calculation suffuses mankind's efforts at survival as amply as probability. At the dawn of the 20th century David Hilbert included the foundations of the probability calculus among the most vital mathematical problems; Dr. Rocchi's topical and timely volume proposes a novel, exhaustive solution to this vibrant issue. Paolo Rocchi, a versatile IBM scientist, outlines a new philosophical and mathematical approach inspired by well-tested software techniques. Through the prism of computer technology he provides an innovative view of the theory of probability. Dr. Rocchi discusses in detail the mathematical tools used to clarify the meaning of probability, integrating with care numerous examples and case studies. The comprehensiveness and originality of its mathematical development make this volume an inspiring read for researchers and students alike. From a review by the Mathematical Association of America Online: "[The author's] basic thesis is this: Probability theory from Pascal to Kolmogorov and onwards has focused on events as sets of outcomes or results, and probability as a measure attached to these sets. But this ignores the structure of the processes which lead to the outcomes, and the author explores how taking into account the details of the processes would lead to a more fundamental understanding of the nature of probability. This is an interesting idea, and the author makes it clear that at present this is a work in process and not yet a finished product, for he says that he has tried to give "an impulse in the right direction" with his theory. ... One hopes that in due course the author will develop his theories further and present overwhelmingly persuasive examples of the advantages of his approach." - Ramachandran Bharath
Written by one of the masters of the foundations of measurement, Louis Narens' new book thoroughly examines the basis for the measurement-theoretic concept of meaningfulness and presents a new theory about the role of numbers and invariance in science. The book associates with each portion of mathematical science a subject matter that the portion of science is intended to investigate or describe. It considers those quantitative or empirical assertions and relationships that belong to the subject matter to be meaningful (for that portion of science) and those that do not belong to be meaningless.
This book focuses on the meaning of statistical inference and estimation. Statistical inference is concerned with the problems of estimating population parameters and testing hypotheses. Primarily aimed at undergraduate and postgraduate students of statistics, the book is also useful to professionals and researchers in statistical, medical, social and other disciplines. It discusses current methodological techniques used in statistics and related interdisciplinary areas. Every concept is supported with relevant research examples to help readers find the most suitable application. Statistical tools are presented using real-life examples, removing the "fear factor" usually associated with this complex subject. The book helps readers discover diverse perspectives of statistical theory, followed by relevant worked-out examples. Keeping in mind the needs of readers, as well as constantly changing scenarios, the material is presented in an easy-to-understand form.
Classical statistical techniques fail to cope well with deviations from a standard distribution. Robust statistical methods take these deviations into account while estimating the parameters of parametric models, thus increasing the accuracy of the inference. Research into robust methods is flourishing, with new methods being developed and different applications considered. "Robust Statistics" sets out to explain the use of robust methods and their theoretical justification. It provides an up-to-date overview of the theory and practical application of robust statistical methods in regression, multivariate analysis, generalized linear models and time series. This unique book:
- Enables the reader to select and use the most appropriate robust method for their particular statistical model.
- Features computational algorithms for the core methods.
- Covers regression methods for data mining applications.
- Includes examples with real data and applications using the S-Plus robust statistics library.
- Describes the theoretical and operational aspects of robust methods separately, so the reader can choose to focus on one or the other.
- Is supported by a supplementary website featuring a time-limited S-Plus download, along with datasets and S-Plus code that allow the reader to reproduce the examples given in the book.
"Robust Statistics" aims to stimulate the use of robust methods as a powerful tool to increase the reliability and accuracy of statistical modelling and data analysis. It is ideal for researchers, practitioners and graduate students of statistics, electrical, chemical and biochemical engineering, and computer vision. There is also much to benefit researchers from other sciences, such as biotechnology, who need to use robust statistical methods in their work.
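The contrast the blurb draws between classical and robust estimators can be seen in a minimal sketch (in Python rather than the book's S-Plus): a single gross outlier shifts the sample mean badly, while the median, the simplest robust location estimator, is barely affected.

```python
# A single outlier distorts the mean but barely moves the median,
# illustrating why robust estimators help with contaminated data.
from statistics import mean, median

clean = [9.8, 10.1, 10.0, 9.9, 10.2]
contaminated = clean + [100.0]  # one gross outlier

print(mean(clean), median(clean))                # both near 10
print(mean(contaminated), median(contaminated))  # mean jumps to 25, median stays near 10
```

The book's robust regression and multivariate methods generalize exactly this idea: bound the influence any single observation can exert on the estimate.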
This book deals with the almost sure asymptotic behaviour of linearly transformed sequences of independent random variables, vectors and elements of topological vector spaces. The main subjects treated here, which deal with series of independent random elements in topological vector spaces (in particular, in sequence spaces) and with generalized summability methods, are: strong limit theorems for operator-normed (matrix-normed) sums of independent finite-dimensional random vectors and their applications; almost sure asymptotic behaviour of realizations of one-dimensional and multi-dimensional Gaussian Markov sequences; various conditions providing almost sure continuity of sample paths of Gaussian Markov processes; and almost sure asymptotic behaviour of solutions of one-dimensional and multi-dimensional stochastic recurrence equations of special interest. Many topics, especially those related to strong limit theorems for operator-normed sums of independent random vectors, appear in the monographic literature for the first time. Audience: The book is aimed at experts in probability theory, the theory of random processes and mathematical statistics who are interested in almost sure asymptotic behaviour in summability schemes, such as operator-normed sums and weighted sums. Numerous sections will be of use to those who work on Gaussian processes, stochastic recurrence equations, and probability theory in topological vector spaces. As the exposition of the material is consistent and self-contained, it can also be recommended as a textbook for university courses.
This volume of the Handbook is concerned particularly with the frequency-side, or spectrum, approach to time series analysis. This approach makes essential use of sinusoids and bands of (angular) frequency, with Fourier transforms playing an important role. A principal activity is thinking of systems, their inputs, outputs, and behavior in sinusoidal terms. In many cases, the frequency-side approach turns out to be simpler with respect to computational, mathematical, and statistical aspects. In the frequency approach, an assumption of stationarity is commonly made. However, the essential roles played by the techniques of complex demodulation and seasonal adjustment show that stationarity is far from being a necessary condition. Assumptions of Gaussianity and linearity are also commonly made and yet, as a variety of the papers illustrate, these assumptions are not necessary. This volume complements Handbook of Statistics 5: Time Series in the Time Domain.
The food market is changing from a producer-controlled to a consumer-directed market. A main driving force is consumer concern about agricultural production methods and food safety. More than before, the consumer demands transparency of the production and processing chain.
- Accessible to users with relatively little experience with R programming
- Reproducible data analysis examples that can be modified to accommodate users' own data
- Accompanying e-book website with links to additional resources and R code updates as needed
- Features dichotomous and polytomous (rating scale) Rasch models that can be applied to data from a wide range of disciplines
This book covers numerical methods for stochastic partial differential equations with white noise using the framework of Wong-Zakai approximation. The book begins with some motivational and background material in the introductory chapters and is divided into three parts. Part I covers numerical stochastic ordinary differential equations. Here the authors start with numerical methods for SDEs with delay using the Wong-Zakai approximation and finite difference in time. Part II covers temporal white noise. Here the authors consider SPDEs as PDEs driven by white noise, where discretization of white noise (Brownian motion) leads to PDEs with smooth noise, which can then be treated by numerical methods for PDEs. In this part, recursive algorithms based on Wiener chaos expansion and stochastic collocation methods are presented for linear stochastic advection-diffusion-reaction equations. In addition, stochastic Euler equations are exploited as an application of stochastic collocation methods, where a numerical comparison with other integration methods in random space is made. Part III covers spatial white noise. Here the authors discuss numerical methods for nonlinear elliptic equations as well as other equations with additive noise. Numerical methods for SPDEs with multiplicative noise are also discussed using the Wiener chaos expansion method. In addition, some SPDEs driven by non-Gaussian white noise are discussed and some model reduction methods (based on Wick-Malliavin calculus) are presented for generalized polynomial chaos expansion methods. Powerful techniques are provided for solving stochastic partial differential equations. The book can be considered self-contained: necessary background knowledge is presented in the appendices. Basic knowledge of probability theory and stochastic calculus is presented in Appendix A. In Appendix B some semi-analytical methods for SPDEs are presented. In Appendix C an introduction to Gauss quadrature is provided. In Appendix D, all the conclusions needed for the proofs are presented, and in Appendix E a method to compute the convergence rate empirically is included. In addition, the authors provide a thorough review of the topics, along with both theoretical and computational exercises and practical discussion of the effectiveness of the methods. Supporting Matlab files are made available to help illustrate some of the concepts further. Bibliographic notes are included at the end of each chapter. This book serves as a reference for graduate students and researchers in the mathematical sciences who would like to understand state-of-the-art numerical methods for stochastic partial differential equations with white noise.
The present book is based on a course developed as part of the large NSF-funded Gateway Coalition Initiative in Engineering Education, which included Case Western Reserve University, Columbia University, Cooper Union, Drexel University, Florida International University, New Jersey Institute of Technology, Ohio State University, University of Pennsylvania, Polytechnic University, and University of South Carolina. The Coalition aimed to restructure the engineering curriculum by incorporating the latest technological innovations and tried to attract more and better students to engineering and science. Drafts of this textbook have been used since 1992 in statistics courses taught at CWRU, Indiana University, Bloomington, and at the universities in Gottingen, Germany, and Grenoble, France. Another purpose of this project was to develop courseware that would take advantage of the Electronic Learning Environment created by CWRUnet (the all fiber-optic Case Western Reserve University computer network) and its ability to let students run Mathematica experiments and projects in their dormitory rooms, and interact paperlessly with the instructor. Theoretically, one could try to go through this book without doing Mathematica experiments on the computer, but it would be like playing Chopin's Piano Concerto in E minor, or Pink Floyd's The Wall, on an accordion. One would get an idea of what the tune was without ever experiencing the full richness and power of the entire composition, and the whole ambience would be miscued.
Missing data have long plagued those conducting applied research in the social, behavioral, and health sciences. Good missing data analysis solutions are available, but practical information about implementation of these solutions has been lacking. The objective of "Missing Data: Analysis and Design" is to enable investigators who are non-statisticians to implement modern missing data procedures properly in their research, and reap the benefits in terms of improved accuracy and statistical power. "Missing Data: Analysis and Design" contains essential information for both beginners and advanced readers. For researchers with limited missing data analysis experience, this book offers an easy-to-read introduction to the theoretical underpinnings of analysis of missing data; provides clear, step-by-step instructions for performing state-of-the-art multiple imputation analyses; and offers practical advice, based on over 20 years' experience, for avoiding and troubleshooting problems. For more advanced readers, unique discussions of attrition, non-Monte-Carlo techniques for simulations involving missing data, evaluation of the benefits of auxiliary variables, and highly cost-effective planned missing data designs are provided. The author lays out missing data theory in a plain English style that is accessible and precise. Most analyses described in the book are conducted using the well-known statistical software packages SAS and SPSS, supplemented by Norm 2.03 and associated Java-based automation utilities. A related web site contains free downloads of the supplementary software, as well as sample empirical data sets and a variety of practical exercises described in the book to enhance and reinforce the reader's learning experience. "Missing Data: Analysis and Design" and its web site work together to enable beginners to gain confidence in their ability to conduct missing data analysis, and more advanced readers to expand their skill set.
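The multiple-imputation workflow the book teaches can be sketched in a few lines (in Python rather than the book's SAS/SPSS/Norm toolchain; the data and imputation draw are illustrative only): create several completed datasets, analyze each one, then pool the per-dataset estimates.

```python
# Minimal multiple-imputation sketch (illustrative, not the book's software):
# 1. fill each missing value with a random draw from the observed values,
# 2. compute the statistic of interest on each completed dataset,
# 3. pool the per-dataset estimates by averaging (the point-estimate half
#    of Rubin's rules).
import random
from statistics import mean

random.seed(0)
data = [4.1, None, 5.0, 4.7, None, 5.3, 4.9]   # None marks missing values
observed = [x for x in data if x is not None]

m = 20  # number of imputed datasets
estimates = []
for _ in range(m):
    completed = [x if x is not None else random.choice(observed) for x in data]
    estimates.append(mean(completed))

pooled = mean(estimates)  # pooled point estimate of the mean
print(f"pooled mean estimate: {pooled:.2f}")
```

Real multiple imputation also combines within- and between-imputation variance to obtain correct standard errors, and draws imputations from a proper predictive model rather than resampling observed values; this sketch shows only the pooled point estimate.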
This book explores different statistical quality technologies, including recent advances and applications. Statistical process control, acceptance sampling plans and reliability assessment are some of the essential statistical techniques in quality technologies used to ensure high-quality products and to reduce consumer and producer risks. Numerous statistical techniques and methodologies for quality control and improvement have been developed in recent years to help resolve current product quality issues in today's fast-changing environment. Featuring contributions from top experts in the field, this book covers three major topics: statistical process control, acceptance sampling plans, and reliability testing and designs. The topics covered in the book are timely and have high potential impact and influence on academics, scholars, students and professionals in statistics, engineering, manufacturing and health.
The Introduction to Bayesian Statistics (2nd Edition) presents Bayes' theorem, the estimation of unknown parameters, the determination of confidence regions and the derivation of tests of hypotheses for the unknown parameters, in a manner that is simple, intuitive and easy to comprehend. The methods are applied to linear models, to models for robust estimation, for prediction and filtering, and to models for estimating variance components and covariance components. Regularization of inverse problems and pattern recognition are also covered, while Bayesian networks serve for reaching decisions in systems with uncertainties. If analytical solutions cannot be derived, numerical algorithms are presented, such as Monte Carlo integration and Markov chain Monte Carlo methods.
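As a quick refresher on the theorem at the book's core, here is Bayes' theorem applied to a standard diagnostic-testing calculation (the numbers are illustrative, not taken from the book): with a low prior prevalence, even a fairly accurate test yields a modest posterior probability.

```python
# Bayes' theorem: P(D | +) = P(+ | D) * P(D) / P(+)
# Illustrative numbers (not from the book): prevalence 1%,
# sensitivity 95%, false-positive rate 5%.
p_d = 0.01   # prior P(disease)
sens = 0.95  # P(positive | disease)
fpr = 0.05   # P(positive | no disease)

p_pos = sens * p_d + fpr * (1 - p_d)  # total probability of a positive test
posterior = sens * p_d / p_pos        # P(disease | positive)

print(f"P(disease | positive) = {posterior:.3f}")  # ≈ 0.161
```

The same prior-times-likelihood-over-evidence structure underlies the parameter estimation and hypothesis testing the book develops, with probability densities in place of these discrete probabilities.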
The subject of these two volumes is non-linear filtering (prediction and smoothing) theory and its application to the problem of optimal estimation, control with incomplete data, information theory, and sequential testing of hypotheses. The book is not only addressed to mathematicians but should also serve the interests of other scientists who apply probabilistic and statistical methods in their work. The theory of martingales presented in the book has an independent interest in connection with problems from financial mathematics. In the second edition, the authors have made numerous corrections, updating every chapter, adding two new subsections devoted to the Kalman filter under wrong initial conditions, as well as a new chapter devoted to asymptotically optimal filtering under diffusion approximation. Moreover, in each chapter a comment is added about the progress of recent years.
'Et moi, ..., si j'avait su comment en revenir, je n'y serais point alle.' ('And I, had I known how to come back from it, I would never have gone.') - Jules Verne
'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' - Eric T. Bell
'The series is divergent; therefore we may be able to do something with it.' - O. Heaviside
Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quotes above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'etre of this series.
The book introduces basic risk concepts and then goes on to discuss risk management and analysis processes and steps. The main emphasis is on methods that fulfill the requirements of one or several risk management steps. The focus is on risk analysis methods, including statistical-empirical analyses, probabilistic and parametrized models, engineering approaches and simulative methods, e.g. for fragment and blast propagation or hazard density computation. Risk management is essential for improving all resilience management steps: preparation, prevention, protection, response and recovery. The methods investigate types of event and scenario, as well as frequency, exposure, avoidance, hazard propagation, damage and risks of events. Further methods are presented for context assessment, risk visualization, communication, comparison and assessment, as well as for selecting mitigation measures. The processes and methods are demonstrated using detailed results and overviews of security research projects, in particular in the application domains of transport, aviation, airport security, explosive threats, and urban security and safety. Topics include: sufficient control of emerging and novel hazards and risks, occupational safety, identification of minimum (functional) safety requirements, engineering methods for countering malevolent or terrorist events, security research challenges, interdisciplinary approaches to risk control and management, risk-based change and improvement management, and support of rational decision-making. The book addresses advanced bachelor students, master and doctoral students as well as scientists, researchers and developers in academia, industry, and small and medium enterprises working in the emerging field of security and safety engineering.
This volume considers various methods for constructing cubature and quadrature formulas of arbitrary degree. These formulas are intended to approximate the calculation of multiple and conventional integrals over a bounded domain of integration. The latter is assumed to have a piecewise-smooth boundary and to be arbitrary in other aspects. Particular emphasis is placed on invariant cubature formulas and those for a cube, a simplex, and other polyhedra. Here, the techniques of functional analysis and partial differential equations are applied to the classical problem of numerical integration, to establish many important and deep analytical properties of cubature formulas. The prerequisites of the theory of many-dimensional discrete function spaces and the theory of finite differences are concisely presented. Special attention is paid to constructing and studying the optimal cubature formulas in Sobolev spaces. As an asymptotically optimal sequence of cubature formulas, a many-dimensional abstraction of the Gregory quadrature is indicated. Audience: This book is intended for researchers having a basic knowledge of functional analysis who are interested in the applications of modern theoretical methods to numerical mathematics.
This volume consists of papers inspired by the special session on pseudo-differential operators at the 10th ISAAC Congress held at the University of Macau, August 3-8, 2015 and the mini-symposium on pseudo-differential operators in industries and technologies at the 8th ICIAM held at the National Convention Center in Beijing, August 10-14, 2015. The twelve papers included present cutting-edge trends in pseudo-differential operators and applications from the perspectives of Lie groups (Chapters 1-2), geometry (Chapters 3-5) and applications (Chapters 6-12). Many contributions cover applications in probability, differential equations and time-frequency analysis. A focus on the synergies of pseudo-differential operators with applications, especially real-life applications, enhances understanding of the analysis and the usefulness of these operators.
The mathematical and statistical tools needed in the rapidly growing field of quantitative finance. With the rapid growth in quantitative finance, practitioners must achieve a high level of proficiency in math and statistics. Mathematical Methods and Statistical Tools for Finance, part of the Frank J. Fabozzi Series, has been created with this in mind. Designed to provide the tools needed to apply finance theory to real-world financial markets, this book offers a wealth of insights and guidance in practical applications. It contains applications that are broader in scope than what is covered in a typical book on mathematical techniques. While most books focus almost exclusively on derivatives pricing, the applications in this book cover not only derivatives and asset pricing but also risk management, including credit risk management, and portfolio management. * Includes an overview of the essential math and statistical skills required to succeed in quantitative finance * Offers the basic mathematical concepts that apply to the field of quantitative finance, from sets and distances to functions and variables * Includes information on calculus, matrix algebra, differential equations, stochastic integrals, and much more * Written by Sergio Focardi, one of the world's leading authors in high-level finance. Drawing on the author's perspectives as a practitioner and academic, each chapter of this book offers a solid foundation in the mathematical tools and techniques needed to succeed in today's dynamic world of finance.
This book offers a comprehensive reference guide to fuzzy statistics and fuzzy decision-making techniques. It provides readers with all the necessary tools for making statistical inference in the case of incomplete information or insufficient data, where classical statistics cannot be applied. The respective chapters, written by prominent researchers, explain a wealth of both basic and advanced concepts including: fuzzy probability distributions, fuzzy frequency distributions, fuzzy Bayesian inference, fuzzy mean, mode and median, fuzzy dispersion, fuzzy p-value, and many others. To foster a better understanding, all the chapters include relevant numerical examples or case studies. Taken together, they form an excellent reference guide for researchers, lecturers and postgraduate students pursuing research on fuzzy statistics. Moreover, by extending all the main aspects of classical statistical decision-making to its fuzzy counterpart, the book presents a dynamic snapshot of the field that is expected to stimulate new directions, ideas and developments.
A clear, comprehensive treatment of the subject, Environmental Statistics with S-PLUS is an ideal resource for environmental scientists, engineers, regulators, and students, even those with only a limited knowledge of statistics. It provides insight into what to think about before you collect environmental data, how to collect it, and how to make sense of it after you have it. This book addresses the vast array of methods used today by scientists, researchers, and regulators.