Because of its potential to "predict the unpredictable," Extreme Value Theory (EVT) and its methodology are currently in the spotlight. EVT affords some insight into extreme tails and maxima where standard models have proved unreliable. This is achieved with semi-parametric models which only specify the distributional shapes of maxima or of extreme tails. The rationale for these models rests on very basic limit and stability arguments.
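As a rough illustration of the block-maxima approach mentioned above, the sketch below fits a Generalized Extreme Value distribution to simulated block maxima; the simulated data, the block size and the use of SciPy's genextreme are illustrative assumptions, not material from the book.

```python
# Minimal sketch: fit a GEV distribution to simulated block maxima.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
data = rng.gumbel(loc=0.0, scale=1.0, size=(200, 50))  # 200 blocks of 50 observations (assumed)
block_maxima = data.max(axis=1)

# Fit the GEV; SciPy's shape parameter c corresponds to -xi in the usual EVT parameterization.
c, loc, scale = genextreme.fit(block_maxima)

# Estimated 100-block return level: the quantile exceeded on average once per 100 blocks.
return_level = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(c, loc, scale, return_level)
```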
This book presents recent developments in multivariate and robust statistical methods. Featuring contributions by leading experts in the field, it covers various topics, including multivariate and high-dimensional methods, time series, graphical models, robust estimation, supervised learning and normal extremes. It will appeal to statistics and data science researchers, PhD students and practitioners who are interested in modern multivariate and robust statistics. The book is dedicated to David E. Tyler on the occasion of his pending retirement and also includes a review contribution on the popular Tyler's shape matrix.
This book focuses on the meaning of statistical inference and estimation. Statistical inference is concerned with the problems of estimating population parameters and testing hypotheses. Primarily aimed at undergraduate and postgraduate students of statistics, the book is also useful to professionals and researchers in statistical, medical, social and other disciplines. It discusses current methodological techniques used in statistics and related interdisciplinary areas. Every concept is supported with relevant research examples to help readers find the most suitable application. Statistical tools are presented using real-life examples, removing the "fear factor" usually associated with this complex subject. The book helps readers discover diverse perspectives of statistical theory, each followed by relevant worked-out examples. Keeping in mind the needs of readers, as well as constantly changing scenarios, the material is presented in an easy-to-understand form.
Change-point problems arise in a variety of experimental and mathematical sciences, as well as in engineering and health sciences. This rigorously researched text provides a comprehensive review of recent probabilistic methods for detecting various types of possible changes in the distribution of chronologically ordered observations. Further developing the already well-established theory of weighted approximations and weak convergence, the authors provide a thorough survey of parametric and non-parametric methods, regression and time series models, together with sequential methods. All but the most basic models are carefully developed with detailed proofs and illustrated using a number of data sets.
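As a rough illustration of detecting a single change in the mean of chronologically ordered observations, the following is a minimal CUSUM-type sketch; the simulated data and the plain standardization are illustrative assumptions, not the authors' procedure.

```python
# Minimal sketch: CUSUM-type statistic for a single mean change.
import numpy as np

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(0.8, 1.0, 100)])

n = len(x)
k = np.arange(1, n)                       # candidate change points 1..n-1
partial_sums = np.cumsum(x)
# CUSUM process: partial sums minus their expectation under "no change".
cusum = partial_sums[:-1] - k * partial_sums[-1] / n
stat = np.abs(cusum) / (np.std(x, ddof=1) * np.sqrt(n))

tau_hat = int(k[np.argmax(stat)])         # estimated change point
print(tau_hat, stat.max())
```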
This book explores different statistical quality technologies, including recent advances and applications. Statistical process control, acceptance sampling plans and reliability assessment are some of the essential statistical techniques in quality technologies for ensuring high-quality products and reducing consumer and producer risks. Numerous statistical techniques and methodologies for quality control and improvement have been developed in recent years to help resolve current product quality issues in today's fast-changing environment. Featuring contributions from top experts in the field, this book covers three major topics: statistical process control, acceptance sampling plans, and reliability testing and designs. The topics covered are timely and have high potential impact for academics, scholars, students and professionals in statistics, engineering, manufacturing and health.
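For a flavour of the statistical process control mentioned above, here is a minimal sketch of three-sigma X-bar control limits; the simulated subgroups and the tabulated constant A2 = 0.577 for subgroups of size 5 are illustrative assumptions, not an example from the book.

```python
# Minimal sketch: X-bar control chart limits from subgrouped data.
import numpy as np

rng = np.random.default_rng(2)
subgroups = rng.normal(10.0, 0.2, size=(25, 5))   # 25 subgroups of size 5 (assumed)

xbar = subgroups.mean(axis=1)
rbar = (subgroups.max(axis=1) - subgroups.min(axis=1)).mean()

A2 = 0.577                                 # standard control-chart constant for subgroup size 5
center = xbar.mean()
ucl, lcl = center + A2 * rbar, center - A2 * rbar

out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
print(center, lcl, ucl, out_of_control)
```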
In recent years, the study of the theory of Brownian motion has become a powerful tool in the solution of problems in mathematical physics. This self-contained and readable exposition by leading authors provides a rigorous account of the subject, emphasizing the "explicit" rather than the "concise" where necessary, and is addressed to readers interested in probability theory as applied to analysis and mathematical physics.
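As a small aside, a standard Brownian motion path can be simulated by summing independent Gaussian increments; the step count and time horizon below are illustrative assumptions, and the sketch is not taken from the book.

```python
# Minimal sketch: simulate one path of standard Brownian motion.
import numpy as np

rng = np.random.default_rng(3)
n_steps, T = 1000, 1.0
dt = T / n_steps

increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)   # independent N(0, dt) increments
path = np.concatenate([[0.0], np.cumsum(increments)])     # W(0) = 0
times = np.linspace(0.0, T, n_steps + 1)
print(times[-1], path[-1])
```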
This proceedings volume presents new methods and applications in Operational Research and Management Science with a special focus on Business Analytics. Featuring selected contributions from the XIV Balkan Conference on Operational Research held in Thessaloniki, Greece in 2020 (BALCOR 2020), it addresses applications and methodological tools or techniques in various areas of Operational Research, such as agent-based modelling, big data and business analytics, data envelopment analysis, data mining, decision support systems, fuzzy systems, game theory, heuristics, metaheuristics and nature inspired optimization algorithms, linear and nonlinear programming, machine learning, multiple criteria decision analysis, network design and optimization, queuing theory, simulation and statistics.
This book presents up-to-date mathematical results in the asymptotic theory of nonlinear regression, based on various asymptotic expansions of the Least Squares Estimator, its characteristics, and the distribution functions of its functionals. It is divided into four chapters. Chapter 1 establishes assertions on the probability of large deviations of the normal Least Squares Estimator of regression function parameters. Chapter 2 gives conditions for the asymptotic normality of the Least Moduli Estimator. An asymptotic expansion of the Least Squares Estimator, as well as of its distribution function, is obtained, and the two initial terms of these asymptotic expansions are calculated. Separately, the Berry-Esseen inequality for the Least Squares Estimator distribution is deduced. The third chapter deals with asymptotic expansions related to functionals of the Least Squares Estimator. Lastly, Chapter 4 offers a comparison of the powers of statistical tests based on Least Squares Estimators. The Appendix gives an overview of subsidiary facts and a list of principal notations. Additional background information, grouped per chapter, is presented in the Commentary section. The volume concludes with an extensive Bibliography. Audience: This book will be of interest to mathematicians and statisticians whose work involves stochastic analysis, probability theory, mathematics of engineering, mathematical modelling, systems theory or cybernetics.
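To make the setting concrete, the sketch below computes a nonlinear Least Squares Estimator for a simple exponential regression using SciPy's curve_fit; the model, true parameters and noise level are illustrative assumptions, not examples from the book.

```python
# Minimal sketch: nonlinear least squares for an exponential regression model.
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(4)
x = np.linspace(0.0, 5.0, 50)
y = model(x, 2.0, 0.7) + rng.normal(0.0, 0.05, x.size)   # assumed true parameters and noise

theta_hat, cov = curve_fit(model, x, y, p0=[1.0, 1.0])
se = np.sqrt(np.diag(cov))     # asymptotic standard errors of the Least Squares Estimator
print(theta_hat, se)
```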
I became interested in Random Vibration during the preparation of my PhD dissertation, which was concerned with the seismic response of nuclear reactor cores. I was initiated into this field through the classical books by Y. K. Lin, S. H. Crandall and a few others. After the completion of my PhD, in 1981, my supervisor M. Geradin encouraged me to prepare a course in Random Vibration for fourth and fifth year students in Aeronautics, at the University of Liege. There was at the time very little material available in French on that subject. A first draft was produced during 1983 and 1984 and revised in 1986. These notes were published by the Presses Polytechniques et Universitaires Romandes (Lausanne, Switzerland) in 1990. When Kluwer decided to publish an English translation of the book in 1992, I had to choose between letting Kluwer translate the French text in extenso or doing it myself, which would allow me to carry out a substantial revision of the book. I took the second option and decided to rewrite or delete some of the original text and include new material, based on my personal experience, or reflecting recent technical advances. Chapter 6, devoted to the response of multi-degree-of-freedom structures, has been completely rewritten, and Chapter 11 on random fatigue is entirely new. The computer programs which have been developed in parallel with these chapters have been incorporated in the general purpose finite element software SAMCEF, developed at the University of Liege.
This book aims to present the impact of Artificial Intelligence (AI) and Big Data in healthcare for medical decision making and data analysis in myriad fields including Radiology, Radiomics, Radiogenomics, Oncology, Pharmacology, COVID-19 prognosis, Cardiac imaging, Neuroradiology, Psychiatry and others. This will include topics such as the Artificial Intelligence of Things (AIoT), Explainable Artificial Intelligence (XAI), Distributed learning, the Blockchain of Internet of Things (BIoT), Cybersecurity, and the Internet of (Medical) Things (IoT). Healthcare providers will learn how to leverage Big Data analytics and AI as methodology for accurate analysis based on their clinical data repositories and clinical decision support. The capacity to recognize patterns and transform large amounts of data into usable information for precision medicine assists healthcare professionals in achieving these objectives. Intelligent Health has the potential to monitor patients at risk with underlying conditions and track their progress during therapy. Some of the greatest challenges in using these technologies are based on legal and ethical concerns of using medical data and adequately representing and servicing disparate patient populations. One major potential benefit of this technology is to make health systems more sustainable and standardized. Privacy and data security, establishing protocols, appropriate governance, and improving technologies will be among the crucial priorities for Digital Transformation in Healthcare.
Aggregation plays a central role in many of the technological tasks we are faced with. The importance of this process will become even greater as we move more and more toward becoming an information-centered society, as is happening with the rapid growth of the Internet and the World Wide Web. Here we shall be faced with many issues related to the fusion of information. One very pressing issue is the development of mechanisms to help search for information, a problem that clearly has a strong aggregation-related component. More generally, in order to model the sophisticated ways in which human beings process information, as well as to go beyond human capabilities, we need to provide a basket of aggregation tools. The centrality of aggregation in human thought can be very clearly seen by looking at neural networks, a technology motivated by modeling the human brain. One can see that the basic operations involved in these networks are learning and aggregation. The Ordered Weighted Averaging (OWA) operators provide a parameterized family of aggregation operators which include many of the well-known operators such as the maximum, minimum and the simple average.
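A minimal sketch of the OWA operator described above: the arguments are sorted in descending order and combined with a weight vector, so that particular weight choices recover the maximum, minimum and simple average. The example weight vectors are illustrative assumptions.

```python
# Minimal sketch: the Ordered Weighted Averaging (OWA) operator.
import numpy as np

def owa(values, weights):
    """OWA: sort the arguments in descending order, then take a weighted sum."""
    values = np.sort(np.asarray(values, dtype=float))[::-1]
    weights = np.asarray(weights, dtype=float)
    return float(np.dot(weights, values))

x = [0.3, 0.9, 0.5, 0.7]
print(owa(x, [1, 0, 0, 0]))                  # weights (1,0,...,0) -> maximum
print(owa(x, [0, 0, 0, 1]))                  # weights (0,...,0,1) -> minimum
print(owa(x, [0.25, 0.25, 0.25, 0.25]))      # equal weights -> simple average
```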
Algebraic statistics is a rapidly developing field, where ideas from statistics and algebra meet and stimulate new research directions. One of the origins of algebraic statistics is the work by Diaconis and Sturmfels in 1998 on the use of Gröbner bases for constructing a connected Markov chain for performing conditional tests of a discrete exponential family. In this book we take up this topic and present a detailed summary of developments following the seminal work of Diaconis and Sturmfels. This book is intended for statisticians with minimal backgrounds in algebra. As we ourselves learned algebraic notions through working on statistical problems and collaborating with notable algebraists, we hope that this book, with its many practical statistical problems, is useful for statisticians starting to work in the field.
This book illustrates the current work of leading multilevel modeling (MLM) researchers from around the world. The book's goal is to critically examine the real problems that occur when trying to use MLMs in applied research, such as power, experimental design, and model violations. This presentation of cutting-edge work and statistical innovations in multilevel modeling includes topics such as growth modeling, repeated measures analysis, nonlinear modeling, outlier detection, and meta-analysis. This volume will be beneficial for researchers with advanced statistical training and extensive experience in applying multilevel models, especially in the areas of education; clinical intervention; social, developmental and health psychology; and other behavioral sciences, or as a supplement for an introductory graduate-level course.
Contains a compact disc with nearly 200 microcomputer programs illustrating a wide range of reliability and statistical analyses. Mechanical Reliability Improvement presents probability and statistical concepts developed using pseudorandom numbers; enumeration-, simulation-, and randomization-based statistical analyses for comparing the test performance of alternative designs; and simulation- and randomization-based tests for examining the credibility of statistical presumptions. It also discusses centroid and moment-of-inertia analogies for the mean and variance, and the organizational structure of completely randomized, randomized complete block, and split-plot experimental test programs.
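As a hedged illustration of a randomization-based comparison of two alternative designs, the sketch below runs a simple permutation test on the difference of means; the data and the number of resamples are illustrative assumptions, not the book's programs.

```python
# Minimal sketch: randomization (permutation) test for two designs.
import numpy as np

rng = np.random.default_rng(5)
design_a = np.array([12.1, 11.8, 12.4, 12.0, 11.9])   # assumed test results, design A
design_b = np.array([12.6, 12.4, 12.8, 12.5, 12.7])   # assumed test results, design B

observed = design_b.mean() - design_a.mean()
pooled = np.concatenate([design_a, design_b])

n_resamples, count = 10_000, 0
for _ in range(n_resamples):
    perm = rng.permutation(pooled)                     # random reassignment of labels
    diff = perm[len(design_a):].mean() - perm[:len(design_a)].mean()
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / n_resamples
print(observed, p_value)
```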
Providing researchers in economics, finance, and statistics with an up-to-date introduction to applying Bayesian techniques to empirical studies, this book covers the full range of the new numerical techniques which have been developed over the last thirty years. Notably, these are: Monte Carlo sampling, antithetic replication, importance sampling, and Gibbs sampling. The author covers both advances in theory and modern approaches to numerical and applied problems, and includes applications drawn from a variety of different fields within economics, while also providing a quick overview of the underlying statistical ideas of Bayesian thought. The result is a book which presents a roadmap of applied economic questions that can now be addressed empirically with Bayesian methods. Consequently, many researchers will find this a readable survey of this growing topic.
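As a small illustration of one of the techniques listed above, the sketch below uses importance sampling to estimate a Gaussian tail probability; the target, proposal and sample size are illustrative assumptions rather than an example from the book.

```python
# Minimal sketch: importance sampling for P(X > 4) with X ~ N(0, 1).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
n, threshold = 100_000, 4.0

z = rng.normal(loc=threshold, scale=1.0, size=n)        # draws from the N(4, 1) proposal
weights = norm.pdf(z) / norm.pdf(z, loc=threshold)      # target density / proposal density
estimate = np.mean((z > threshold) * weights)

print(estimate, norm.sf(threshold))                     # compare with the exact tail probability
```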
First published in 2002. Routledge is an imprint of Taylor & Francis, an informa company.
'Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé.' Jules Verne
'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' Eric T. Bell
'The series is divergent; therefore we may be able to do something with it.' O. Heaviside
Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and non-linearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quote on the right above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series. This series, Mathematics and Its Applications, started in 1977. Now that over one hundred volumes have appeared it seems opportune to reexamine its scope. At the time I wrote 'Growing specialization and diversification have brought a host of monographs and textbooks on increasingly specialized topics. However, the "tree" of knowledge of mathematics and related fields does not grow only by putting forth new branches. It also happens, quite often in fact, that branches which were thought to be completely ...'
For almost fifty years, Richard M. Dudley has been extremely influential in the development of several areas of Probability. His work on Gaussian processes led to the understanding of the basic fact that their sample boundedness and continuity should be characterized in terms of proper measures of complexity of their parameter spaces equipped with the intrinsic covariance metric. His sufficient condition for sample continuity in terms of metric entropy is widely used and was proved by X. Fernique to be necessary for stationary Gaussian processes, whereas its more subtle versions (majorizing measures) were proved by M. Talagrand to be necessary in general. Together with V. N. Vapnik and A. Y. Chervonenkis, R. M. Dudley is a founder of the modern theory of empirical processes in general spaces. His work on uniform central limit theorems (under bracketing entropy conditions and for Vapnik-Chervonenkis classes) greatly extends classical results that go back to A. N. Kolmogorov and M. D. Donsker, and became the starting point of a new line of research, continued in the work of Dudley and others, that developed empirical processes into one of the major tools in mathematical statistics and statistical learning theory. As a consequence of Dudley's early work on weak convergence of probability measures on non-separable metric spaces, the Skorohod topology on the space of regulated right-continuous functions can be replaced, in the study of weak convergence of the empirical distribution function, by the supremum norm. In a further recent step Dudley replaces this norm by the stronger p-variation norms, which then allows replacing compact differentiability of many statistical functionals by Fréchet differentiability in the delta method. Richard M. Dudley has also made important contributions to mathematical statistics, the theory of weak convergence, relativistic Markov processes, differentiability of nonlinear operators and several other areas of mathematics. Professor Dudley has been the adviser to thirty PhDs and is a Professor of Mathematics at the Massachusetts Institute of Technology.
Identifying the sources and measuring the impact of haphazard variations are important in any number of research applications, from clinical trials and genetics to industrial design and psychometric testing. Only in very simple situations can such variations be represented effectively by independent, identically distributed random variables or by random sampling from a hypothetical infinite population.
"Examines classic algorithms, geometric diagrams, and mechanical principles for enhances visualization of statistical estimation procedures and mathematical concepts in physics, engineering, and computer programming."
This second edition of "A Beginner's Guide to Finite Mathematics" takes a distinctly applied approach to finite mathematics at the freshman and sophomore level. Topics are presented sequentially: the book opens with a brief review of sets and numbers, followed by an introduction to data sets, histograms, means and medians. Counting techniques and the Binomial Theorem are covered, providing the foundation for elementary probability theory; this, in turn, leads to basic statistics. This new edition includes chapters on game theory and financial mathematics. Requiring little mathematical background beyond high school algebra, the text will be especially useful for business and liberal arts majors.
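As a tiny illustration of how counting and the Binomial Theorem feed into elementary probability, the sketch below computes a binomial probability; the coin-tossing example is an illustrative assumption, not taken from the book.

```python
# Minimal sketch: P(exactly 3 heads in 5 fair coin tosses) via binomial coefficients.
from math import comb

n, k, p = 5, 3, 0.5
probability = comb(n, k) * p**k * (1 - p)**(n - k)
print(probability)   # 10 * 0.5**5 = 0.3125
```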
"Configural Frequency Analysis" (CFA) provides an up-to-the-minute
comprehensive introduction to its techniques, models, and
applications. Written in a formal yet accessible style, actual
empirical data examples are used to illustrate key concepts.
Step-by-step program sequences are used to show readers how to
employ CFA methods using commercial software packages, such as SAS,
SPSS, SYSTAT, S-Plus, or those written specifically to perform CFA.
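For readers unfamiliar with CFA, the sketch below shows the core step of a first-order analysis: each cell's observed frequency is compared with the frequency expected under independence, and cells are flagged as candidate types or antitypes via a binomial test. The 2x2 table is an illustrative assumption, and real CFA software additionally adjusts for multiple testing.

```python
# Minimal sketch: first-order CFA step - observed vs. expected cell frequencies.
import numpy as np
from scipy.stats import binomtest

observed = np.array([[30, 10],
                     [10, 50]])            # assumed 2x2 cross-classification
n = observed.sum()
expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / n  # independence base model

for (i, j), obs in np.ndenumerate(observed):
    p_cell = expected[i, j] / n
    p_value = binomtest(int(obs), n=int(n), p=p_cell).pvalue
    label = "type" if obs > expected[i, j] else "antitype"
    print((i, j), int(obs), round(expected[i, j], 1), label, round(p_value, 4))
```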
"Configural Frequency Analysis" (CFA) provides an up-to-the-minute
comprehensive introduction to its techniques, models, and
applications. Written in a formal yet accessible style, actual
empirical data examples are used to illustrate key concepts.
Step-by-step program sequences are used to show readers how to
employ CFA methods using commercial software packages, such as SAS,
SPSS, SYSTAT, S-Plus, or those written specifically to perform CFA.