This relevant and timely thesis presents the pioneering use of risk-based assessment tools to analyse the interaction between electrical and mechanical systems in mixed AC/DC power networks at subsynchronous frequencies. It also assesses how uncertainties in the mechanical parameters of a turbine generator affect subsynchronous resonance (SSR) in a meshed network with both symmetrical and asymmetrical compensation systems. The research presented has resulted in 12 publications, including three papers in a top international journal (IEEE Transactions on Power Systems) and nine international conference publications, two of them award-winning.
Researchers and students who use empirical investigation in their work must go through the process of selecting statistical methods for analyses, and they are often challenged to justify these selections. This book is designed for readers with limited background in statistical methodology who seek guidance in defending their statistical decision-making in the worlds of research and practice. It is devoted to helping students and scholars find the information they need to select data analytic methods, and to speak knowledgeably about their statistical research processes. Each chapter opens with a conundrum about selecting a particular analysis or explaining its nature; the chapter then describes the analysis and offers guidance on justifying the choice of that method. Designed to offer statistical knowledge to the non-specialist, this volume can be used in courses on research methods, or for courses on statistical applications to the biological, medical, life, social, or physical sciences. It will also be useful to academic and industrial researchers in engineering and in the physical sciences who will benefit from a stronger understanding of how to analyze empirical data. The book is written for those with foundational education in calculus; however, a brief review of fundamental concepts of probability and statistics, together with a primer on some concepts in elementary calculus and matrix algebra, is included. R code and sample datasets are provided.
This book provides a self-contained review of all the relevant topics in probability theory. A software package called MAXIM, which runs in MATLAB, is available for download. Vidyadhar G. Kulkarni is Professor of Operations Research at the University of North Carolina at Chapel Hill.
This undergraduate text distils the wisdom of an experienced teacher and yields, to the mutual advantage of students and their instructors, a sound and stimulating introduction to probability theory. The accent is on its essential role in statistical theory and practice, built on the use of illustrative examples and the solution of problems from typical examination papers. Mathematically friendly for first- and second-year undergraduate students, the book is also a reference source for workers in a wide range of disciplines who are aware that even the simpler aspects of probability theory are not simple.
This book provides a fresh approach to reliability theory, an area that has gained increasing relevance in fields from statistics and engineering to demography and insurance. Its innovative use of quantile functions gives an analysis of lifetime data that is generally simpler, more robust, and more accurate than the traditional methods, and opens the door for further research in a wide variety of fields involving statistical analysis. In addition, the book can be used to good effect in the classroom as a text for advanced undergraduate and graduate courses in reliability and statistics.
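To make the quantile-function idea concrete, here is a minimal Python sketch (not taken from the book; the exponential model, sample size, and probability grid are invented for illustration). It estimates the hazard quantile function H(u) = 1/((1 - u) Q'(u)) from simulated lifetimes; for exponential data it should come out roughly constant at the failure rate.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0  # true failure rate (invented for the demonstration)
sample = rng.exponential(scale=1.0 / lam, size=100_000)

# Empirical quantile function Q(u) and quantile density q(u) = Q'(u),
# the latter estimated by finite differences on a probability grid.
u = np.linspace(0.01, 0.99, 99)
Q = np.quantile(sample, u)
q = np.gradient(Q, u)

# Hazard quantile function H(u) = 1 / ((1 - u) * q(u)); for an
# exponential lifetime it should be roughly constant at lam.
H = 1.0 / ((1.0 - u) * q)
print(np.round(H[::10], 2))
```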
This book reports a literature review on kaizen: its industrial applications, critical success factors, benefits gained, the journals that publish on it, and the main authors (research groups) and universities. Kaizen is treated in this book in three stages: planning, implementation and control. The authors provide a questionnaire with activities for every stage, highlighting the benefits gained in each stage. The survey was administered to more than 400 managers and continuous-improvement leaders in Mexican maquiladoras. A univariate analysis of the activities in every stage is provided. Moreover, structural equation models associating those activities with the benefits gained are presented for statistical validation. This relationship between activities and benefits helps managers identify the factors that most affect their benefits and financial income.
This book presents the state of the art in multilevel analysis, with an emphasis on more advanced topics. These topics are discussed conceptually, analyzed mathematically, and illustrated by empirical examples. Multilevel analysis is the statistical analysis of hierarchically and non-hierarchically nested data. The simplest example is clustered data, such as a sample of students clustered within schools. Multilevel data are especially prevalent in the social and behavioral sciences and in the biomedical sciences. The chapter authors are all leading experts in the field. Given the omnipresence of multilevel data in the social, behavioral, and biomedical sciences, this book is essential for empirical researchers in these fields.
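As a flavour of the simplest multilevel setting, students clustered within schools, here is a hedged Python sketch (not from the book; the model, variable names, and effect sizes are invented) fitting a random-intercept model with statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, n_students = 30, 40
school = np.repeat(np.arange(n_schools), n_students)

# Random intercept per school plus a common slope for study hours.
school_effect = rng.normal(0.0, 2.0, n_schools)[school]
hours = rng.uniform(0.0, 10.0, school.size)
score = 50.0 + 1.5 * hours + school_effect + rng.normal(0.0, 3.0, school.size)
df = pd.DataFrame({"school": school, "hours": hours, "score": score})

# Two-level model: students (level 1) nested within schools (level 2).
result = smf.mixedlm("score ~ hours", df, groups=df["school"]).fit()
print(result.summary())  # fixed slope near 1.5, school variance near 4
```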
Nonlinear models have been used extensively in the areas of economics and finance. Recent literature on the topic has shown that a large number of series exhibit nonlinear rather than linear dynamics. Incorporating these concepts involves deriving and estimating nonlinear time series models, which have typically taken the form of Threshold Autoregression (TAR) models, Exponential Smooth Transition Autoregression (ESTAR) models, and Markov Switching (MS) models, among several others. This edited volume provides a timely overview of nonlinear estimation techniques, offering new methods and insights into nonlinear time series analysis. It features cutting-edge research from leading academics in economics, finance, and business management, focusing on such topics as Zero-Information-Limit-Conditions, using Markov Switching models to analyze economic series, and how best to distinguish between competing nonlinear models. The principles and techniques in this book will appeal to econometricians, finance professors teaching quantitative finance, researchers, and graduate students interested in learning how to apply advances in nonlinear time series modeling to solve complex problems in economics and finance.
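To make the threshold idea concrete, the following minimal Python sketch (not from the volume; the threshold and coefficients are invented) simulates a two-regime TAR(1) process whose autoregressive coefficient depends on the previous observation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-regime TAR(1): the autoregressive coefficient switches when the
# previous value crosses the threshold (all parameters invented).
threshold, phi_low, phi_high = 0.0, 0.6, -0.4
y = np.zeros(500)
for t in range(1, y.size):
    phi = phi_low if y[t - 1] <= threshold else phi_high
    y[t] = phi * y[t - 1] + rng.normal(0.0, 1.0)
print(y[:5])
```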
Selected papers submitted by participants of the international conference "Stochastic Analysis and Applied Probability 2010" ( www.saap2010.org ) make up the basis of this volume. SAAP 2010 was held in Tunisia from 7 to 9 October 2010, and was organized by the "Applied Mathematics & Mathematical Physics" research unit of the preparatory institute to the military academies of Sousse (Tunisia), chaired by Mounir Zili. The papers cover theoretical, numerical and applied aspects of stochastic processes and stochastic differential equations. The study of such topics is motivated in part by the need to model, understand, forecast and control the behavior of many natural phenomena that evolve in time in a random way. Such phenomena appear in the fields of finance, telecommunications, economics, biology, geology, demography, physics, chemistry, signal processing and modern control theory, to mention just a few. As this book emphasizes the importance of numerical and theoretical studies of stochastic differential equations and stochastic processes, it will be useful for a wide spectrum of researchers in applied probability, stochastic numerical and theoretical analysis and statistics, as well as for graduate students. To make it more complete and accessible for graduate students, practitioners and researchers, the editors Mounir Zili and Daria Filatova have included a survey dedicated to the basic concepts of numerical analysis of stochastic differential equations, written by Henri Schurz.
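For readers approaching the numerical side for the first time, the simplest workhorse scheme can be sketched in a few lines. The following hedged Python example (not from the volume; the drift, volatility, and step count are arbitrary) applies the Euler-Maruyama method to geometric Brownian motion:

```python
import numpy as np

rng = np.random.default_rng(3)

# Euler-Maruyama scheme for dX = mu*X dt + sigma*X dW
# (geometric Brownian motion); parameters are arbitrary.
mu, sigma, x0 = 0.05, 0.2, 1.0
T, n = 1.0, 1000
dt = T / n
x = np.empty(n + 1)
x[0] = x0
for i in range(n):
    dW = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
    x[i + 1] = x[i] + mu * x[i] * dt + sigma * x[i] * dW
print(x[-1])
```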
Most of the time series analysis methods applied today rely heavily on the key assumptions of linearity, Gaussianity and stationarity. Natural time series, including hydrologic, climatic and environmental time series, that satisfy these assumptions seem to be the exception rather than the rule. Nevertheless, most time series analysis is performed using standard methods after relaxing the required conditions one way or another, in the hope that the departure from these assumptions is not large enough to affect the result of the analysis. A large amount of data is available today after almost a century of intensive collection of various natural time series. In addition to a few older data series such as sunspot numbers, sea surface temperatures, etc., data obtained through dating techniques (tree-ring data, ice core data, geological and marine deposits, etc.) are available. With the advent of powerful computers, the use of over-simplified methods can no longer be justified, especially given the success of more refined methods in explaining the inherent variability in natural time series. This book presents a number of new techniques that have been discussed in the literature during the last two decades for investigating the stationarity, linearity and Gaussianity of hydrologic and environmental time series. These techniques cover different approaches for assessing nonstationarity, ranging from time domain analysis, to frequency domain analysis, to combined time-frequency and time-scale analyses, to segmentation analysis, in addition to formal statistical tests of linearity and Gaussianity. It is hoped that this endeavor will facilitate further research into this important area.
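As a small illustration of such formal checks (a hedged Python sketch, not drawn from the book; the simulated trending series is invented), a unit-root test and a Gaussianity test can be run as follows:

```python
import numpy as np
from scipy import stats
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)

# A series with a deterministic trend: clearly nonstationary.
t = np.arange(500)
series = 0.01 * t + rng.normal(0.0, 1.0, t.size)

# Augmented Dickey-Fuller test; the null hypothesis is a unit root,
# so failing to reject (large p-value) flags nonstationarity.
adf_stat, p_value, *_ = adfuller(series)
print(f"ADF statistic {adf_stat:.2f}, p-value {p_value:.3f}")

# D'Agostino's K^2 test; the null hypothesis is Gaussianity.
k2_stat, p_normal = stats.normaltest(series)
print(f"K^2 statistic {k2_stat:.2f}, p-value {p_normal:.3f}")
```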
This graduate textbook covers topics in statistical theory essential for graduate students preparing for work on a Ph.D. degree in statistics. The first chapter provides a quick overview of concepts and results in measure-theoretic probability theory that are useful in statistics. The second chapter introduces some fundamental concepts in statistical decision theory and inference. Chapters 3-7 contain detailed studies of some important topics: unbiased estimation, parametric estimation, nonparametric estimation, hypothesis testing, and confidence sets. A large number of exercises in each chapter provide not only practice problems for students, but also many additional results. In addition to the classical results that are typically covered in a textbook of a similar level, this book introduces some topics in modern statistical theory that have been developed in recent years, such as Markov chain Monte Carlo, quasi-likelihoods, empirical likelihoods, statistical functionals, generalized estimating equations, the jackknife, and the bootstrap. Jun Shao is Professor of Statistics at the University of Wisconsin, Madison. Also available: Jun Shao and Dongsheng Tu, The Jackknife and Bootstrap, Springer-Verlag New York, Inc., 1995, Cloth, 536 pp., 0-387-94515-6.
Most data sets collected by researchers are multivariate, and in the majority of cases the variables need to be examined simultaneously to get the most informative results. This requires the use of one or other of the many methods of multivariate analysis, and the use of a suitable software package such as S-PLUS or R. In this book the core multivariate methodology is covered, along with some basic theory for each method described. The necessary R and S-PLUS code is given for each analysis in the book, with any differences between the two highlighted. A website with all the datasets and code used in the book can be found at http://biostatistics.iop.kcl.ac.uk/publications/everitt/. Graduate students and advanced undergraduates on applied statistics courses, especially those in the social sciences, will find this book invaluable in their work, and it will also be useful to researchers outside of statistics who need to deal with the complexities of multivariate data in their work. Brian Everitt is Emeritus Professor of Statistics, King's College, London.
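The book's own code is in R and S-PLUS; purely to give a flavour of the kind of core method it covers, here is a hedged Python sketch (with invented data) computing principal components from the singular value decomposition of a centered data matrix:

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented data: 200 observations of 5 correlated variables.
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

# PCA via the singular value decomposition of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)  # proportion of variance per component
scores = Xc @ Vt.T               # principal component scores
print(np.round(explained, 3))
```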
Miller and Childers have focused on creating a clear presentation of foundational concepts with specific applications to signal processing and communications, clearly the two areas of most interest to students and instructors in this course. It is aimed at graduate students as well as practicing engineers, and includes unique chapters on narrowband random processes and simulation techniques.
The 2006 INFORMS Expository Writing Award-winning and best-selling author Sheldon Ross (University of Southern California) teams up with Erol Peköz (Boston University) to bring you this textbook for undergraduate and graduate students in statistics, mathematics, engineering, finance, and actuarial science. This is a guided tour designed to give familiarity with advanced topics in probability without having to wade through the exhaustive coverage of the classic advanced probability theory books. Topics include measure theory, limit theorems, bounding probabilities and expectations, coupling and Stein's method, martingales, Markov chains, renewal theory, and Brownian motion. No other text covers all these advanced topics rigorously but at such an accessible level; all you need is calculus and material from a first undergraduate course in probability.
This volume presents the latest developments in the highly active and rapidly growing field of diffusion MRI. The reader will find numerous contributions covering a broad range of topics, from the mathematical foundations of the diffusion process and signal generation, to new computational methods and estimation techniques for the in-vivo recovery of microstructural and connectivity features, as well as frontline applications in neuroscience research and clinical practice. These proceedings contain the papers presented at the 2017 MICCAI Workshop on Computational Diffusion MRI (CDMRI'17), held in Quebec, Canada on September 10, 2017, not only sharing new perspectives on the most recent research challenges for those currently working in the field, but also offering a valuable starting point for anyone interested in learning computational techniques in diffusion MRI. This book includes rigorous mathematical derivations, a large number of rich, full-colour visualisations and clinically relevant results. As such, it will be of interest to researchers and practitioners in the fields of computer science, MRI physics and applied mathematics.
When no samples are available to estimate a probability distribution, we have to invite domain experts to evaluate the belief degree that each event will happen. One might think that belief degrees should be modeled by subjective probability or fuzzy set theory; however, both may lead to counterintuitive results in this case. In order to deal rationally with belief degrees, uncertainty theory was founded in 2007 and has subsequently been studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. The textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, control, and finance.
This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. Offering a new perspective on the science of metrology, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone's Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background for the combination of measurement results, leading to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring these concepts together and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows readers to interact with the proposed approach by generating and combining RFVs through custom measurement functions. With numerous examples of applications, this book provides a comprehensive treatment of the RFV approach to uncertainty that is suitable for any graduate student or researcher with interests in the measurement field.
Computationally intensive methods have become widely used both for statistical inference and for exploratory analyses of data. The methods of computational statistics involve resampling, partitioning, and multiple transformations of a dataset. They may also make use of randomly generated artificial data. Implementation of these methods often requires advanced techniques in numerical analysis, so there is a close connection between computational statistics and statistical computing. This book describes techniques used in computational statistics, and addresses some areas of application of computationally intensive methods, such as density estimation, identification of structure in data, and model building. Although methods of statistical computing are not emphasized in this book, numerical techniques for transformations, for function approximation, and for optimization are explained in the context of the statistical methods. The book includes exercises, some with solutions. The book can be used as a text or supplementary text for various courses in modern statistics at the advanced undergraduate or graduate level, and it can also be used as a reference for statisticians who use computationally intensive methods of analysis. Although some familiarity with probability and statistics is assumed, the book reviews basic methods of inference, and so is largely self-contained. James Gentle is University Professor of Computational Statistics at George Mason University. He is a Fellow of the American Statistical Association and a member of the International Statistical Institute. He has held several national offices in the American Statistical Association and has served as associate editor for journals of the ASA as well as for other journals in statistics and computing. He is the author of Random Number Generation and Monte Carlo Methods and Numerical Linear Algebra for Statistical Applications.
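As a small taste of the resampling methods mentioned above, here is a hedged Python sketch (not taken from the book; the data and the choice of statistic are invented): the bootstrap approximates the sampling distribution of a statistic by resampling the observed data with replacement.

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented skewed data; the statistic of interest is the median.
data = rng.lognormal(mean=0.0, sigma=1.0, size=200)

# Resample with replacement to approximate the sampling distribution.
boot = np.array([
    np.median(rng.choice(data, size=data.size, replace=True))
    for _ in range(5000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"median {np.median(data):.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```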
In contrast to the prevailing tradition in epistemology, the focus in this book is on low-level inferences, i.e., those inferences that we are usually not consciously aware of and that we share with the cat nearby, which infers that the bird she sees picking grains from the dirt is able to fly. Presumably, such inferences are not generated by explicit logical reasoning, but logical methods can be used to describe and analyze them. Part 1 gives a purely system-theoretic explication of belief and inference. Part 2 adds a reliabilist theory of justification for inference, employing a qualitative notion of reliability. Part 3 recalls and extends various systems of deductive and nonmonotonic logic and thereby explains the semantics of absolute and high reliability. In Part 4 it is proven that qualitative neural networks are able to draw justified deductive and nonmonotonic inferences on the basis of distributed representations. This is derived from a soundness/completeness theorem with regard to cognitive semantics of nonmonotonic reasoning. The appendix extends the theory both logically and ontologically, and relates it to A. Goldman's reliability account of justified belief.
This edited volume addresses the importance of mathematics for industry and society by presenting highlights from contract research at the Department of Applied Mathematics at SINTEF, the largest independent research organization in Scandinavia. Examples range from computer-aided geometric design, via general purpose computing on graphics cards, to reservoir simulation for enhanced oil recovery. Contributions are written in a tutorial style.
The theory of random processes is an integral part of the analysis and synthesis of complex engineering systems. This textbook systematically presents the fundamentals of statistical dynamics and reliability theory. The theory of Markovian processes used in the analysis of random dynamic processes in mechanical systems is described in detail; examples include machines, instruments and structures loaded with random perturbations, whose reliability and lifetime depend on how properly these perturbations are taken into account. Random vibrations with finite and infinite numbers of degrees of freedom are analyzed, as well as the theory and numerical methods of non-stationary processes under conditions of statistical indeterminacy. This textbook is addressed to students and post-graduates of technical universities. It can also be useful to lecturers and mechanical engineers, including designers in different industries.
The book is meant to serve two purposes. The first and more obvious one is to present state-of-the-art results in algebraic research into residuated structures related to substructural logics. The second, less obvious but equally important, is to provide a reasonably gentle introduction to algebraic logic. At the beginning, the second objective is predominant. Thus, in the first few chapters the reader will find a primer of universal algebra for logicians, a crash course in nonclassical logics for algebraists, an introduction to residuated structures, and an outline of Gentzen-style calculi, as well as some titbits of proof theory, the celebrated Hauptsatz (cut elimination theorem) among them. These lead naturally to a discussion of interconnections between logic and algebra, where we try to demonstrate how they form two sides of the same coin. We envisage that the initial chapters could be used as a textbook for a graduate course, perhaps entitled Algebra and Substructural Logics.
This monograph is devoted to the systematic presentation of the foundations of quantum field theory. Unlike numerous monographs devoted to this topic, a wide range of problems covered in this book are accompanied by sufficiently clear interpretations and applications. An important feature of this monograph is the author's desire to present the mathematical problems of quantum field theory in light of the new methods of constructive and Euclidean field theory that appeared in the last thirty years of the 20th century and are based on the rigorous mathematical apparatus of functional analysis, operator theory, and the theory of generalized functions. The monograph is useful for students, post-graduate students, and young scientists who desire to understand not only the formal construction of quantum field theory but also its essence and its connections with classical mechanics, relativistic classical field theory, quantum mechanics, group theory, and the path integral formalism.