First published in 2000. Routledge is an imprint of Taylor & Francis, an informa company.
This handbook presents a systematic overview of the approaches to, diversity of, and problems involved in interdisciplinary rating methodologies. Historically, the purpose of ratings has been to achieve information transparency regarding a given body's activities, whether in finance, banking, or sports, for example. This book focuses on commonly used rating methods in three important fields: finance, sports, and the social sector. In the world of finance, investment decisions are largely shaped by how positively or negatively economies or financial instruments are rated. Ratings have thus become a basis of trust for investors. Similarly, sports evaluation and funding are largely based on core ratings. From local communities to groups of nations, public investment and funding are also dependent on how these bodies are continuously rated against expected performance targets. As such, ratings need to reflect the consensus of all stakeholders on selected aspects of the work and how to evaluate their success. The public should also have the opportunity to participate in this process. The authors examine current rating approaches from a variety of proposals that are closest to the public consensus, analyzing the rating models and summarizing the methods of their construction. This handbook offers a valuable reference guide for managers, analysts, economists, business informatics specialists, and researchers alike.
"Stochastic Tools in Mathematics and Science" covers basic stochastic tools used in physics, chemistry, engineering and the life sciences. The topics covered include conditional expectations, stochastic processes, Brownian motion and its relation to partial differential equations, Langevin equations, the Liouville and Fokker-Planck equations, as well as Markov chain Monte Carlo algorithms, renormalization, basic statistical mechanics, and generalized Langevin equations and the Mori-Zwanzig formalism. The applications include sampling algorithms, data assimilation, prediction from partial data, spectral analysis, and turbulence. The book is based on lecture notes from a class that has attracted graduate and advanced undergraduate students from mathematics and from many other science departments at the University of California, Berkeley. Each chapter is followed by exercises. The book will be useful for scientists and engineers working in a wide range of fields and applications. For this new edition the material has been thoroughly reorganized and updated, and new sections on scaling, sampling, filtering and data assimilation, based on recent research, have been added. There are additional figures and exercises. Review of earlier edition: "This is an excellent concise textbook which can be used for self-study by graduate and advanced undergraduate students and as a recommended textbook for an introductory course on probabilistic tools in science." Mathematical Reviews, 2006
Collecting together contributed lectures and mini-courses, this book details the research presented in a special semester titled "Geometric mechanics - variational and stochastic methods" run in the first half of 2015 at the Centre Interfacultaire Bernoulli (CIB) of the Ecole Polytechnique Federale de Lausanne. The aim of the semester was to develop a common language needed to handle the wide variety of problems and phenomena occurring in stochastic geometric mechanics. It gathered mathematicians and scientists from several different areas of mathematics (from analysis, probability, numerical analysis and statistics, to algebra, geometry, topology, representation theory, and dynamical systems theory) and also areas of mathematical physics, control theory, robotics, and the life sciences, with the aim of developing the new research area in a concentrated joint effort, both from the theoretical and applied points of view. The lectures were given by leading specialists in different areas of mathematics and its applications, building bridges among the various communities involved and working jointly on developing the envisaged new interdisciplinary subject of stochastic geometric mechanics.
A ground-breaking and practical treatment of probability and stochastic processes. "A Modern Theory of Random Variation" is a new and radical re-formulation of the mathematical underpinnings of subjects as diverse as investment, communication engineering, and quantum mechanics. Setting aside the classical theory of probability measure spaces, the book utilizes a mathematically rigorous version of the theory of random variation that bases itself exclusively on finitely additive probability distribution functions. In place of twentieth century Lebesgue integration and measure theory, the author uses the simpler concept of Riemann sums, and the non-absolute Riemann-type integration of Henstock. Readers are supplied with an accessible approach to standard elements of probability theory such as the central limit theorem and Brownian motion as well as remarkable, new results on Feynman diagrams and stochastic integrals. Throughout the book, detailed numerical demonstrations accompany the discussions of abstract mathematical theory, from the simplest elements of the subject to the most complex. In addition, an array of numerical examples and vivid illustrations showcase how the presented methods and applications can be undertaken at various levels of complexity. "A Modern Theory of Random Variation" is a suitable book for courses on mathematical analysis, probability theory, and mathematical finance at the upper-undergraduate and graduate levels. The book is also an indispensable resource for researchers and practitioners who are seeking new concepts, techniques and methodologies in data analysis, numerical calculation, and financial asset valuation. Patrick Muldowney, PhD, served as lecturer at the Magee Business School of the University of Ulster for over twenty years. Dr. Muldowney has published extensively in his areas of research, including integration theory, financial mathematics, and random variation.
This book presents the proceedings of the international conference Particle Systems and Partial Differential Equations I, which took place at the Centre of Mathematics of the University of Minho, Braga, Portugal, from the 5th to the 7th of December, 2012. The purpose of the conference was to bring together world leaders to discuss their topics of expertise and to present some of their latest research developments in those fields. Among the participants were researchers in probability, partial differential equations and kinetic theory. The aim of the meeting was to present to a varied public the subject of interacting particle systems, its motivation from the viewpoint of physics and its relation with partial differential equations or kinetic theory, and to stimulate discussions and possibly new collaborations among researchers with different backgrounds. The book contains lecture notes written by Francois Golse on the derivation of hydrodynamic equations (compressible and incompressible Euler and Navier-Stokes) from the Boltzmann equation, and several short papers written by some of the participants in the conference. Among the topics covered by the short papers are hydrodynamic limits; fluctuations; phase transitions; motions of shocks and antishocks in exclusion processes; large number asymptotics for systems with self-consistent coupling; quasi-variational inequalities; unique continuation properties for PDEs and others. The book will benefit probabilists, analysts and mathematicians who are interested in statistical physics, stochastic processes, partial differential equations and kinetic theory, along with physicists.
This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors' website.
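As a concrete illustration of the rank-correlation idea mentioned above (this sketch and its function name are mine, not code from the book or its companion website), Kendall's tau can be computed directly from counts of concordant and discordant pairs of rankings:

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall rank correlation: (concordant - discordant) pairs,
    normalised by the total number of pairs n*(n-1)/2."""
    assert len(x) == len(y)
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1   # the pair is ordered the same way in both rankings
        elif s < 0:
            discordant += 1   # the pair is ordered oppositely
    n_pairs = len(x) * (len(x) - 1) / 2
    return (concordant - discordant) / n_pairs

# Two judges ranking the same five items; they disagree only on items 4 and 5,
# so one pair out of ten is discordant: tau = (9 - 1) / 10 = 0.8.
print(kendall_tau([1, 2, 3, 4, 5], [1, 2, 3, 5, 4]))  # → 0.8
```

Distance-based definitions of rank correlation, as in the book, measure disagreement between orderings in essentially this way.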
Since the early eighties, Ali Suleyman Ustunel has been one of the main contributors to the field of Malliavin calculus. In a workshop held in Paris in June 2010, several prominent researchers gave exciting talks in honor of his 60th birthday. The present volume includes scientific contributions from this workshop.
In real-life decision-making situations it is necessary to make decisions with incomplete information and often uncertain outcomes. In "Decision-Making Under Uncertainty," Dr. Chacko applies his years of statistical research and experience to the analysis of twenty-four real-life decision-making situations, both those with few data points (e.g., the Cuban Missile Crisis) and those with many data points (e.g., aspirin for heart attack prevention). These situations encompass decision-making in a variety of business, social and political, physical and biological, and military environments. Though different, all of these have one characteristic in common: their outcomes are uncertain, unknown, and unknowable. Chacko demonstrates how the decision-maker can reduce uncertainty by choosing probable outcomes using the statistical methods he introduces. This detailed volume develops standard statistical concepts (t, chi-square, the normal distribution, ANOVA) and less familiar ones (logical probability, subjective probability, Bayesian inference, Penalty for Non-Fulfillment, the Bluff-Threats Matrix, etc.). Chacko also offers a thorough discussion of the underlying theoretical principles. The end of each chapter contains a set of questions, three quarters of which focus on concepts, formulation, conclusions, resource commitments, and caveats; only one quarter involve computations. Ideal for the practitioner, the work is also designed to serve as the primary text for graduate or advanced undergraduate courses in statistics and decision science.
This book introduces the ade4 package for R which provides multivariate methods for the analysis of ecological data. It is implemented around the mathematical concept of the duality diagram, and provides a unified framework for multivariate analysis. The authors offer a detailed presentation of the theoretical framework of the duality diagram and also of its application to real-world ecological problems. These two goals may seem contradictory, as they concern two separate groups of scientists, namely statisticians and ecologists. However, statistical ecology has become a scientific discipline of its own, and the good use of multivariate data analysis methods by ecologists implies a fair knowledge of the mathematical properties of these methods. The organization of the book is based on ecological questions, but these questions correspond to particular classes of data analysis methods. The first chapters present both usual and multiway data analysis methods. Further chapters are dedicated, for example, to the analysis of spatial data, of phylogenetic structures, and of biodiversity patterns. One chapter deals with multivariate data analysis graphs. In each chapter, the basic mathematical definitions of the methods and the outputs of the R functions available in ade4 are detailed in two different boxes. The text of the book itself can be read independently from these boxes. Thus the book offers the opportunity to find information about the ecological situation from which a question arises alongside the mathematical properties of methods that can be applied to answer this question, as well as the details of software outputs. Each example and all the graphs in this book come with executable R code.
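The duality-diagram idea can be sketched outside R as well (this is my illustrative Python sketch, not part of ade4): a statistical triplet (X, Q, D) consists of a data table X, a metric Q on the columns, and diagonal row weights D, and an analysis diagonalises the operator X'DXQ. With an identity column metric and uniform row weights on a centred table this reduces to ordinary PCA:

```python
import numpy as np

# A statistical triplet (X, Q, D): data table X (n rows/sites, p columns/variables),
# a metric Q on the column space, and a diagonal weight matrix D on the rows.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
Xc = X - X.mean(axis=0)           # centre each variable

n, p = Xc.shape
Q = np.eye(p)                     # identity column metric
D = np.eye(n) / n                 # uniform row weights

op = Xc.T @ D @ Xc @ Q            # the operator diagonalised by the duality diagram
eigvals = np.sort(np.linalg.eigvalsh(op))[::-1]

# The same quantities from a plain SVD-based PCA: squared singular values / n.
s = np.linalg.svd(Xc, compute_uv=False)
print(np.allclose(eigvals, (s ** 2) / n))  # → True
```

Other choices of Q and D in the triplet yield the other analyses the book covers (correspondence analysis, weighted PCAs, and so on) within the same framework.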
This book presents a unique study of Integrative Problem-Solving (IPS). The consideration of 'Decadence' is essential in the scientific study of environmental and other problems and their rigorous solution, because the broad context within which the problems emerge can affect their solution. Stochastic reasoning underlies the conceptual and methodological framework of IPS, and its formulation has a mathematical life of its own that accounts for the multidisciplinarity of real world problems, the multisourced uncertainties characterizing their solution, and the different thinking modes of the people involved. Only by interpolating between the full range of disciplines (including stochastic mathematics, physical science, neuropsychology, philosophy, and sociology) and the associated thinking modes can scientists arrive at a satisfactory account of problem-solving, and be able to distinguish between a technically complete problem-solution and a solution that has social impact.
This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website, and will become an essential grounding in this approach for students and research ecologists.
Aside from distribution theory, projections and the singular value decomposition (SVD) are the two most important concepts for understanding the basic mechanism of multivariate analysis. The former underlies the least squares estimation in regression analysis, which is essentially a projection of one subspace onto another, and the latter underlies principal component analysis, which seeks to find a subspace that captures the largest variability in the original space. This book is about projections and SVD. A thorough discussion of generalized inverse (g-inverse) matrices is also given because it is closely related to the former. The book provides systematic and in-depth accounts of these concepts from a unified viewpoint of linear transformations on finite-dimensional vector spaces. More specifically, it shows that projection matrices (projectors) and g-inverse matrices can be defined in various ways so that a vector space is decomposed into a direct sum of (disjoint) subspaces. Projection Matrices, Generalized Inverse Matrices, and Singular Value Decomposition will be useful for researchers, practitioners, and students in applied mathematics, statistics, engineering, behaviormetrics, and other fields.
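The two mechanisms described above can be demonstrated in a few lines of NumPy (an illustrative sketch of the standard constructions, not material from the book): least squares fitted values arise from an idempotent projection matrix onto the column space of the design matrix, and the SVD of centred data delivers the principal directions ordered by the variability they capture:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))     # design matrix (full column rank)
y = rng.normal(size=50)

# Least squares as projection: fitted values are P @ y, where
# P = X (X'X)^{-1} X' projects onto the column space of X.
P = X @ np.linalg.inv(X.T @ X) @ X.T
fitted = P @ y
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(fitted, X @ beta))   # → True
print(np.allclose(P @ P, P))           # projectors are idempotent → True

# PCA via SVD: the singular values of the centred data come back sorted,
# so the leading right singular vector spans the direction of largest variance.
Z = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
print(bool(s[0] >= s[1] >= s[2]))      # → True
```

The explicit inverse is used here only to show the projector's algebraic form; in practice `lstsq` (or a QR factorization) is the numerically preferred route.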
Baseball fans are often passionate about statistics, but true numbers fanatics want to go beyond the 'baseball card' stats and make comparisons through other objective means. "Sabermetrics" uses algebra to expand on statistics and measure a player's value to his team and how he ranks among players of different eras. The mathematical models in this book, a follow-up to "Understanding Sabermetrics" (2008), define the measures, supply examples, and provide practice problems for readers.
Sensor Data Fusion is the process of combining incomplete and imperfect pieces of mutually complementary sensor information in such a way that a better understanding of an underlying real-world phenomenon is achieved. Typically, this insight is either unobtainable otherwise or a fusion result exceeds what can be produced from a single sensor output in accuracy, reliability, or cost. This book provides an introduction to Sensor Data Fusion, as an information technology as well as a branch of engineering science and informatics. Part I presents a coherent methodological framework, thus providing the prerequisites for discussing selected applications in Part II of the book. The presentation mirrors the author's views on the subject and emphasizes his own contributions to the development of particular aspects. With some delay, Sensor Data Fusion is likely to develop along lines similar to the evolution of another modern key technology whose origin is in the military domain, the Internet. It is the author's firm conviction that until now, scientists and engineers have only scratched the surface of the vast range of opportunities for research, engineering, and product development that still waits to be explored: the Internet of the Sensors.
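A minimal sketch of the fusion idea (the function and example values are hypothetical illustrations, not taken from the book): independent noisy measurements of the same quantity can be combined by inverse-variance weighting, and the fused estimate is more precise than any single sensor's reading:

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent (value, variance)
    measurements of the same quantity. The fused variance is never larger
    than the smallest single-sensor variance."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total

# Two sensors observing the same quantity with equal confidence: the fused
# value is their average, and the fused variance is halved.
est, var = fuse([(1.0, 1.0), (3.0, 1.0)])
print(est, var)  # → 2.0 0.5
```

This is the static special case of the measurement-update step found in Kalman-style filtering, which generalizes the same weighting to dynamic state estimation.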
This volume presents the latest advances and trends in nonparametric statistics, and gathers selected and peer-reviewed contributions from the 3rd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Avignon, France on June 11-16, 2016. It covers a broad range of nonparametric statistical methods, from density estimation, survey sampling, resampling methods, kernel methods and extreme values, to statistical learning and classification, both in the standard i.i.d. case and for dependent data, including big data. The International Society for Nonparametric Statistics is uniquely global, and its international conferences are intended to foster the exchange of ideas and the latest advances among researchers from around the world, in cooperation with established statistical societies such as the Institute of Mathematical Statistics, the Bernoulli Society and the International Statistical Institute. The 3rd ISNPS conference in Avignon attracted more than 400 researchers from around the globe, and contributed to the further development and dissemination of nonparametric statistics knowledge.
This book is devoted to the scientific legacy of Professor Victor Ambartsumian, one of the distinguished scientists of the last century. He obtained essential results not only in astrophysics, but also in mathematics and theoretical physics. One can recall his fundamental results concerning the Sturm-Liouville inverse problem, quantum field theory, the structure of atomic nuclei, etc. Nevertheless, his revolutionary ideas in astrophysics and the corresponding results are more widely known and have predetermined the further development of this science. His concepts of activity phenomena and objects' evolution, in particular the determination of the age of our Galaxy, ideas about ongoing star formation in stellar associations, and the activity of galactic nuclei, have proved exceptionally fruitful. These directions are being elaborated at many astronomical centers all over the world.
This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a "theorem-proof" format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible, major supporting concepts and specialized tables are appended at the end of every chapter. In addition, each chapter concludes with a set of key terms and concepts, as well as a summary of the main findings. Lastly, the book offers numerous theoretical and empirical exercises, with answers provided by the author in an extensive solutions manual.
This book presents various recently developed and traditional statistical techniques, which are increasingly being applied in social science research. The social sciences cover diverse phenomena arising in society, the economy and the environment, some of which are too complex to allow concrete statements; some cannot be defined by direct observations or measurements; some are culture- (or region-) specific, while others are generic and common. Statistics, being a scientific method - as distinct from a 'science' related to any one type of phenomena - is used to make inductive inferences regarding various phenomena. The book addresses both qualitative and quantitative research (a combination of which is essential in social science research) and offers valuable supplementary reading at an advanced level for researchers.
In this thesis, the author develops numerical techniques for tracking and characterising the convoluted nodal lines in three-dimensional space, analysing their geometry on the small scale, as well as their global fractality and topological complexity, including knotting, on the large scale. The work is highly visual, and illustrated with many beautiful diagrams revealing this unanticipated aspect of the physics of waves. Linear superpositions of waves create interference patterns, which means in some places they strengthen one another, while in others they completely cancel each other out. This latter phenomenon occurs on 'vortex lines' in three dimensions. In general wave superpositions modelling, e.g., chaotic cavity modes, these vortex lines form dense tangles that have never been visualised on the large scale before, and cannot be analysed mathematically by any known techniques.
This book focuses on the application and development of information geometric methods in the analysis, classification and retrieval of images and signals. It provides introductory chapters to help those new to information geometry and applies the theory to several applications. This area has developed rapidly over recent years, propelled by the major theoretical developments in information geometry, efficient data and image acquisition and the desire to process and interpret large databases of digital information. The book addresses both the transfer of methodology to practitioners involved in database analysis and in its efficient computational implementation.
The research articles in this volume cover timely quantitative psychology topics, including new methods in item response theory, computerized adaptive testing, cognitive diagnostic modeling, and psychological scaling. Topics within general quantitative methodology include structural equation modeling, factor analysis, causal modeling, mediation, missing data methods, and longitudinal data analysis. These methods will appeal, in particular, to researchers in the social sciences. The 80th annual meeting took place in Beijing, China, between the 12th and 16th of July, 2015. Previous volumes to showcase work from the Psychometric Society's Meeting are New Developments in Quantitative Psychology: Presentations from the 77th Annual Psychometric Society Meeting (Springer, 2013), Quantitative Psychology Research: The 78th Annual Meeting of the Psychometric Society (Springer, 2015), and Quantitative Psychology Research: The 79th Annual Meeting of the Psychometric Society, Wisconsin, USA, 2014 (Springer, 2015).