A critical yet constructive description of the rich analytical techniques and substantive applications that typify how statistical thinking has been applied at the RAND Corporation over the past two decades. Case studies of public policy problems are useful for teaching because they are familiar: almost everyone knows something about health insurance, global warming, and capital punishment, to name but a few of the applications covered in this casebook. Each case study has a common format that describes the policy questions, the statistical questions, and the successful and the unsuccessful analytic strategies. Readers should be familiar with basic statistical concepts including sampling and regression. While designed for statistics courses in areas ranging from economics to health policy to the law at both the advanced undergraduate and graduate levels, empirical researchers and policy-makers will also find this casebook informative.
Stochastic analysis is a field of mathematical research having numerous interactions with other domains of mathematics such as partial differential equations, Riemannian path spaces, dynamical systems, and optimization. It also has many links with applications in engineering, finance, quantum physics, and other fields. This book covers recent and diverse aspects of stochastic and infinite-dimensional analysis. The included papers are written from a variety of standpoints (white noise analysis, Malliavin calculus, quantum stochastic calculus) by the contributors, and provide a broad coverage of the subject. This volume will be useful to graduate students and research mathematicians wishing to get acquainted with recent developments in the field of stochastic analysis.
Copulas are functions that join multivariate distribution functions to their one-dimensional margins. The study of copulas and their role in statistics is a new but vigorously growing field. In this book the student or practitioner of statistics and probability will find discussions of the fundamental properties of copulas and some of their primary applications. The applications include the study of dependence and measures of association, and the construction of families of bivariate distributions. With nearly a hundred examples and over 150 exercises, this book is suitable as a text or for self-study. The only prerequisite is an upper level undergraduate course in probability and mathematical statistics, although some familiarity with nonparametric statistics would be useful. Knowledge of measure-theoretic probability is not required. Roger B. Nelsen is Professor of Mathematics at Lewis & Clark College in Portland, Oregon. He is also the author of "Proofs Without Words: Exercises in Visual Thinking," published by the Mathematical Association of America.
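The defining property described above, uniform margins joined by a dependence function, can be illustrated with a short sketch. This is not taken from the book; it is a minimal Python illustration (assuming numpy is available; the function name is ours) of sampling from a Gaussian copula, one of the standard parametric families:

```python
import numpy as np
from math import erf, sqrt

def gaussian_copula_sample(rho, n, seed=0):
    """Draw n pairs (u, v) from a bivariate Gaussian copula with correlation rho.

    Correlated normals are generated first, then each margin is pushed
    through the standard normal CDF, leaving uniform(0, 1) margins whose
    dependence is captured entirely by the copula.
    """
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    phi = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))
    u, v = phi(z[:, 0]), phi(z[:, 1])
    return u, v

u, v = gaussian_copula_sample(rho=0.8, n=5000)
# Margins are approximately uniform: mean near 0.5, variance near 1/12,
# while u and v remain strongly dependent.
```

Applying any pair of inverse marginal CDFs to (u, v) then yields a bivariate distribution with those margins and this dependence structure, which is the construction the blurb refers to.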
The purpose of this book is to honor the fundamental contributions to many different areas of statistics made by Barry Arnold. Distinguished and active researchers highlight some of the recent developments in statistical distribution theory, order statistics and their properties, as well as inferential methods associated with them. Applications to survival analysis, reliability, quality control, and environmental problems are emphasized.
Stochastic geometry is the branch of mathematics that studies geometric structures associated with random configurations, such as random graphs, tilings and mosaics. Due to its close ties with stereology and spatial statistics, the results in this area are relevant for a large number of important applications, e.g. to the mathematical modeling and statistical analysis of telecommunication networks, geostatistics and image analysis. In recent years - due mainly to the impetus of the authors and their collaborators - a powerful connection has been established between stochastic geometry and the Malliavin calculus of variations, which is a collection of probabilistic techniques based on the properties of infinite-dimensional differential operators. This has led in particular to the discovery of a large number of new quantitative limit theorems for high-dimensional geometric objects. This unique book presents an organic collection of authoritative surveys written by the principal actors in this rapidly evolving field, offering a rigorous yet lively presentation of its many facets.
Numerical methods in finance have emerged as a vital field at the crossroads of probability theory, finance and numerical analysis. Based on presentations given at the workshop Numerical Methods in Finance held at the INRIA Bordeaux (France) on June 1-2, 2010, this book provides an overview of the major new advances in the numerical treatment of instruments with American exercises. Naturally it covers the most recent research on the mathematical theory and the practical applications of optimal stopping problems as they relate to financial applications. By extension, it also provides an original treatment of Monte Carlo methods for the recursive computation of conditional expectations and solutions of BSDEs and generalized multiple optimal stopping problems and their applications to the valuation of energy derivatives and assets. The articles were carefully written in a pedagogical style and a reasonably self-contained manner. The book is geared toward quantitative analysts, probabilists, and applied mathematicians interested in financial applications.
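The recursive computation of conditional expectations mentioned above is the core of least-squares Monte Carlo for optimal stopping. As a hedged sketch, not taken from the volume, the following Python fragment prices an American put in the Longstaff-Schwartz style; the parameter values and polynomial basis are illustrative choices:

```python
import numpy as np

def american_put_lsmc(s0, k, r, sigma, t, steps, paths, seed=0):
    """Price an American put by least-squares Monte Carlo (Longstaff-Schwartz).

    Continuation values are estimated by regressing discounted future
    cashflows on polynomials of the asset price, then compared with
    immediate exercise at each step, working backwards in time.
    """
    rng = np.random.default_rng(seed)
    dt = t / steps
    disc = np.exp(-r * dt)
    # Simulate geometric Brownian motion paths of the underlying.
    z = rng.standard_normal((paths, steps))
    increments = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    s = s0 * np.exp(np.cumsum(increments, axis=1))
    cash = np.maximum(k - s[:, -1], 0.0)  # payoff at maturity
    for i in range(steps - 2, -1, -1):
        cash *= disc                      # discount one step back
        itm = k - s[:, i] > 0.0           # regress on in-the-money paths only
        if itm.sum() > 3:
            coeffs = np.polyfit(s[itm, i], cash[itm], 2)
            cont = np.polyval(coeffs, s[itm, i])
            exercise = k - s[itm, i]
            cash[itm] = np.where(exercise > cont, exercise, cash[itm])
    return disc * cash.mean()

price = american_put_lsmc(s0=36.0, k=40.0, r=0.06, sigma=0.2,
                          t=1.0, steps=50, paths=20000)
```

The same backward regression idea extends to the multiple optimal stopping problems and BSDE solvers that the book treats.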
..".the text is user friendly to the topics it considers and should be very accessible...Instructors and students of statistical measure theoretic courses will appreciate the numerous informative exercises; helpful hints or solution outlines are given with many of the problems. All in all, the text should make a useful reference for professionals and students."-The Journal of the American Statistical Association
Since the beginning of the seventies, computer hardware has been available to use programmable computers for various tasks. During the nineties the hardware developed from big mainframes to personal workstations. Nowadays it is not only the hardware which is much more powerful: compared to the seventies, workstations can do much more work than a mainframe could then. In parallel we find a specialization in the software. Languages like COBOL for business-orientated programming or Fortran for scientific computing only marked the beginning. The introduction of personal computers in the eighties gave new impulses for even further development; already at the beginning of the seventies some special languages like SAS or SPSS were available for statisticians. Now that personal computers have become very popular, the number of programs has started to explode. Today we find a wide variety of programs for almost any statistical purpose (Koch & Haag 1995).
This book deals with the theory and the applications of a new time domain, termed the natural time domain, that was put forward by the authors almost a decade ago (P.A. Varotsos, N.V. Sarlis and E.S. Skordas, Practica of Athens Academy 76, 294-321, 2001; Physical Review E 66, 011902, 2002). In particular, it has been found that novel dynamical features hidden behind time series in complex systems can emerge upon analyzing them in this new time domain, which conforms to the desire to reduce uncertainty and extract signal information as much as possible. The analysis in natural time enables the study of the dynamical evolution of a complex system and identifies when the system enters a critical stage. Hence, natural time plays a key role in predicting impending catastrophic events in general. Relevant examples of data analysis in this new time domain have been published during the last decade in a large variety of fields, e.g., Earth Sciences, Biology and Physics. The book explains in detail a series of such examples, including the identification of sudden cardiac death risk in Cardiology, the recognition of electric signals that precede earthquakes, the determination of the time of an impending major mainshock in Seismology, and the analysis of the avalanches of the penetration of magnetic flux into thin films of type II superconductors in Condensed Matter Physics. In general, this book is concerned with the time-series analysis of signals emitted from complex systems by means of the new time domain and provides advanced students and research workers in diverse fields with a sound grounding in the fundamentals of current research work on detecting (long-range) correlations in complex time series. Furthermore, the modern techniques of Statistical Physics in time series analysis, for example Hurst analysis, detrended fluctuation analysis, the wavelet transform, etc., are presented along with their advantages when the natural time domain is employed.
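In the authors' natural time analysis, the k-th of N events is read at natural time chi_k = k/N and weighted by its normalized energy p_k, and the variance kappa_1 = <chi^2> - <chi>^2 is a central quantity. A minimal Python sketch of that computation (the function name is ours, and this is an illustration rather than the book's own code):

```python
import numpy as np

def kappa1(energies):
    """Variance of natural time: kappa_1 = <chi^2> - <chi>^2.

    The k-th of N events is placed at natural time chi_k = k/N and
    weighted by its normalized energy p_k = Q_k / sum(Q), so kappa_1
    measures how the energy release is distributed over the ordering
    of the events.
    """
    q = np.asarray(energies, dtype=float)
    n = len(q)
    p = q / q.sum()                   # normalized energies p_k
    chi = np.arange(1, n + 1) / n     # natural time chi_k = k/N
    return np.sum(p * chi**2) - np.sum(p * chi)**2

# For equal event energies kappa_1 approaches 1/12 (about 0.0833) as N grows.
value = kappa1(np.ones(1000))
```

Departures of kappa_1 from that uniform-energy baseline are the kind of signature the analysis uses to flag a system approaching criticality.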
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."--The UMAP Journal
Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages has made it possible for today's statisticians to do more in less time than ever before. The Third Edition of this bestselling text reflects how these changes in the computing environment have transformed the way statistical analyses are performed today. Based on extensive input from university statistics departments throughout the country, the authors have made several important and timely revisions, including:
- Additional material on probability appearing early in the text
- New sections on odds ratios, ratio and difference estimation, repeated measures analysis, and logistic regression
- New examples and exercises, many from the field of the health sciences
- Printouts of computer analyses on all complex procedures
- An accompanying Web site illustrating how to use SAS(R) and JMP(R) for all procedures
The text features the most commonly used statistical techniques for the analysis of research data. As in the earlier editions, emphasis is placed on how to select the proper statistical procedure and how to interpret results. Whenever possible, to avoid using the computer as a "black box" that performs a mysterious process on the data, actual computational procedures are also given.
A must for scientists who analyze data, professionals and researchers who need a self-teaching text, and graduate students in statistical methods, Statistics for Research, Third Edition brings the methodology up to date in a very practical and accessible way.
The purpose of this volume is to provide an overview of Terry Speed's contributions to statistics and beyond. Each of the fifteen chapters concerns a particular area of research and consists of a commentary by a subject-matter expert and a selection of representative papers. The chapters, organized more or less chronologically in terms of Terry's career, encompass a wide variety of mathematical and statistical domains, along with their application to biology and medicine. Accordingly, earlier chapters tend to be more theoretical, covering some algebra and probability theory, while later chapters concern more recent work in genetics and genomics. The chapters also span continents and generations, as they present research done over four decades, while crisscrossing the globe. The commentaries provide insight into Terry's contributions to a particular area of research, by summarizing his work and describing its historical and scientific context, motivation, and impact. In addition to shedding light on Terry's scientific achievements, the commentaries reveal endearing aspects of his personality, such as his intellectual curiosity, energy, humor, and generosity.
This book contains a rich set of tools for nonparametric analyses, and the purpose of this text is to provide guidance to students and professional researchers on how R is used for nonparametric data analysis in the biological sciences:
- To introduce when nonparametric approaches to data analysis are appropriate
- To introduce the leading nonparametric tests commonly used in biostatistics and how R is used to generate appropriate statistics for each test
- To introduce common figures typically associated with nonparametric data analysis and how R is used to generate appropriate figures in support of each data set
The book focuses on how R is used to distinguish between data that could be classified as nonparametric as opposed to data that could be classified as parametric, with both approaches to data classification covered extensively. Following an introductory lesson on nonparametric statistics for the biological sciences, the book is organized into eight self-contained lessons on various analyses and tests using R to broadly compare differences between data sets and statistical approach.
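The book itself works in R, but the flavour of the rank-based tests it covers can be sketched in a few lines of Python. Below is a hand-rolled Mann-Whitney U statistic, one of the leading nonparametric tests mentioned above (a stand-in for illustration, not the book's code):

```python
import numpy as np

def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for comparing two independent samples.

    Counts, over all pairs, how often a value in sample a exceeds a value
    in sample b (ties count one half). Unlike a t-test, no normality
    assumption is made about the underlying distributions.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    greater = np.sum(a[:, None] > b[None, :])
    ties = np.sum(a[:, None] == b[None, :])
    return greater + 0.5 * ties

# Every value in the first sample exceeds every value in the second,
# so U equals the number of pairs, 3 * 2 = 6.
u = mann_whitney_u([1.1, 2.4, 3.0], [0.9, 1.0])
```

In practice one would compare U (or its normal approximation for larger samples) against the null distribution to obtain a p-value, which is what R's wilcox.test does.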
Statistical Analysis of Observations of Increasing Dimension is devoted to the investigation of the limit distributions of the empirical generalized variance, covariance matrices, their eigenvalues, and solutions of systems of linear algebraic equations with random coefficients, which are important functions of observations in multidimensional statistical analysis. A general statistical analysis is developed in which the observed random vectors may not have a density and their components may have an arbitrary dependence structure. The methods of this theory have very important advantages in comparison with existing methods of statistical processing. The results have applications in nuclear and statistical physics, in multivariate statistical analysis, in the theory of the stability of solutions of stochastic differential equations, in the control theory of linear stochastic systems, in linear stochastic programming, and in the theory of experiment planning.
Praise for the First Edition "If you ... want an up-to-date, definitive reference written by authors who have contributed much to this field, then this book is an essential addition to your library." --Journal of the American Statistical Association
A COMPREHENSIVE REVIEW OF MODERN EXPERIMENTAL DESIGN
Experiments: Planning, Analysis, and Optimization, Third Edition provides a complete discussion of modern experimental design for product and process improvement--the design and analysis of experiments and their applications for system optimization, robustness, and treatment comparison. While maintaining the same easy-to-follow style as the previous editions, this book continues to present an integrated system of experimental design and analysis that can be applied across various fields of research, including engineering, medicine, and the physical sciences. New chapters provide modern updates on practical optimal design and computer experiments, with an explanation of computer simulations as an alternative to physical experiments. Each chapter begins with a real-world example of an experiment, followed by the methods required to design that type of experiment. The chapters conclude with an application of the methods to the experiment, bridging the gap between theory and practice. The authors modernize accepted methodologies while refining many cutting-edge topics, including robust parameter design, analysis of non-normal data, analysis of experiments with complex aliasing, multilevel designs, minimum aberration designs, and orthogonal arrays. The third edition includes:
- Information on the design and analysis of computer experiments
- A discussion of practical optimal design of experiments
- An introduction to conditional main effect (CME) analysis and definitive screening designs (DSDs)
- New exercise problems
This book includes valuable exercises and problems, allowing readers to gauge their progress and retention of the book's subject matter as they complete each chapter.
Drawing on examples from their combined years of working with industrial clients, the authors present many cutting-edge topics in a single, easily accessible source. Extensive case studies, including goals, data, and experimental designs, are also included, and the book's data sets can be found on a related FTP site, along with additional supplemental material. Chapter summaries provide a succinct outline of discussed methods, and extensive appendices direct readers to resources for further study. Experiments: Planning, Analysis, and Optimization, Third Edition is an excellent book for design of experiments courses at the upper-undergraduate and graduate levels. It is also a valuable resource for practicing engineers and statisticians.
A comprehensive account of the statistical theory of exponential families of stochastic processes. The book reviews the progress in the field made over the last ten years or so by the authors - two of the leading experts in the field - and several other researchers. The theory is applied to a broad spectrum of examples, covering a large number of frequently applied stochastic process models with discrete as well as continuous time. To make the reading even easier for statisticians with only a basic background in the theory of stochastic processes, the first part of the book is based on classical theory of stochastic processes only, while stochastic calculus is used later. Most of the concepts and tools from stochastic calculus needed when working with inference for stochastic processes are introduced and explained without proof in an appendix. This appendix can also be used independently as an introduction to stochastic calculus for statisticians. Numerous exercises are also included.
Experience gained during a ten-year long involvement in modelling, programming and application in nonlinear optimization helped me to arrive at the conclusion that, in the interest of having successful applications and efficient software production, knowing the structure of the problem to be solved is indispensable. This is the reason why I have chosen the field in question as the sphere of my research. Since in applications, mainly from among the nonconvex optimization models, the differentiable ones proved to be the most efficient in modelling, especially in solving them with computers, I started to deal with the structure of smooth optimization problems. The book, which is the result of more than a decade of research, can be equally useful for researchers and students showing interest in the domain, since the elementary notions necessary for understanding the book constitute a part of the university curriculum. I intended to deal with the key questions of optimization theory, which endeavour, obviously, cannot bear all the marks of completeness. What I consider the most crucial point is the uniform, differential geometric treatment of various questions, which provides the reader with opportunities for learning the structure in the wide range, within optimization problems. I am grateful to my family for affording me tranquil, productive circumstances. I express my gratitude to F.
The subject theory is important in finance, economics, investment strategies, health sciences, environment, industrial engineering, etc.
'Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé.' - Jules Verne
One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled 'discarded nonsense'. - Eric T. Bell
The series is divergent; therefore we may be able to do something with it. - O. Heaviside
Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and non-linearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quotes above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'etre of this series.
This volume contains twenty-eight refereed research or review papers presented at the 5th Seminar on Stochastic Processes, Random Fields and Applications, which took place at the Centro Stefano Franscini (Monte Verita) in Ascona, Switzerland, from May 30 to June 3, 2005. The seminar focused mainly on stochastic partial differential equations, random dynamical systems, infinite-dimensional analysis, approximation problems, and financial engineering. The book will be a valuable resource for researchers in stochastic analysis and professionals interested in stochastic methods in finance.
In recent years, there has been an upsurge of interest in using techniques drawn from probability to tackle problems in analysis. These applications arise in subjects such as potential theory, harmonic analysis, singular integrals, and the study of analytic functions. This book presents a modern survey of these methods at the level of a beginning Ph.D. student. Highlights of this book include the construction of the Martin boundary, probabilistic proofs of the boundary Harnack principle, Dahlberg's theorem, a probabilistic proof of Riesz' theorem on the Hilbert transform, and Makarov's theorems on the support of harmonic measure. The author assumes that the reader has some background in basic real analysis, but the book includes proofs of all the results from probability theory and advanced analysis required. Each chapter concludes with exercises ranging from the routine to the difficult. In addition, discussions of open problems and further avenues of research are included.
An exploration of the use of smoothing methods in testing the fit of parametric regression models. The book reviews many of the existing methods for testing lack-of-fit and also proposes a number of new methods, addressing both applied and theoretical aspects of the model checking problems. As such, the book is of interest to practitioners of statistics and researchers investigating either lack-of-fit tests or nonparametric smoothing ideas. The first four chapters introduce the problem of estimating regression functions by nonparametric smoothers, primarily those of kernel and Fourier series type, and could be used as the foundation for a graduate level course on nonparametric function estimation. The prerequisites for a full appreciation of the book are a modest knowledge of calculus and some familiarity with the basics of mathematical statistics.
Around the world a multitude of surveys are conducted every day, on a variety of subjects, and consequently surveys have become an accepted part of modern life. However, in recent years survey estimates have been increasingly affected by rising trends in nonresponse, with loss of accuracy as an undesirable result. Whilst it is possible to reduce nonresponse to some degree, it cannot be completely eliminated. Estimation techniques that account systematically for nonresponse and at the same time succeed in delivering acceptable accuracy are much needed. "Estimation in Surveys with Nonresponse" provides an overview of these techniques, presenting the view of nonresponse as a normal (albeit undesirable) feature of a sample survey, one whose potentially harmful effects are to be minimised. The book:
- Builds in the nonresponse feature of survey data collection as an integral part of the theory, both for point estimation and for variance estimation
- Promotes weighting through calibration as a new and powerful technique for surveys with nonresponse
- Highlights the analysis of nonresponse bias in estimates and methods to minimize this bias
- Includes computational tools to help identify the best variables for calibration
- Discusses the use of imputation as a complement to weighting by calibration
- Contains guidelines for dealing with frame imperfections and coverage errors
- Features worked examples throughout the text, using real data
The accessible style of "Estimation in Surveys with Nonresponse" will make this an invaluable tool for survey methodologists in national statistics agencies and private survey agencies. Researchers, teachers, and students of statistics, social sciences, and economics will benefit from the clear presentation and numerous examples.
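Calibration weighting, which the book promotes, can be illustrated in its simplest linear form: scale respondents' design weights so that the weighted total of an auxiliary variable matches a known population total. A minimal numpy sketch under that assumption (the function and variable names are ours, not the book's notation):

```python
import numpy as np

def calibrate_weights(d, x, total_x):
    """Ratio calibration of design weights to one known auxiliary total.

    Respondent design weights d are scaled by a common factor g so that
    the weighted sum of the auxiliary variable x reproduces the known
    population total total_x, the simplest instance of calibration
    weighting used to counter nonresponse bias.
    """
    d = np.asarray(d, dtype=float)
    x = np.asarray(x, dtype=float)
    g = total_x / np.sum(d * x)   # common calibration factor
    return d * g

d = np.full(4, 10.0)                  # design weights of 4 respondents
x = np.array([2.0, 3.0, 5.0, 10.0])   # auxiliary variable, total known
w = calibrate_weights(d, x, total_x=250.0)
# The calibrated weights now reproduce the auxiliary total exactly:
# sum(w * x) == 250.0
```

With several auxiliary variables the common factor g is replaced by a regression-type adjustment, but the principle of matching known totals is the same.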
The Handbook of Mathematical Economics aims to provide a definitive source, reference, and teaching supplement for the field of mathematical economics. It surveys, as of the late 1970s, the state of the art of mathematical economics. This is a constantly developing field and all authors were invited to review and to appraise the current status and recent developments in their presentations. In addition to its use as a reference, it is intended that this Handbook will assist researchers and students working in one branch of mathematical economics to become acquainted with other branches of this field. The emphasis of this fourth volume of the Handbook of Mathematical Economics is on choice under uncertainty, general equilibrium analysis under conditions of uncertainty, economies with an infinite number of consumers or commodities, and dynamical systems. The book thus reflects some of the ideas that have been most influential in mathematical economics since the appearance of the first three volumes of the Handbook. Researchers, students, economists and mathematicians will all find this Handbook to be an indispensable reference source. It surveys the entire field of mathematical economics, critically reviewing recent developments. The chapters (which can be read independently) are written at an advanced level suitable for professional, teaching and graduate-level use. For more information on the Handbooks in Economics series,
please see our home page at http://www.elsevier.nl/locate/hes
Separation of signal from noise is the most fundamental problem in data analysis, arising in many fields, for example signal processing, econometrics, actuarial science, and geostatistics. This book introduces the local regression method in univariate and multivariate settings, along with extensions to local likelihood and density estimation. Basic theoretical results and diagnostic tools such as cross-validation are introduced along the way. Examples illustrate the implementation of the methods using the LOCFIT software.
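The local regression idea can be sketched without LOCFIT: around each target point, fit a weighted least-squares line with weights that decay in distance, and read off the fitted value at that point. A minimal Python illustration (a deliberate simplification of what LOCFIT implements; the tricube kernel and bandwidth are conventional choices):

```python
import numpy as np

def local_linear(x, y, x0, bandwidth):
    """Local linear regression estimate of the curve at x0.

    Observations near x0 receive large tricube weights, a weighted
    least-squares line is fitted, and its intercept (the value at x0)
    is the local regression estimate.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    u = np.abs(x - x0) / bandwidth
    w = np.where(u < 1.0, (1.0 - u**3) ** 3, 0.0)   # tricube weights
    X = np.column_stack([np.ones_like(x), x - x0])
    # Weighted least squares: solve (X' W X) beta = X' W y.
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]  # intercept = fitted value at x0

rng = np.random.default_rng(1)
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(200)
fit = local_linear(x, y, x0=np.pi / 2.0, bandwidth=1.0)
# fit should track sin(pi/2) = 1 up to noise and smoothing bias.
```

Sweeping x0 over a grid traces out the full smoothed curve; choosing the bandwidth is exactly the kind of decision that the cross-validation diagnostics in the book address.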