Computationally intensive methods have become widely used both for statistical inference and for exploratory analyses of data. The methods of computational statistics involve resampling, partitioning, and multiple transformations of a dataset. They may also make use of randomly generated artificial data. Implementation of these methods often requires advanced techniques in numerical analysis, so there is a close connection between computational statistics and statistical computing. This book describes techniques used in computational statistics, and addresses some areas of application of computationally intensive methods, such as density estimation, identification of structure in data, and model building. Although methods of statistical computing are not emphasized in this book, numerical techniques for transformations, for function approximation, and for optimization are explained in the context of the statistical methods. The book includes exercises, some with solutions. The book can be used as a text or supplementary text for various courses in modern statistics at the advanced undergraduate or graduate level, and it can also be used as a reference for statisticians who use computationally-intensive methods of analysis. Although some familiarity with probability and statistics is assumed, the book reviews basic methods of inference, and so is largely self-contained. James Gentle is University Professor of Computational Statistics at George Mason University. He is a Fellow of the American Statistical Association and a member of the International Statistical Institute. He has held several national offices in the American Statistical Association and has served as associate editor for journals of the ASA as well as for other journals in statistics and computing. He is the author of Random Number Generation and Monte Carlo Methods and Numerical Linear Algebra for Statistical Applications.
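Density estimation, one of the application areas named above, lends itself to a short illustration. The sketch below is not taken from the book; the function name, sample, and bandwidth are ad hoc choices used only to show what a basic Gaussian kernel density estimator looks like in Python:

```python
import numpy as np

def gaussian_kde(data, grid, bandwidth):
    """Evaluate a Gaussian kernel density estimate of `data` at the points in `grid`."""
    # For each grid point, average the Gaussian kernels centered at the observations,
    # then rescale by the bandwidth: f(x) = (1 / (n h)) * sum_i K((x - x_i) / h).
    z = (grid[:, None] - data[None, :]) / bandwidth
    kernels = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / bandwidth

# Illustrative use: estimate the density of a small simulated normal sample.
rng = np.random.default_rng(0)
sample = rng.normal(loc=0.0, scale=1.0, size=200)
xs = np.linspace(-4.0, 4.0, 101)
density = gaussian_kde(sample, xs, bandwidth=0.4)
print(density.max())  # should be roughly 0.4, the peak of a standard normal density
```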
Monte Carlo simulation has become one of the most important tools in all fields of science. Simulation methodology relies on a good source of numbers that appear to be random. These "pseudorandom" numbers must pass statistical tests just as random samples would. Methods for producing pseudorandom numbers and transforming those numbers to simulate samples from various distributions are among the most important topics in statistical computing. This book surveys techniques of random number generation and the use of random numbers in Monte Carlo simulation. The book covers basic principles, as well as newer methods such as parallel random number generation, nonlinear congruential generators, quasi Monte Carlo methods, and Markov chain Monte Carlo. The best methods for generating random variates from the standard distributions are presented, and general techniques useful in more complicated models and in novel settings are also described. The emphasis throughout the book is on practical methods that work well in current computing environments. The book includes exercises and can be used as a text or supplementary text for various courses in modern statistics. It could serve as the primary text for a specialized course in statistical computing, or as a supplementary text for a course in computational statistics and other areas of modern statistics that rely on simulation. The book, which covers recent developments in the field, could also serve as a useful reference for practitioners. Although some familiarity with probability and statistics is assumed, the book is accessible to a broad audience. The second edition is approximately 50% longer than the first edition. It includes advances in methods for parallel random number generation, universal methods for generation of nonuniform variates, perfect sampling, and software for random number generation. The material on testing of random number generators has been expanded to include a discussion of newer software for testing, as well as more discussion about the tests themselves. The second edition has more discussion of applications of Monte Carlo methods in various fields, including physics and computational finance. James Gentle is University Professor of Computational Statistics at George Mason University. During a thirteen-year hiatus from academic work before joining George Mason, he was director of research and design at the world's largest independent producer of Fortran and C general-purpose scientific software libraries. These libraries implement several random number generators, and are widely used in Monte Carlo studies. He is a Fellow of the American Statistical Association and a member of the International Statistical Institute. He has held several national offices in the American Statistical Association and has served as an associate editor for journals of the ASA as well as for other journals in statistics and computing. Recent activities include serving as program director of statistics at the National Science Foundation and as research fellow at the Bureau of Labor Statistics.
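The two steps described here, producing uniform pseudorandom numbers and then transforming them into samples from a target distribution, can be sketched in a few lines of Python. This is not code from the book; the generator constants are the classic textbook linear congruential values and are purely illustrative, not a recommended generator:

```python
import math

def lcg(seed, n, a=1103515245, c=12345, m=2**31):
    """Yield n pseudorandom numbers in [0, 1) from a simple linear congruential generator."""
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        yield x / m

def exponential_from_uniform(u, rate=1.0):
    """Inverse-CDF transform: if U ~ Uniform(0, 1), then -log(1 - U) / rate ~ Exponential(rate)."""
    return -math.log(1.0 - u) / rate

uniforms = list(lcg(seed=42, n=10_000))
exp_sample = [exponential_from_uniform(u) for u in uniforms]
print(sum(exp_sample) / len(exp_sample))  # sample mean should be close to 1 / rate = 1
```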
Any financial asset that is openly traded has a market price. Except for extreme market conditions, market price may be more or less than a fair value. Fair value is likely to be some complicated function of the current intrinsic value of tangible or intangible assets underlying the claim and our assessment of the characteristics of the underlying assets with respect to the expected rate of growth, future dividends, volatility, and other relevant market factors. Some of these factors that affect the price can be measured at the time of a transaction with reasonably high accuracy. Most factors, however, relate to expectations about the future and to subjective issues, such as current management, corporate policies and market environment, that could affect the future financial performance of the underlying assets. Models are thus needed to describe the stochastic factors and environment, and their implementations inevitably require computational finance tools.
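As a rough illustration of the kind of stochastic model such computational tools implement (an assumption added for this listing, not material from the book), the sketch below simulates terminal prices under geometric Brownian motion and forms a Monte Carlo estimate of a discounted call payoff; all parameter values are made up:

```python
import numpy as np

def terminal_prices(s0, rate, sigma, horizon, n_paths, rng):
    """Simulate terminal prices of geometric Brownian motion:
    S_T = S_0 * exp((rate - sigma**2 / 2) * T + sigma * sqrt(T) * Z), with Z ~ N(0, 1)."""
    z = rng.standard_normal(n_paths)
    return s0 * np.exp((rate - 0.5 * sigma**2) * horizon + sigma * np.sqrt(horizon) * z)

rng = np.random.default_rng(1)
s_t = terminal_prices(s0=100.0, rate=0.05, sigma=0.2, horizon=1.0, n_paths=100_000, rng=rng)

# Monte Carlo estimate of the discounted expected payoff of a call struck at 105,
# taking the drift equal to the risk-free rate (risk-neutral valuation); values are illustrative.
strike = 105.0
estimate = np.exp(-0.05 * 1.0) * np.maximum(s_t - strike, 0.0).mean()
print(round(estimate, 3))
```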
Computational inference is based on an approach to statistical methods that uses modern computational power to simulate distributional properties of estimators and test statistics. This book describes computationally intensive statistical methods in a unified presentation, emphasizing techniques, such as the PDF decomposition, that arise in a wide range of methods.
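One standard way to simulate the distributional properties of an estimator is the bootstrap. The following sketch is only an illustration of that idea, with an arbitrary sample and estimator; it approximates the standard error of the sample median by resampling the data with replacement:

```python
import numpy as np

def bootstrap_se(data, estimator, n_boot, rng):
    """Approximate the standard error of an estimator by resampling the data with replacement."""
    n = len(data)
    replicates = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(data, size=n, replace=True)  # one bootstrap sample
        replicates[b] = estimator(resample)
    return replicates.std(ddof=1)

rng = np.random.default_rng(2)
sample = rng.exponential(scale=2.0, size=50)
print(bootstrap_se(sample, np.median, n_boot=2000, rng=rng))
```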
This book presents accurate and efficient computer algorithms for factoring matrices, solving linear systems of equations, and extracting eigenvalues and eigenvectors. Regardless of the software system used, the book describes and gives examples of the use of modern computer software for numerical linear algebra. It begins with a discussion of the basics of numerical computations, and then describes the relevant properties of matrix inverses, factorizations, matrix and vector norms, and other topics in linear algebra. The book is essentially self-contained, with the topics addressed constituting the essential material for an introductory course in statistical computing. Numerous exercises allow the text to be used for a first course in statistical computing or as a supplementary text for various courses that emphasize computations.
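As a hedged illustration of one of the factorizations mentioned (not code from the book), the fragment below solves a small least-squares problem through the normal equations using a Cholesky factorization in NumPy; the data and coefficients are invented for the example:

```python
import numpy as np

# A small least-squares problem solved through the normal equations (X'X) b = X'y,
# using a Cholesky factorization of the symmetric positive definite matrix X'X.
rng = np.random.default_rng(3)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(100)

A = X.T @ X                   # symmetric positive definite (almost surely, for this X)
L = np.linalg.cholesky(A)     # A = L @ L.T with L lower triangular
# Solve L w = X'y, then L' b = w (two triangular systems; a structured solver could exploit this).
w = np.linalg.solve(L, X.T @ y)
b = np.linalg.solve(L.T, w)
print(np.round(b, 2))         # should be close to the true coefficients (1, -2, 0.5)
```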
Matrix algebra is one of the most important areas of mathematics for data analysis and for statistical theory. This much-needed work presents the relevant aspects of the theory of matrix algebra for applications in statistics. It then considers the various types of matrices encountered in statistics, such as projection matrices and positive definite matrices, and describes their special properties. Finally, it covers numerical linear algebra, beginning with a discussion of the basics of numerical computations, and following up with accurate and efficient algorithms for factoring matrices, solving linear systems of equations, and extracting eigenvalues and eigenvectors.
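For instance, the projection matrices mentioned above can be examined numerically in a few lines; the sketch below (illustrative only, with an arbitrary design matrix) forms the regression hat matrix and checks its defining properties:

```python
import numpy as np

# The projection ("hat") matrix onto the column space of X: H = X (X'X)^{-1} X'.
rng = np.random.default_rng(4)
X = rng.standard_normal((6, 2))
H = X @ np.linalg.inv(X.T @ X) @ X.T

# Defining properties of a projection matrix: symmetry and idempotence (H @ H == H).
print(np.allclose(H, H.T), np.allclose(H @ H, H))
# Its trace equals the rank of X (here 2), the degrees of freedom of the fitted values.
print(round(np.trace(H), 6))
```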
This book provides a more elementary introduction to these topics than other available books; Gentle is the author of two other Springer books.
"In this book the authors have assembled the best techniques from a great variety of sources, establishing a benchmark for the field of statistical computing." ---Mathematics of Computation
"The text is highly readable and well illustrated with examples. The reader who intends to take a hand in designing his own regression and multivariate packages will find a storehouse of information and a valuable resource in the field of statistical computing."