Welcome to Loot.co.za!
This book examines advanced Bayesian computational methods. It presents techniques for sampling from posterior distributions and focuses on computing posterior quantities of interest from a given Markov chain Monte Carlo (MCMC) sample. Topics addressed include techniques for MCMC sampling, Monte Carlo methods for estimating posterior quantities, improving simulation accuracy, marginal posterior density estimation, estimation of normalizing constants, constrained parameter problems, highest posterior density (HPD) interval calculations, computation of posterior modes, and posterior computations for proportional hazards models and Dirichlet process models. The authors also discuss computations involving model comparison, including both nested and non-nested models, marginal likelihood methods, ratios of normalizing constants, Bayes factors, the Savage-Dickey density ratio, Stochastic Search Variable Selection, Bayesian Model Averaging, the reversible jump algorithm, and model adequacy using predictive and latent residual approaches. The book presents an equal mixture of theory and applications involving real data. It is intended as a graduate textbook for a one-semester course at the advanced master's or Ph.D. level, and will also serve as a useful reference for applied or theoretical researchers as well as practitioners. Ming-Hui Chen is Associate Professor of Mathematical Sciences at Worcester Polytechnic Institute; Qi-Man Shao is Assistant Professor of Mathematics at the University of Oregon; Joseph G. Ibrahim is Associate Professor of Biostatistics at the Harvard School of Public Health and Dana-Farber Cancer Institute.
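As a loose illustration of the kind of posterior computation this blurb describes — not code from the book — the following sketch runs a random-walk Metropolis sampler and computes a posterior mean and 95% credible interval from the MCMC draws. The target density, step size, and burn-in length are all toy choices made here for illustration.

```python
import math
import random

random.seed(0)

# Toy log-posterior: a standard normal.  In a real analysis this would be
# log prior + log likelihood; everything here is an illustrative stand-in.
def log_post(theta):
    return -0.5 * theta * theta

def metropolis(n_iter=50_000, step=1.0):
    """Random-walk Metropolis: propose theta' ~ N(theta, step^2),
    accept with probability min(1, post(theta') / post(theta))."""
    theta, draws = 0.0, []
    for _ in range(n_iter):
        proposal = theta + random.gauss(0.0, step)
        if math.log(random.random()) < log_post(proposal) - log_post(theta):
            theta = proposal
        draws.append(theta)
    return draws

draws = metropolis()[5_000:]            # discard burn-in
post_mean = sum(draws) / len(draws)     # Monte Carlo estimate of E[theta | data]

# Equal-tailed 95% credible interval from the empirical quantiles; for a
# symmetric unimodal posterior like this one it agrees with the HPD interval.
s = sorted(draws)
lo, hi = s[int(0.025 * len(s))], s[int(0.975 * len(s))]
```

For asymmetric or multimodal posteriors the equal-tailed and HPD intervals differ, which is where dedicated HPD algorithms of the sort the book covers become necessary.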
Since its introduction in 1972, Stein's method has offered a completely novel way of evaluating the quality of normal approximations. Through its characterizing equation approach, it is able to provide approximation error bounds in a wide variety of situations, even in the presence of complicated dependence. Use of the method thus opens the door to the analysis of random phenomena arising in areas including statistics, physics, and molecular biology. Though Stein's method for normal approximation is now mature, the literature has so far lacked a complete, self-contained treatment. This volume contains thorough coverage of the method's fundamentals, includes a large number of recent developments in both theory and applications, and will help accelerate the appreciation, understanding, and use of Stein's method by providing the reader with the tools needed to apply it in new situations. It addresses researchers as well as graduate students in probability, statistics, and combinatorics.
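The "characterizing equation" the blurb mentions is the identity E[f'(Z) − Z f(Z)] = 0, which holds for a standard normal Z and all suitable smooth f. A small Monte Carlo check (the test function and sample laws below are choices made here for illustration, not material from the book) shows the functional vanishing for normal data and staying away from zero for non-normal data:

```python
import math
import random

random.seed(1)

# Stein's characterization: Z ~ N(0, 1) if and only if
# E[f'(Z) - Z f(Z)] = 0 for all suitable smooth test functions f.
# Here f(x) = sin(x), so f'(x) = cos(x) -- an illustrative choice.
def stein_functional(xs):
    return sum(math.cos(x) - x * math.sin(x) for x in xs) / len(xs)

n = 200_000
normal_draws = [random.gauss(0.0, 1.0) for _ in range(n)]
near_zero = stein_functional(normal_draws)   # ~0 for standard normal data

# A centred exponential is not normal, so the same functional stays
# bounded away from zero; the size of such discrepancies is what
# Stein-method error bounds quantify.
expo_draws = [random.expovariate(1.0) - 1.0 for _ in range(n)]
not_zero = stein_functional(expo_draws)
```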
Self-normalized processes occur commonly in probabilistic and statistical studies. A prototypical example is Student's t-statistic, introduced in 1908 by Gosset, whose portrait is on the front cover. Due to the highly non-linear nature of these processes, the theory experienced a long period of slow development. In recent years there have been a number of important advances in the theory and applications of self-normalized processes. Some of these developments are closely linked to the study of central limit theorems, which imply that self-normalized processes are approximate pivots for statistical inference. The present volume covers recent developments in the area, including self-normalized large and moderate deviations, and laws of the iterated logarithm for self-normalized martingales. This is the first book that systematically treats the theory and applications of self-normalization.
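The prototypical example named in this blurb can be made concrete: Student's t-statistic scales the sample mean by a data-driven estimate of its own spread, which is exactly what makes it self-normalized and an approximate pivot. The sketch below (the sample size and the arbitrary scale of the simulated data are illustrative choices, not from the book) computes it from scratch:

```python
import math
import random

random.seed(2)

# Student's t-statistic as a self-normalized quantity: the numerator
# sqrt(n) * mean is scaled by the data-driven sample standard deviation
# rather than a fixed constant -- the defining feature of self-normalization.
def t_statistic(xs):
    n = len(xs)
    mean = sum(xs) / n
    s2 = sum((x - mean) ** 2 for x in xs) / (n - 1)   # sample variance
    return math.sqrt(n) * mean / math.sqrt(s2)

# Because t_n is an approximate pivot, its null distribution is close to
# N(0, 1) whatever the (unknown) scale of the data -- sd = 3 here is an
# arbitrary illustrative choice.
sample = [random.gauss(0.0, 3.0) for _ in range(500)]
t = t_statistic(sample)
```

Note that the unknown standard deviation 3 never enters the statistic: the data supply their own normalization, which is why the pivot property holds without knowing the scale.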