Showing 1 - 4 of 4 matches in All Departments
This book aims to provide a systematic and comprehensive treatment of recent developments in efficiency analysis in order to overcome the drawbacks of traditional nonparametric methods. In Part I of the book (Methodology), we introduce a complete set of tools for measuring the efficiency of firms and for explaining the observed efficiency differentials. In Part II of the book (Applications), we propose three empirical illustrations taken from different economic fields: scientific research, the mutual fund industry and the insurance sector. This book has been specifically designed for applied economists who have an interest in the advantages of traditional nonparametric methods (DEA/FDH) for efficiency analysis, but are sceptical about adopting them because of the drawbacks they present. They will find a complete and up-to-date presentation of the advances made in nonparametric frontier analysis that overcome most of the drawbacks of traditional methods.
Providing a systematic and comprehensive treatment of recent developments in efficiency analysis, this book offers an intuitive yet rigorous presentation of advanced nonparametric and robust methods, with applications to the analysis of economies of scale and scope, trade-offs in production and service activities, and explanations of efficiency differentials.
The computer has created new fields in statistics. Numerical and statistical problems that were intractable five to ten years ago can now be solved even on portable personal computers. One computer-intensive task, for example, is the numerical calculation of posterior distributions in Bayesian analysis. The bootstrap and image analysis are two other fields spawned by this almost unlimited computing power. It is not only the computing power, though, that has revolutionized statistics: the graphical interactivity of modern statistical environments has given us the possibility of deeper insight into our data. On November 21-22, 1991, a conference on Computer Intensive Methods in Statistics was organized at the Université Catholique de Louvain, Louvain-la-Neuve, Belgium. The organizers were Jan Beirlant (Katholieke Universiteit Leuven), Wolfgang Härdle (Humboldt-Universität zu Berlin) and Leopold Simar (Université Catholique de Louvain and Facultés Universitaires Saint-Louis). The meeting was the XIIth in the series of the Rencontres Franco-Belges de Statisticiens. Following this tradition, both theoretical statistical results and practical contributions from this active field of statistical research were presented. The four topics treated in more detail were: Bayesian computing; interfacing statistics and computers; image analysis; resampling methods. Selected and refereed papers have been edited and collected for this book.
Estimation and Inference in Nonparametric Frontier Models provides a thorough examination of this topic for students and researchers alike. While nonparametric estimators are widely used to estimate the productive efficiency of firms and other organizations, this is often done without any attempt at statistical inference. Recent work has provided statistical properties of these estimators and methods for making statistical inference, and has established a link between frontier estimation and extreme value theory. New estimators have been developed that avoid many of the problems inherent in traditional efficiency estimators; these new estimators are robust with respect to outliers and avoid the curse of dimensionality. In addition, statistical properties, including asymptotic distributions, of the new estimators are uncovered. Finally, the authors show several approaches for introducing environmental variables into production models.
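One well-known family of outlier-robust frontier estimators of the kind described above is the order-m frontier associated with Cazals, Florens and Simar; whether it is the specific estimator the book develops is an assumption here. The sketch below (function name and toy data are illustrative) computes an order-m input efficiency score by Monte Carlo: each draw benchmarks a firm against only m randomly sampled dominating firms rather than the full frontier, so a single extreme observation cannot drag the score down the way it does for FDH/DEA.

```python
import numpy as np

def order_m_input_efficiency(X, Y, x0, y0, m=25, n_draws=2000, seed=None):
    """Monte Carlo order-m input efficiency score (illustrative sketch).

    For each draw, sample m firms (with replacement) from those producing
    at least y0 in every output, take the best (smallest) FDH-style input
    ratio within that sample, then average over draws.
    """
    rng = np.random.default_rng(seed)
    pool = X[np.all(Y >= y0, axis=1)]  # firms dominating y0 in outputs
    scores = np.empty(n_draws)
    for d in range(n_draws):
        sample = pool[rng.integers(len(pool), size=m)]
        scores[d] = np.min(np.max(sample / x0, axis=1))
    return float(scores.mean())

# Toy data: same hypothetical firms as before; score the high-input firm.
X = np.array([[2.0], [4.0], [3.0], [8.0]])
Y = np.array([[1.0], [2.0], [2.0], [2.0]])
print(order_m_input_efficiency(X, Y, np.array([8.0]), np.array([2.0]),
                               m=25, seed=0))
```

As m grows, the order-m score converges to the FDH score; small m trades bias for robustness, which is the practical appeal of these partial-frontier estimators.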