Non-Parametric Statistical Diagnosis
Productivity growth is a keyword for sustainable economic growth in a knowledge-based society. There has been significant methodological development in the literature on productivity and efficiency analysis, e.g. SFA (Stochastic Frontier Analysis) and DEA (Data Envelopment Analysis). All these methodological developments should be matched with applications in order to provide practical implications for private and public decision-makers. This volume provides a collection of up-to-date and new applications of productivity and efficiency analysis. In particular, the case studies cover various economic issues in the Asia-Pacific region. The authors analyze the performance of manufacturing firms, banks, venture capital, broadcasting firms, as well as the issues of efficiency in the education sector, regional development, and defense industry. These case studies will shed light on the potential contribution of productivity and efficiency analysis to the enhancement of economic performance.
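The efficiency analysis the blurb above mentions can be illustrated with a toy version of a Farrell-style efficiency score. Full DEA solves a linear program for each decision-making unit; in the special single-input, single-output case sketched here, the frontier collapses to the best observed output/input ratio, so each unit's score is its own ratio divided by the best one. The firm names and numbers below are invented for illustration, not taken from the book.

```python
# Toy single-input, single-output efficiency scores.
# Each firm is (input used, output produced); the most productive
# firm defines the frontier and receives a score of 1.0.

firms = {
    "A": (10.0, 20.0),
    "B": (15.0, 45.0),
    "C": (20.0, 30.0),
}

def efficiency_scores(firms):
    # Frontier = best observed output/input ratio.
    best_ratio = max(y / x for x, y in firms.values())
    # Each firm's efficiency is its ratio relative to the frontier.
    return {name: (y / x) / best_ratio for name, (x, y) in firms.items()}

scores = efficiency_scores(firms)
for name, s in sorted(scores.items()):
    print(f"firm {name}: efficiency = {s:.2f}")
```

With multiple inputs and outputs this simple ratio no longer works, which is exactly where the linear-programming formulation of DEA comes in.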
World-renowned experts in spatial statistics and spatial econometrics present the latest advances in specification and estimation of spatial econometric models. This includes information on the development of tools and software, and various applications. The text introduces new tests and estimators for spatial regression models, including discrete choice and simultaneous equation models. The performance of techniques is demonstrated through simulation results and a wide array of applications related to economic growth, international trade, knowledge externalities, population-employment dynamics, urban crime, land use, and environmental issues. An exciting new text for academics with a theoretical interest in spatial statistics and econometrics, and for practitioners looking for modern and up-to-date techniques.
This book examines non-Gaussian distributions. It addresses the causes and consequences of non-normality and time dependency in both asset returns and option prices. The book is written for non-mathematicians who want to model financial market prices so the emphasis throughout is on practice. There are abundant empirical illustrations of the models and techniques described, many of which could be equally applied to other financial time series.
This second edition sees the light three years after the first one: too short a time to feel seriously concerned to redesign the entire book, but sufficient to be challenged by the prospect of sharpening our investigation on the working of econometric dynamic models and to be inclined to change the title of the new edition by dropping the "Topics in" of the former edition. After considerable soul searching we agreed to include several results related to topics already covered, as well as additional sections devoted to new and sophisticated techniques, which hinge mostly on the latest research work on linear matrix polynomials by the second author. This explains the growth of chapter one and the deeper insight into representation theorems in the last chapter of the book. The role of the second chapter is that of providing a bridge between the mathematical techniques in the backstage and the econometric profiles in the forefront of dynamic modelling. For this purpose, we decided to add a new section where the reader can find the stochastic rationale of vector autoregressive specifications in econometrics. The third (and last) chapter improves on that of the first edition by reaping the fruits of the thorough analytic equipment previously drawn up.
A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument, an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this means important) at, say, the 0.05 level, but not the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by the experts, display the same terms and implement the same methodology. In short, there appears no visible alternative to the established methodology and no sign of reservations concerning its validity.
In this book, different quantitative approaches to the study of electoral systems have been developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization ones. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool to detect inconsistencies or poor performance in actual systems. Applications to concrete settings such as EU, American Congress, regional, and committee voting are discussed.
This text provides a new approach to the subject, including a comprehensive survey of novel theoretical approaches, methods, and models used in macroeconomics and macroeconometrics. The book gives extensive insight into economic policy, incorporates a strong international perspective, and offers a broad historical perspective.
Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation-based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process in itself to the modelling of the volatility of the process. In less than 15 years, it has become one of the most successful fields of applied econometric research with hundreds of published papers. As an alternative to the ARCH modelling of the volatility, Taylor (1986) introduced the stochastic volatility model, whose features are quite similar to the ARCH specification but which involves an unobserved or latent component for the volatility. While being more difficult to estimate than usual GARCH models, stochastic volatility models have found numerous applications in the modelling of volatility and more particularly in the econometric part of option pricing formulas. Although modelling volatility is one of the best known examples of applied financial econometrics, other topics (factor models, present value relationships, term structure models) were also successfully tackled.
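The ARCH idea mentioned above can be sketched in a few lines: today's conditional variance depends on the squared size of yesterday's shock, sigma_t^2 = omega + alpha * eps_{t-1}^2, so large shocks beget volatile periods. This is a minimal illustrative simulation, not code from the book; the parameter values are arbitrary.

```python
# Minimal ARCH(1) simulation in the spirit of Engle (1982).
import random

def simulate_arch1(n, omega=0.1, alpha=0.5, seed=42):
    rng = random.Random(seed)
    eps_prev = 0.0
    returns, variances = [], []
    for _ in range(n):
        # Conditional variance driven by the previous squared shock.
        sigma2 = omega + alpha * eps_prev ** 2
        eps = (sigma2 ** 0.5) * rng.gauss(0.0, 1.0)
        returns.append(eps)
        variances.append(sigma2)
        eps_prev = eps
    return returns, variances

rets, vols = simulate_arch1(1000)
```

With alpha < 1 the process is covariance stationary, with unconditional variance omega / (1 - alpha); stochastic volatility models replace the deterministic recursion for sigma_t^2 with its own latent random process, which is what makes them harder to estimate.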
The determinants of yield curve dynamics have been thoroughly discussed in finance models. However, little can be said about the macroeconomic factors behind the movements of short- and long-term interest rates as well as the risk compensation demanded by financial investors. By taking on a macro-finance perspective, the book's approach explicitly acknowledges the close feedback between monetary policy, the macroeconomy and financial conditions. Both theoretical and empirical models are applied in order to get a profound understanding of the interlinkages between economic activity, the conduct of monetary policy and the underlying macroeconomic factors of bond price movements. Moreover, the book identifies a broad risk-taking channel of monetary transmission which allows a reassessment of the role of financial constraints; it enables policy makers to develop new guidelines for monetary policy and for financial supervision of how to cope with evolving financial imbalances.
On May 27-31, 1985, a series of symposia was held at The University of Western Ontario, London, Canada, to celebrate the 70th birthday of Professor V. M. Joshi. These symposia were chosen to reflect Professor Joshi's research interests as well as areas of expertise in statistical science among faculty in the Departments of Statistical and Actuarial Sciences, Economics, Epidemiology and Biostatistics, and Philosophy. From these symposia, the six volumes which comprise the "Joshi Festschrift" have arisen. The 117 articles in this work reflect the broad interests and high quality of research of those who attended our conference. We would like to thank all of the contributors for their superb cooperation in helping us to complete this project. Our deepest gratitude must go to the three people who have spent so much of their time in the past year typing these volumes: Jackie Bell, Lise Constant, and Sandy Tarnowski. This work has been printed from "camera ready" copy produced by our Vax 785 computer and QMS Lasergraphix printers, using the text processing software TEX. At the initiation of this project, we were neophytes in the use of this system. Thank you, Jackie, Lise, and Sandy, for having the persistence and dedication needed to complete this undertaking.
Economists, psychologists, and marketers are interested in determining the monetary value people place on non-market goods for a variety of reasons: to carry out cost-benefit analysis, to determine the welfare effects of technological innovation or public policy, to forecast new product success, and to understand individual and consumer behavior. Unfortunately, many currently available techniques for eliciting individuals' values suffer from a serious problem in that they involve asking individuals hypothetical questions about intended behavior. Experimental auctions circumvent this problem because they involve individuals exchanging real money for real goods in an active market. This represents a promising means for eliciting non-market values. Lusk and Shogren provide a comprehensive guide to the theory and practice of experimental auctions. It will be a valuable resource to graduate students, practitioners and researchers concerned with the design and utilization of experimental auctions in applied economic and marketing research.
This book presents the state of the art in multilevel analysis, with an emphasis on more advanced topics. These topics are discussed conceptually, analyzed mathematically, and illustrated by empirical examples. Multilevel analysis is the statistical analysis of hierarchically and non-hierarchically nested data. The simplest example is clustered data, such as a sample of students clustered within schools. Multilevel data are especially prevalent in the social and behavioral sciences and in the biomedical sciences. The chapter authors are all leading experts in the field. Given the omnipresence of multilevel data in the social, behavioral, and biomedical sciences, this book is essential for empirical researchers in these fields.
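The clustered-data example above (students nested within schools) is the canonical motivation for multilevel analysis, and the degree of clustering is commonly summarized by the intraclass correlation (ICC): the share of total variance that lies between clusters rather than within them. The sketch below computes the one-way ANOVA estimate of the ICC for a balanced design; the data are invented for illustration and are not from the book.

```python
# One-way ANOVA estimate of the intraclass correlation for
# balanced clustered data (same number of observations per cluster).

schools = {
    "school_1": [4.0, 6.0],
    "school_2": [9.0, 11.0],
    "school_3": [14.0, 16.0],
}

def intraclass_correlation(groups):
    k = len(groups)                       # number of clusters
    n = len(next(iter(groups.values())))  # observations per cluster
    all_obs = [x for g in groups.values() for x in g]
    grand = sum(all_obs) / len(all_obs)
    means = {name: sum(g) / n for name, g in groups.items()}
    # Between-cluster and within-cluster mean squares.
    msb = n * sum((m - grand) ** 2 for m in means.values()) / (k - 1)
    msw = sum((x - means[name]) ** 2
              for name, g in groups.items() for x in g) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

print(round(intraclass_correlation(schools), 3))
```

A high ICC, as in this toy dataset, is exactly the situation in which ignoring the clustering and running an ordinary single-level analysis understates standard errors.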
This book contains some of the results from the research project "Demand for Food in the Nordic Countries," which was initiated in 1988 by Professor Olof Bolin of the Agricultural University in Ultuna, Sweden and by Professor Karl Johan Weckman, of the University of Helsinki, Finland. A pilot study was carried out by Bengt Assarsson, which in 1989 led to a successful application for a research grant from the NKJ (The Nordic Contact Body for Agricultural Research) through the national research councils for agricultural research in Denmark, Finland, Norway and Sweden. We are very grateful to Olof Bolin and Karl Johan Weckman, without whom this project would not have come about, and to the national research councils in the Nordic countries for the generous financial support we have received for this project. We have received comments and suggestions from many colleagues, and this has improved our work substantially. At the start of the project a reference group was formed, consisting of Professor Olof Bolin, Professor Anders Klevmarken, Agr. lic. Gert Aage Nielsen, Professor Karl Johan Weckman and Cand. oecon. Per Halvor Vale. Gert Aage Nielsen left the group early in the project for a position in Landbanken, and was replaced by Professor Lars Otto, while Per Halvor Vale soon joined the research staff. The reference group has given us useful suggestions and encouraged us in our work. We are very grateful to them.
Use of information is basic to economic theory in two ways. As a basis for optimization, it is central to all normative hypotheses used in economics, but in decision-making situations it has stochastic and evolutionary aspects that are more dynamic and hence more fundamental. This book provides an illustrative survey of the use of information in economics and other decision sciences. Since this area is one of the most active fields of research in modern times, it is not possible to be definitive on all aspects of the issues involved. However, questions that appear to be most important in this author's view are emphasized in many cases, without drawing any definite conclusions. It is hoped that these questions will provoke new interest among beginning researchers in the field. Various classifications of information structures and their relevance for optimal decision-making in a stochastic environment are analyzed in some detail. Specifically, the following areas are illustrated in their analytic aspects: 1. Stochastic optimization in linear economic models, 2. Stochastic models in dynamic economics with problems of time-inconsistency, causality and estimation, 3. Optimal output-inventory decisions in stochastic markets, 4. Minimax policies in portfolio theory, 5. Methods of stochastic control and differential games, and 6. Adaptive information structures in decision models in economics and the theory of economic policy.
This is the third book of three volumes containing edited versions of papers and a commentary presented at the Ninth World Congress of the Econometric Society, held in London in August 2005. The papers summarise and interpret key developments, and they discuss future directions for a wide variety of topics in economics and econometrics. The papers cover both theory and applications. Written by leading specialists in their fields, these volumes provide a unique survey of progress in the discipline.
This book is based on an international conference organised by the Applied Econometric Association (AEA) on International Macroeconomic Modelling which was held in Brussels at the Commission of the European Communities in December 1983. On behalf of the Applied Econometric Association, we would like to extend our thanks to all participants and contributors. This conference would not have been possible without the cooperation and support of the Commission of the European Economic Communities and of its Directorate General for Economics and Financial Affairs (DGII) staff, in particular M. Emerson, A. Dramais, and also H. Serbat of the Paris Chamber of Commerce and Industry. Our thanks go also to J.P. Ancot for his constructive comments concerning the structure of this book. We are grateful to M. Russo, R. Maldague and Y. Ullmo for opening the conference with their stimulating review and comments on the use of international macroeconomic models; and to R. Bird, A.M. Costa, A. Crockett, H. Guitton, J.C. Milleron, J. Paelinck, J. Waelbroeck for chairing the scientific sessions. P. Artus, F. Gagey, O. Guvenen. The main focus of this book is to present recent developments in the construction and use of international macroeconometric models. Four main aspects are selected: (i) analysis of trade linkages and exchange rate determination; (ii) modelling and simulating the international economy; (iii) international policy coordination; (iv) the use of international macroeconomic models.
Measuring productive efficiency for nonprofit organizations has posed a great challenge to applied researchers today. The problem has many facets and diverse implications for a number of disciplines such as economics, applied statistics, management science and information theory. This monograph discusses four major areas, which emphasize the applied economic and econometric aspects of the production frontier analysis: A. Stochastic frontier theory, B. Data envelopment analysis, C. Clustering and estimation theory, D. Economic and managerial applications. Besides containing an up-to-date survey of the most recent developments in the field, the monograph presents several new results and theorems from my own research. These include but are not limited to the following: (1) interface with parametric theory, (2) minimax and robust concepts of production frontier, (3) game-theoretic extension of the Farrell and Johansen models, (4) optimal clustering techniques for data envelopment analysis and (5) the dynamic and stochastic generalizations of the efficiency frontier at the micro and macro levels. In my research work in this field I have received great support and inspiration from Professor Abraham Charnes of the University of Texas at Austin, who basically founded the technique of data envelopment analysis, developed it and is still expanding it. My interactions with him have been most fruitful and productive. I am deeply grateful to him. Finally, I must record my deep appreciation to my wife and two children for their loving and enduring support. But for their support this work would not have been completed.
In the autumn of 1961 Jan Salomon ('Mars') Cramer was appointed to the newly established chair of econometrics at the University of Amsterdam. This volume is published to commemorate this event. It is well-known how much econometrics has developed over the period under consideration, the 25 years that elapsed between 1961 and 1986. This is specifically true for the areas in which Cramer has been actively interested. We mention the theory and measurement of consumer behaviour; money and income; regression, correlation and forecasting. In the present volume this development will be highlighted. Sixteen contributions have been solicited from scholars all over the world who have belonged to the circle of academic friends of Cramer for a shorter or longer part of the period of 25 years. The contributions fall broadly speaking into the four areas mentioned above. Theory and measurement of consumer behaviour is represented by four papers, whereas a fifth paper deals with a related area. Richard Blundell and Costas Meghir devote a paper to the estimation of Engel curves. They apply a discrete choice model to British (individual) data from the Family Expenditure Survey 1981. Their aim is to assess the impact of individual characteristics such as income, demographic structure, location, wages and prices on commodity expenditure.
This work grew out of a series of investigations begun by the authors in 1980 and 1981. Specifically the authors pursued two lines of inquiry. First, to advance the state of the theoretical literature to better explain the crises of liberalization which seemed to be afflicting the third world in general and Latin America in particular. To do this, several different kinds of models were investigated and adapted. These are presented in Chapters 2, 3 and 5. Secondly, an analysis of the empirical evidence was conducted in order to gain insight into the processes that were thought to be occurring and the theoretical models that were being developed. Some of this work appears in Chapters 3, 4, 5 and 6. Other work by the authors on these issues has been published elsewhere and is referenced herein. There are a great many people whose work and whose comments have influenced this work. We would like to especially thank Guillermo Calvo, Michael Connolly, Sebastian Edwards, Roque Fernandez, Michael Darby, Robert Clower, Neil Wallace, John Kareken, Paul McNelis, Jeffrey Nugent, Jaime Marquez, Lee Ohanian, Leroy Laney, Jorge Braga de Macedo, Dale Henderson, Matthew Canzoneiri, Arthur Laffer, Marc Miles, and George Von Furstenberg whose ideas and comments gave rise to much of our work. We would like to thank Suh Lee for his assistance with the computations in Chapter 5.
Due to the ability to handle specific characteristics of economics and finance forecasting problems, such as non-linear relationships, behavioral changes, or knowledge-based domain segmentation, we have recently witnessed a phenomenal growth of the application of computational intelligence methodologies in this field. In this volume, Chen and Wang have collected not just works on traditional computational intelligence approaches such as fuzzy logic, neural networks, and genetic algorithms, but also examples of more recent technologies such as rough sets, support vector machines, wavelets, or ant algorithms. After an introductory chapter with a structural description of all the methodologies, the subsequent parts describe novel applications of these to typical economics and finance problems such as business forecasting, currency crisis discrimination, foreign exchange markets, or stock market behavior.
This volume provides a general overview of the econometrics of panel data, both from a theoretical and from an applied viewpoint. Since the pioneering papers by Kuh (1959), Mundlak (1961), Hoch (1962), and Balestra and Nerlove (1966), the pooling of cross-section and time-series data has become an increasingly popular way of quantifying economic relationships. Each series provides information lacking in the other, so a combination of both leads to more accurate and reliable results than would be achievable by one type of series alone. Over the last 30 years much work has been done: investigation of the properties of the applied estimators and test statistics, analysis of dynamic models and the effects of eventual measurement errors, etc. These are just some of the problems addressed by this work. In addition, some specific difficulties associated with the use of panel data, such as attrition, heterogeneity, selectivity bias, pseudo panels etc. have also been explored. The first objective of this book, which takes up Parts I and II, is to give as complete and up-to-date a presentation of these theoretical developments as possible. Part I is concerned with classical linear models and their extensions; Part II deals with nonlinear models and related issues: logit and probit models, latent variable models, incomplete panels and selectivity bias, and point processes. The second objective is to provide insights into the use of panel data in empirical studies. Since the beginning, interest in panel data has been empirically based, and over time has become increasingly important in applied economic studies. This is demonstrated by growing numbers of conferences and special issues of economic journals devoted to the subject. Part III deals with studies in several major fields of applied economics, such as labour and investment demand, labour supply, consumption, transitions on the labour market, and finance. 
The double emphasis of this book (theoretical and applied), together with the fact that all the chapters have been written by well-known specialists in the field, ensure that it will become a standard textbook for all those who are concerned with the use of panel data in econometrics, whether they are advanced students, professional economists or researchers.
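The claim above that pooling cross-section and time-series data yields information neither provides alone can be made concrete with the within (fixed-effects) estimator, the workhorse of the classical linear panel models in Part I: demeaning each entity's series removes the entity-specific intercept, and OLS on the demeaned data recovers the common slope even when intercepts differ wildly across entities. This is a minimal sketch with invented, noise-free data, not code from the book.

```python
# Within (fixed-effects) estimator for y_it = alpha_i + beta * x_it.
# Demeaning by entity eliminates alpha_i; OLS on the demeaned data
# then estimates beta.

def within_estimator(panel):
    """panel: {entity: [(x, y), ...]} -> slope estimate beta."""
    num = den = 0.0
    for obs in panel.values():
        xbar = sum(x for x, _ in obs) / len(obs)
        ybar = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            num += (x - xbar) * (y - ybar)
            den += (x - xbar) ** 2
    return num / den

# Two entities with very different intercepts but the same slope.
beta = 2.0
alphas = {"firm_A": 5.0, "firm_B": -3.0}
panel = {e: [(float(t), a + beta * t) for t in range(1, 5)]
         for e, a in alphas.items()}
print(within_estimator(panel))  # recovers beta = 2.0
```

A pooled OLS regression on the raw data would mix the between-entity variation (driven here by the intercepts) into the slope estimate; the within transformation is what isolates the common relationship.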
This book introduces a new way to analyze multivariate data. The analysis of data based on multivariate spatial signs and ranks proceeds very much as does a traditional multivariate analysis relying on the assumption of multivariate normality: the L2 norm is just replaced by different L1 norms, observation vectors are replaced by their (standardized and centered) spatial signs and ranks, and so on. The methods are fairly efficient and robust, and no moment assumptions are needed. A unified theory starting with the simple one-sample location problem and proceeding through the several-sample location problems to the general multivariate linear regression model and finally to the analysis of cluster-dependent data is presented. The material is divided into 14 chapters. Chapter 1 serves as a short introduction to the general ideas and strategies followed in the book. Chapter 2 introduces and discusses different types of parametric, nonparametric, and semiparametric statistical models used to analyze the multivariate data. Chapter 3 provides general descriptive tools to describe the properties of multivariate distributions and multivariate datasets. Multivariate location and scatter functionals and statistics and their use are described in detail. Chapter 4 introduces the concepts of multivariate spatial sign, signed-rank, and rank, and shows their connection to certain L1 objective functions. Also sign and rank covariance matrices are discussed carefully. The first four chapters thus provide the necessary tools to understand the remaining part of the book.
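The spatial sign mentioned above is simply the observation's direction from a center, u(x) = (x - c) / ||x - c||, and the associated L1 location estimate is the spatial median: the point at which the spatial signs of the data sum to zero. The sketch below computes it by the classical Weiszfeld iteration; it is an illustrative toy with invented points, not the book's implementation, and it ignores the known edge case where an iterate lands exactly on a data point.

```python
# Spatial sign and spatial median (L1 analogue of the mean vector),
# the latter via Weiszfeld's inverse-distance-weighted iteration.

def spatial_sign(x, c):
    d = [xi - ci for xi, ci in zip(x, c)]
    norm = sum(di ** 2 for di in d) ** 0.5
    return [di / norm for di in d] if norm > 0 else [0.0] * len(x)

def spatial_median(points, iters=200):
    dim = len(points[0])
    # Start from the coordinatewise mean.
    c = [sum(p[i] for p in points) / len(points) for i in range(dim)]
    for _ in range(iters):
        wsum, acc = 0.0, [0.0] * dim
        for p in points:
            dist = sum((pi - ci) ** 2 for pi, ci in zip(p, c)) ** 0.5
            if dist == 0:
                continue  # skip a coincident point (degenerate case)
            w = 1.0 / dist
            wsum += w
            for i, pi in enumerate(p):
                acc[i] += w * pi
        c = [a / wsum for a in acc]
    return c

pts = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0), (5.0, 5.0)]
m = spatial_median(pts)
```

The robustness the blurb advertises is visible here: moving the outlying point (5, 5) arbitrarily far away changes only the direction of its spatial sign, not its magnitude, so the spatial median barely moves, whereas the mean vector would follow the outlier.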
This book provides cutting-edge research results and application experiences from researchers and practitioners in multiple criteria decision making areas. It consists of three parts: MCDM Foundation and Theory, MCDM Methodology, and MCDM Applications. Part I covers the historical MCDM development, the influence of MCDM on technology, society and policy, Pareto optimization, and the analytic hierarchy process. Part II presents different MCDM algorithms based on techniques of robust estimation, evolutionary multiobjective optimization, Choquet integrals, and genetic search. Part III demonstrates a variety of MCDM applications, including project management, financial investment, credit risk analysis, railway transportation, online advertising, transport infrastructure, environmental pollution, chemical industry, and regional economy. The 17 papers of the book have been selected out of the 121 accepted papers at the 20th International Conference on Multiple Criteria Decision Making, "New State of MCDM in 21st Century," held in Chengdu, China, in 2009. The 35 contributors of these papers stem from 10 countries.