A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument, an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this means important) at, say, the 0.05 level, but not the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by the experts, display the same terms and implement the same methodology. In short, there appears to be no visible alternative to the established methodology and no sign of reservations concerning its validity.
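The significance ritual described above can be made concrete with a small sketch. This is purely illustrative (the data and function name are invented, not from the book): a simple-regression slope is tested with a t-statistic, using a normal approximation to the two-sided p-value, and compared against the 0.05 and 0.01 levels.

```python
import math

def ols_slope_test(x, y):
    """Simple-regression slope, its t-statistic, and an approximate
    two-sided p-value (normal approximation, adequate for moderate n)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    s2 = sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y)) / (n - 2)
    se = math.sqrt(s2 / sxx)          # standard error of the slope
    t = b / se
    p = math.erfc(abs(t) / math.sqrt(2))   # two-sided normal tail
    return b, t, p

# Invented data: y = 2x plus deterministic alternating +1/-1 "noise".
x = list(range(20))
y = [2 * xi + (1 if xi % 2 == 0 else -1) for xi in x]
b, t, p = ols_slope_test(x, y)
levels_passed = [lvl for lvl in (0.05, 0.01) if p < lvl]
```

Whether `levels_passed` contains 0.05 but not 0.01 depends entirely on the data, which is exactly the paragraph's point: the same mechanical comparison is applied regardless of what the numbers mean.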
In this book, different quantitative approaches to the study of electoral systems are developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization-based. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool for detecting inconsistencies or poor performance in actual systems. Applications to concrete settings such as the EU, the American Congress, and regional and committee voting are discussed.
This text provides a new approach to the subject, including a comprehensive survey of novel theoretical approaches, methods, and models used in macroeconomics and macroeconometrics. The book gives extensive insight into economic policy, incorporates a strong international perspective, and offers a broad historical perspective.
Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation-based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process in itself to the modelling of the volatility of the process. In less than 15 years, it has become one of the most successful fields of applied econometric research, with hundreds of published papers. As an alternative to the ARCH modelling of the volatility, Taylor (1986) introduced the stochastic volatility model, whose features are quite similar to the ARCH specification but which involves an unobserved or latent component for the volatility. While being more difficult to estimate than usual GARCH models, stochastic volatility models have found numerous applications in the modelling of volatility and more particularly in the econometric part of option pricing formulas. Although modelling volatility is one of the best known examples of applied financial econometrics, other topics (factor models, present value relationships, term structure models) were also successfully tackled.
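The shift of focus the paragraph describes, from modelling the process itself to modelling its volatility, is easy to illustrate. A minimal sketch (the parameter values and function name are invented for illustration) of Engle's ARCH(1) specification, in which the conditional variance depends on the previous squared return:

```python
import math
import random

def simulate_arch1(n, omega=0.1, alpha=0.5, seed=7):
    """Simulate r_t = sigma_t * z_t with sigma_t^2 = omega + alpha * r_{t-1}^2,
    z_t standard normal: the ARCH(1) model."""
    rng = random.Random(seed)
    returns, prev = [], 0.0
    for _ in range(n):
        var = omega + alpha * prev ** 2   # conditional variance
        r = math.sqrt(var) * rng.gauss(0.0, 1.0)
        returns.append(r)
        prev = r
    return returns

# Unconditional variance is omega / (1 - alpha) = 0.2 for these values.
r = simulate_arch1(20000)
```

A stochastic volatility model, by contrast, would let the variance follow its own latent process (for example a log-AR(1)) rather than a deterministic function of past returns; that latent component is precisely what makes it harder to estimate than GARCH-type models.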
The determinants of yield curve dynamics have been thoroughly discussed in finance models. However, little can be said about the macroeconomic factors behind the movements of short- and long-term interest rates as well as the risk compensation demanded by financial investors. By taking on a macro-finance perspective, the book's approach explicitly acknowledges the close feedback between monetary policy, the macroeconomy and financial conditions. Both theoretical and empirical models are applied in order to get a profound understanding of the interlinkages between economic activity, the conduct of monetary policy and the underlying macroeconomic factors of bond price movements. Moreover, the book identifies a broad risk-taking channel of monetary transmission which allows a reassessment of the role of financial constraints; it enables policy makers to develop new guidelines for monetary policy and for financial supervision on how to cope with evolving financial imbalances.
On May 27-31, 1985, a series of symposia was held at The University of Western Ontario, London, Canada, to celebrate the 70th birthday of Professor V. M. Joshi. These symposia were chosen to reflect Professor Joshi's research interests as well as areas of expertise in statistical science among faculty in the Departments of Statistical and Actuarial Sciences, Economics, Epidemiology and Biostatistics, and Philosophy. From these symposia, the six volumes which comprise the "Joshi Festschrift" have arisen. The 117 articles in this work reflect the broad interests and high quality of research of those who attended our conference. We would like to thank all of the contributors for their superb cooperation in helping us to complete this project. Our deepest gratitude must go to the three people who have spent so much of their time in the past year typing these volumes: Jackie Bell, Lise Constant, and Sandy Tarnowski. This work has been printed from "camera ready" copy produced by our Vax 785 computer and QMS Lasergraphix printers, using the text processing software TEX. At the initiation of this project, we were neophytes in the use of this system. Thank you, Jackie, Lise, and Sandy, for having the persistence and dedication needed to complete this undertaking.
Economists, psychologists, and marketers are interested in determining the monetary value people place on non-market goods for a variety of reasons: to carry out cost-benefit analysis, to determine the welfare effects of technological innovation or public policy, to forecast new product success, and to understand individual and consumer behavior. Unfortunately, many currently available techniques for eliciting individuals' values suffer from a serious problem in that they involve asking individuals hypothetical questions about intended behavior. Experimental auctions circumvent this problem because they involve individuals exchanging real money for real goods in an active market. This represents a promising means for eliciting non-market values. Lusk and Shogren provide a comprehensive guide to the theory and practice of experimental auctions. It will be a valuable resource to graduate students, practitioners and researchers concerned with the design and utilization of experimental auctions in applied economic and marketing research.
This book presents the state of the art in multilevel analysis, with an emphasis on more advanced topics. These topics are discussed conceptually, analyzed mathematically, and illustrated by empirical examples. Multilevel analysis is the statistical analysis of hierarchically and non-hierarchically nested data. The simplest example is clustered data, such as a sample of students clustered within schools. Multilevel data are especially prevalent in the social and behavioral sciences and in the biomedical sciences. The chapter authors are all leading experts in the field. Given the omnipresence of multilevel data in the social, behavioral, and biomedical sciences, this book is essential for empirical researchers in these fields.
This book contains some of the results from the research project "Demand for Food in the Nordic Countries," which was initiated in 1988 by Professor Olof Bolin of the Agricultural University in Ultuna, Sweden, and by Professor Karl Johan Weckman of the University of Helsinki, Finland. A pilot study was carried out by Bengt Assarsson, which in 1989 led to a successful application for a research grant from the NKJ (The Nordic Contact Body for Agricultural Research) through the national research councils for agricultural research in Denmark, Finland, Norway and Sweden. We are very grateful to Olof Bolin and Karl Johan Weckman, without whom this project would not have come about, and to the national research councils in the Nordic countries for the generous financial support we have received for this project. We have received comments and suggestions from many colleagues, and this has improved our work substantially. At the start of the project a reference group was formed, consisting of Professor Olof Bolin, Professor Anders Klevmarken, Agr. lic. Gert Aage Nielsen, Professor Karl Johan Weckman and Cand. oecon. Per Halvor Vale. Gert Aage Nielsen left the group early in the project for a position in Landbanken, and was replaced by Professor Lars Otto, while Per Halvor Vale soon joined the research staff. The reference group has given us useful suggestions and encouraged us in our work. We are very grateful to them.
Use of information is basic to economic theory in two ways. As a basis for optimization, it is central to all normative hypotheses used in economics, but in decision-making situations it has stochastic and evolutionary aspects that are more dynamic and hence more fundamental. This book provides an illustrative survey of the use of information in economics and other decision sciences. Since this area is one of the most active fields of research in modern times, it is not possible to be definitive on all aspects of the issues involved. However, the questions that appear most important in this author's view are emphasized in many cases, without drawing any definite conclusions. It is hoped that these questions will provoke new interest among beginning researchers in the field. Various classifications of information structures and their relevance for optimal decision-making in a stochastic environment are analyzed in some detail. Specifically, the following areas are illustrated in their analytic aspects: 1. stochastic optimization in linear economic models; 2. stochastic models in dynamic economics with problems of time-inconsistency, causality and estimation; 3. optimal output-inventory decisions in stochastic markets; 4. minimax policies in portfolio theory; 5. methods of stochastic control and differential games; and 6. adaptive information structures in decision models in economics and the theory of economic policy.
This is the third book of three volumes containing edited versions of papers and a commentary presented at the Ninth World Congress of the Econometric Society, held in London in August 2005. The papers summarise and interpret key developments, and they discuss future directions for a wide variety of topics in economics and econometrics. The papers cover both theory and applications. Written by leading specialists in their fields, these volumes provide a unique survey of progress in the discipline.
This book is based on an international conference organised by the Applied Econometric Association (AEA) on International Macroeconomic Modelling which was held in Brussels at the Commission of the European Communities in December 1983. On behalf of the Applied Econometric Association, we would like to extend our thanks to all participants and contributors. This conference would not have been possible without the cooperation and support of the Commission of the European Economic Communities and of its Directorate General for Economics and Financial Affairs (DGII) staff, in particular M. Emerson, A. Dramais, and also H. Serbat of the Paris Chamber of Commerce and Industry. Our thanks go also to J.P. Ancot for his constructive comments concerning the structure of this book. We are grateful to M. Russo, R. Maldague and Y. Ullmo for opening the conference with their stimulating review and comments on the use of international macroeconomic models; and to R. Bird, A.M. Costa, A. Crockett, H. Guitton, J.C. Milleron, J. Paelinck, J. Waelbroeck for chairing the scientific sessions. P. Artus, F. Gagey, O. Guvenen. The main focus of this book is to present recent developments in the construction and use of international macroeconometric models. Four main aspects are selected: (i) analysis of trade linkages and exchange rate determination; (ii) modelling and simulating the international economy; (iii) international policy coordination; (iv) the use of international macroeconomic models.
Measuring productive efficiency for nonprofit organizations has posed a great challenge to applied researchers today. The problem has many facets and diverse implications for a number of disciplines such as economics, applied statistics, management science and information theory. This monograph discusses four major areas, which emphasize the applied economic and econometric aspects of production frontier analysis: A. Stochastic frontier theory, B. Data envelopment analysis, C. Clustering and estimation theory, D. Economic and managerial applications. Besides containing an up-to-date survey of the most recent developments in the field, the monograph presents several new results and theorems from my own research. These include but are not limited to the following: (1) interface with parametric theory, (2) minimax and robust concepts of production frontier, (3) game-theoretic extension of the Farrell and Johansen models, (4) optimal clustering techniques for data envelopment analysis and (5) the dynamic and stochastic generalizations of the efficiency frontier at the micro and macro levels. In my research work in this field I have received great support and inspiration from Professor Abraham Charnes of the University of Texas at Austin, who has basically founded the technique of data envelopment analysis, developed it and is still expanding it. My interactions with him have been most fruitful and productive. I am deeply grateful to him. Finally, I must record my deep appreciation to my wife and two children for their loving and enduring support. But for their support this work would not have been completed.
"It's the economy, stupid," as Democratic strategist James Carville would say. After many years of study, Ray C. Fair has found that the state of the economy has a dominant influence on national elections. Just in time for the 2012 presidential election, this new edition of his classic text, "Predicting Presidential Elections and Other Things," provides us with a look into the likely future of our nation's political landscape--but Fair doesn't stop there.
In the autumn of 1961 Jan Salomon ('Mars') Cramer was appointed to the newly established chair of econometrics at the University of Amsterdam. This volume is published to commemorate this event. It is well known how much econometrics has developed over the period under consideration, the 25 years that elapsed between 1961 and 1986. This is specifically true for the areas in which Cramer has been actively interested. We mention the theory and measurement of consumer behaviour; money and income; regression, correlation and forecasting. In the present volume this development will be highlighted. Sixteen contributions have been solicited from scholars all over the world who have belonged to the circle of academic friends of Cramer for a shorter or longer part of the period of 25 years. The contributions fall, broadly speaking, into the four areas mentioned above. Theory and measurement of consumer behaviour is represented by four papers, whereas a fifth paper deals with a related area. Richard Blundell and Costas Meghir devote a paper to the estimation of Engel curves. They apply a discrete choice model to British (individual) data from the Family Expenditure Survey 1981. Their aim is to assess the impact of individual characteristics such as income, demographic structure, location, wages and prices on commodity expenditure.
This work grew out of a series of investigations begun by the authors in 1980 and 1981. Specifically, the authors pursued two lines of inquiry. First, to advance the state of the theoretical literature to better explain the crises of liberalization which seemed to be afflicting the third world in general and Latin America in particular. To do this, several different kinds of models were investigated and adapted. These are presented in Chapters 2, 3 and 5. Secondly, an analysis of the empirical evidence was conducted in order to gain insight into the processes that were thought to be occurring and the theoretical models that were being developed. Some of this work appears in Chapters 3, 4, 5 and 6. Other work by the authors on these issues has been published elsewhere and is referenced herein. There are a great many people whose work and whose comments have influenced this work. We would like to especially thank Guillermo Calvo, Michael Connolly, Sebastian Edwards, Roque Fernandez, Michael Darby, Robert Clower, Neil Wallace, John Kareken, Paul McNelis, Jeffrey Nugent, Jaime Marquez, Lee Ohanian, Leroy Laney, Jorge Braga de Macedo, Dale Henderson, Matthew Canzoneiri, Arthur Laffer, Marc Miles, and George Von Furstenberg, whose ideas and comments gave rise to much of our work. We would like to thank Suh Lee for his assistance with the computations in Chapter 5.
Due to their ability to handle specific characteristics of economics and finance forecasting problems, such as non-linear relationships, behavioral changes, or knowledge-based domain segmentation, computational intelligence methodologies have recently seen phenomenal growth of application in this field. In this volume, Chen and Wang collected not just works on traditional computational intelligence approaches such as fuzzy logic, neural networks, and genetic algorithms, but also examples of more recent technologies such as rough sets, support vector machines, wavelets, and ant algorithms. After an introductory chapter with a structural description of all the methodologies, the subsequent parts describe novel applications of these to typical economics and finance problems such as business forecasting, currency crisis discrimination, foreign exchange markets, and stock market behavior.
This volume provides a general overview of the econometrics of panel data, both from a theoretical and from an applied viewpoint. Since the pioneering papers by Kuh (1959), Mundlak (1961), Hoch (1962), and Balestra and Nerlove (1966), the pooling of cross-section and time-series data has become an increasingly popular way of quantifying economic relationships. Each series provides information lacking in the other, so a combination of both leads to more accurate and reliable results than would be achievable by one type of series alone. Over the last 30 years much work has been done: investigation of the properties of the applied estimators and test statistics, analysis of dynamic models and the effects of possible measurement errors, etc. These are just some of the problems addressed by this work. In addition, some specific difficulties associated with the use of panel data, such as attrition, heterogeneity, selectivity bias, pseudo panels, etc., have also been explored. The first objective of this book, which takes up Parts I and II, is to give as complete and up-to-date a presentation of these theoretical developments as possible. Part I is concerned with classical linear models and their extensions; Part II deals with nonlinear models and related issues: logit and probit models, latent variable models, incomplete panels and selectivity bias, and point processes. The second objective is to provide insights into the use of panel data in empirical studies. Since the beginning, interest in panel data has been empirically based, and over time has become increasingly important in applied economic studies. This is demonstrated by growing numbers of conferences and special issues of economic journals devoted to the subject. Part III deals with studies in several major fields of applied economics, such as labour and investment demand, labour supply, consumption, transitions on the labour market, and finance.
The double emphasis of this book (theoretical and applied), together with the fact that all the chapters have been written by well-known specialists in the field, ensure that it will become a standard textbook for all those who are concerned with the use of panel data in econometrics, whether they are advanced students, professional economists or researchers.
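The gain from pooling that this overview describes can be sketched with the simplest panel estimator, the "within" (fixed-effects) regression. The data layout and function name below are invented for illustration: demeaning within each cross-sectional unit sweeps out unit-specific intercepts, leaving only the common slope to estimate.

```python
def within_estimator(panel):
    """panel: {unit_id: [(x, y), ...]} -- balanced or unbalanced.
    Returns the common slope after within-unit demeaning, which
    eliminates each unit's own intercept."""
    num = den = 0.0
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            num += (x - mx) * (y - my)
            den += (x - mx) ** 2
    return num / den

# Two units with very different intercepts but a common slope of 1.5:
panel = {
    "A": [(0, 10.0), (1, 11.5), (2, 13.0)],
    "B": [(0, 5.0), (1, 6.5), (2, 8.0)],
}
slope = within_estimator(panel)
```

The same idea underlies the classical linear panel models of Part I: heterogeneity across units is absorbed into fixed effects so that the remaining variation identifies the slope.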
Meta-Regression Analysis in Economics and Business is the first text devoted to the meta-regression analysis (MRA) of economics and business research. The book provides a comprehensive guide to conducting systematic reviews of empirical economics and business research, identifying and explaining the best practices of MRA, and highlighting its problems and pitfalls. These statistical techniques are illustrated using actual data from four published meta-analyses of business and economic research: the effects of unions on productivity, the employment effects of the minimum wage, the value of a statistical life, and residential water demand elasticities. While it shares some features with meta-analysis in other disciplines, meta-analysis in economics and business faces its own particular challenges and types of research data. This volume guides new researchers from beginning to end, from the collection of research to the publication of their findings. This book will be of great interest to students and researchers in business, economics, marketing, management, and political science, as well as to policy makers.
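At its core, MRA is a regression of reported effect sizes on study characteristics, usually weighted by each estimate's precision. A hedged sketch with toy numbers (not taken from the book's four meta-analyses; the function name is invented):

```python
def meta_regression(effects, moderator, se):
    """Weighted least squares of effect_i on a single study-level
    moderator, with precision weights 1/se_i^2; returns (b0, b1)."""
    w = [1.0 / s ** 2 for s in se]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, moderator)) / sw
    my = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, moderator))
    sxy = sum(wi * (xi - mx) * (yi - my)
              for wi, xi, yi in zip(w, moderator, effects))
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1

# Toy data: four studies whose reported effects rise with a moderator
# (e.g. sample size class), each with its own standard error.
effects = [0.10, 0.15, 0.20, 0.25]
moderator = [0.0, 1.0, 2.0, 3.0]
se = [0.1, 0.2, 0.1, 0.3]
b0, b1 = meta_regression(effects, moderator, se)
```

A significant `b1` would indicate that reported effects vary systematically with the study characteristic, which is exactly the kind of pattern (including publication bias) that MRA is designed to detect.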
This book introduces a new way to analyze multivariate data. The analysis of data based on multivariate spatial signs and ranks proceeds very much as does a traditional multivariate analysis relying on the assumption of multivariate normality: the L2 norm is just replaced by different L1 norms, observation vectors are replaced by their (standardized and centered) spatial signs and ranks, and so on. The methods are fairly efficient and robust, and no moment assumptions are needed. A unified theory starting with the simple one-sample location problem and proceeding through the several-sample location problems to the general multivariate linear regression model and finally to the analysis of cluster-dependent data is presented. The material is divided into 14 chapters. Chapter 1 serves as a short introduction to the general ideas and strategies followed in the book. Chapter 2 introduces and discusses different types of parametric, nonparametric, and semiparametric statistical models used to analyze the multivariate data. Chapter 3 provides general descriptive tools to describe the properties of multivariate distributions and multivariate datasets. Multivariate location and scatter functionals and statistics and their use are described in detail. Chapter 4 introduces the concepts of multivariate spatial sign, signed-rank, and rank, and shows their connection to certain L1 objective functions. Also sign and rank covariance matrices are discussed carefully. The first four chapters thus provide the necessary tools to understand the remaining part of the book.
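The basic building block named here, the multivariate spatial sign, can be stated in a few lines. A minimal sketch (function name invented): the spatial sign of an observation is the centered vector scaled to unit length, and sign-based methods work with this direction vector instead of the raw observation, which is why no moment assumptions are needed.

```python
import math

def spatial_sign(x, center):
    """Spatial sign U(x - c) = (x - c) / ||x - c|| (Euclidean norm),
    or the zero vector when x equals the center."""
    d = [xi - ci for xi, ci in zip(x, center)]
    norm = math.sqrt(sum(di * di for di in d))
    return [di / norm for di in d] if norm > 0 else [0.0] * len(x)

u = spatial_sign([3.0, 4.0], [0.0, 0.0])
```

Because only the direction of each centered observation enters, a single wildly outlying observation changes its spatial sign no more than any other point on the unit sphere, which is the intuition behind the robustness claims above.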
This book provides cutting-edge research results and application experiences from researchers and practitioners in multiple criteria decision making areas. It consists of three parts: MCDM Foundation and Theory, MCDM Methodology, and MCDM Applications. Part I covers the historical development of MCDM, the influence of MCDM on technology, society and policy, Pareto optimization, and the analytic hierarchy process. Part II presents different MCDM algorithms based on techniques of robust estimation, evolutionary multiobjective optimization, Choquet integrals, and genetic search. Part III demonstrates a variety of MCDM applications, including project management, financial investment, credit risk analysis, railway transportation, online advertising, transport infrastructure, environmental pollution, the chemical industry, and regional economy. The 17 papers in the book were selected from the 121 papers accepted at the 20th International Conference on Multiple Criteria Decision Making, "New State of MCDM in 21st Century," held in Chengdu, China, in 2009. The 35 contributors of these papers come from 10 countries.
The Analytic Network Process (ANP) developed by Thomas Saaty in his work on multicriteria decision making applies network structures with dependence and feedback to complex decision making. This book is a selection of applications of ANP to economic, social and political decisions, and also to technological design. The chapters comprise contributions of scholars, consultants and people concerned about the outcome of certain important decisions who applied the Analytic Network Process to determine the best outcome for each decision from among several potential outcomes. The ANP is a methodological tool that helps to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify those judgments and derive priorities from them, and finally synthesize these diverse priorities into a single mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The ANP offers economists a considerably different approach for dealing with economic problems than the usual quantitative models. The ANP approach is based on absolute scales used to represent pairwise comparison judgments in the context of dominance with respect to a property shared by the homogeneous elements being compared: how much, or how many times more, does A dominate B with respect to property P? People are able to answer this question by using words to indicate the intensity of dominance (equal, moderate, strong, very strong, and extreme), as all of us are biologically equipped to do; the conversion of these words to numbers, their validation, and their extension to inhomogeneous elements form the foundation of the AHP/ANP.
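The derivation of priorities from pairwise dominance judgments described above is, at its core, an eigenvector computation. A hedged sketch (the comparison matrix is a made-up example, and this shows the AHP-style principal-eigenvector step for a single matrix rather than a full ANP supermatrix with feedback):

```python
def priorities(matrix, iters=100):
    """Approximate the principal right eigenvector of a positive
    reciprocal pairwise-comparison matrix by power iteration,
    normalized to sum to 1 (the priority vector)."""
    n = len(matrix)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        v = [wi / s for wi in w]
    return v

# Made-up judgments: A dominates B moderately (3) and C strongly (5);
# B dominates C (2). Lower triangle holds the reciprocals.
m = [[1.0,     3.0,     5.0],
     [1.0 / 3, 1.0,     2.0],
     [1.0 / 5, 1.0 / 2, 1.0]]
p = priorities(m)
```

The resulting vector ranks A above B above C, mirroring the verbal dominance judgments; in a full ANP these local priority vectors would be assembled into a supermatrix to capture dependence and feedback across clusters.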
Numerous applications of the ANP have been made to economic problems, among them the prediction of the turn-around dates for the US economy in the early 1990s and again in 2001, whose accuracy and validity were both confirmed later in the news. These predictions were based on the process of comparisons of mostly intangible factors rather than on financial, employment and other data and statistics.
The small sample properties of estimators and tests are frequently too complex to be useful or are unknown. Much econometric theory is therefore developed for very large or asymptotic samples where it is assumed that the behaviour of estimators and tests will adequately represent their properties in small samples. Refined asymptotic methods adopt an intermediate position by providing improved approximations to small sample behaviour using asymptotic expansions. Dedicated to the memory of Michael Magdalinos, whose work is a major contribution to this area, this book contains chapters directly concerned with refined asymptotic methods. In addition, there are chapters focussing on new asymptotic results; the exploration through simulation of the small sample behaviour of estimators and tests in panel data models; and improvements in methodology. With contributions from leading econometricians, this collection will be essential reading for researchers and graduate students concerned with the use of asymptotic methods in econometric analysis.
In recent years there has been a growing interest in and concern for the development of a sound spatial statistical body of theory. This work has been undertaken by geographers, statisticians, regional scientists, econometricians, and others (e.g., sociologists). It has led to the publication of a number of books, including Cliff and Ord's Spatial Processes (1981), Bartlett's The Statistical Analysis of Spatial Pattern (1975), Ripley's Spatial Statistics (1981), Paelinck and Klaassen's Spatial Econometrics (1979), Ahuja and Schachter's Pattern Models (1983), and Upton and Fingleton's Spatial Data Analysis by Example (1985). The first of these books presents a useful introduction to the topic of spatial autocorrelation, focusing on autocorrelation indices and their sampling distributions. The second of these books is quite brief, but nevertheless furnishes an eloquent introduction to the relationship between spatial autoregressive and two-dimensional spectral models. Ripley's book virtually ignores autoregressive and trend surface modelling, and focuses almost solely on point pattern analysis. Paelinck and Klaassen's book closely follows an econometric textbook format, and as a result overlooks much of the important material necessary for successful spatial data analysis. It almost exclusively addresses distance and gravity models, with some treatment of autoregressive modelling. Pattern Models supplements Cliff and Ord's book, which in combination provide a good introduction to spatial data analysis. Its basic limitation is a preoccupation with the geometry of planar patterns, and hence it is very narrow in scope.