In the autumn of 1961 Jan Salomon ('Mars') Cramer was appointed to the newly established chair of econometrics at the University of Amsterdam. This volume is published to commemorate that event. It is well known how much econometrics developed over the period under consideration, the 25 years between 1961 and 1986. This is especially true of the areas in which Cramer has been actively interested: the theory and measurement of consumer behaviour; money and income; regression, correlation and forecasting. The present volume highlights this development. Sixteen contributions have been solicited from scholars all over the world who have belonged to Cramer's circle of academic friends for a shorter or longer part of those 25 years. The contributions fall, broadly speaking, into the four areas mentioned above. Theory and measurement of consumer behaviour is represented by four papers, and a fifth paper deals with a related area. Richard Blundell and Costas Meghir devote a paper to the estimation of Engel curves. They apply a discrete choice model to British (individual) data from the Family Expenditure Survey 1981. Their aim is to assess the impact of individual characteristics such as income, demographic structure, location, wages and prices on commodity expenditure.
The econometric consequences of nonstationary data have wide-ranging implications for empirical research in economics. Specifically, these issues have implications for the study of empirical relations, such as a money demand function, that link macroeconomic aggregates: real money balances, real income and a nominal interest rate. Traditional monetary theory predicts that these nonstationary series form a cointegrating relation and, accordingly, that the dynamics of a vector process comprised of these variables generate distinct patterns. Recent econometric developments designed to cope with nonstationarities have changed the course of empirical research in the area, but many fundamental challenges, for example the issue of identification, remain. This book represents the authors' efforts in recent years to determine the consequences that nonstationarity has for the study of aggregate money demand relations. We have brought together an empirical methodology that we find useful in conducting empirical research. Some of the work was undertaken during the authors' sabbatical periods, and we wish to acknowledge the generous support of Arizona State University and Michigan State University respectively. Professor Hoffman wishes to acknowledge the support of the Fulbright-Hays Foundation, which supported sabbatical research in Europe, and the separate support of the Council of 100 Summer Research Program at Arizona State University.
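As a point of reference, the residual-based (Engle-Granger style) cointegration check that underlies much of this literature can be sketched in a few lines. The sketch below uses simulated data and standard statsmodels routines; it illustrates the general technique, not the authors' own methodology, and the p-value of a plain ADF test is only indicative here, since residual-based tests require their own critical values.

```python
# Illustrative Engle-Granger style check on simulated data (not the book's code).
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 500
income = np.cumsum(rng.normal(size=n))   # simulated I(1) real income
rate = np.cumsum(rng.normal(size=n))     # simulated I(1) nominal interest rate
# real balances built to share the stochastic trends, so cointegration holds
money = 1.0 * income - 0.5 * rate + rng.normal(scale=0.5, size=n)

X = sm.add_constant(np.column_stack([income, rate]))
resid = sm.OLS(money, X).fit().resid     # candidate equilibrium error
stat = adfuller(resid)[0]                # unit-root test on the residuals
print(f"ADF statistic on residuals: {stat:.2f}")
# a strongly negative statistic (stationary residuals) supports cointegration;
# note that proper critical values for residual-based tests differ from ADF's
```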
Lawrence Klein, University of Pennsylvania; Jaime Marquez, Federal Reserve Board. An examination of the economics literature over the last twenty years reveals a marked tendency towards polarisation. On the one hand, there has been a propensity to develop theoretical models which have little connection with either empirical verification or problems requiring immediate attention. On the other hand, empirical analyses are generally typified by testing for its own sake, with limited examination of the implications of the results. As a result, the number of papers confronting theory with facts towards the solution of economic problems has been on the decline for years. To fill this growing gap in the literature, we have invited a number of authors to write papers using both theoretical and empirical techniques to address current issues of interest to the profession at large: the US trade deficit and the global implications of policies that attempt to reduce it, the international ramifications of the debt crisis, the international oil market and its implications for the US oil industry, and the development of new econometric techniques. In addressing these issues, each author has approached the subject matter from an eclectic standpoint, that is, avoiding strict adherence to a given doctrine.
The purpose of this volume is to honour a pioneer in the field of econometrics, A. L. Nagar, on the occasion of his sixtieth birthday. Fourteen econometricians from six countries on four continents have contributed to this project. One of us was his teacher, some of us were his students, many of us were his colleagues, all of us are his friends. Our volume opens with a paper by L. R. Klein which discusses the meaning and role of exogenous variables in structural and vector-autoregressive econometric models. Several examples from recent macroeconomic history are presented and the notion of Granger causality is discussed. This is followed by two papers dealing with an issue of considerable relevance to developing countries, such as India: the measurement of inequality in the distribution of income. The paper by C. T. West and H. Theil deals with the problem of measuring inequality of all components of total income within a region, rather than just labour income. It applies its results to the regions of the United States. The second paper in this group, by N. Kakwani, derives the large-sample distributions of several popular inequality measures, thus providing a method for drawing large-sample inferences about the differences in inequality between regions. The techniques are applied to the regions of Cote d'Ivoire. The next group of papers is devoted to econometric theory in the context of the dynamic, simultaneous, linear equations model. The first, by P. J.
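For readers unfamiliar with the inequality measures mentioned here, a minimal sketch of one of the most popular, the Gini coefficient, is given below. It uses a standard textbook formula on made-up incomes; it is not Kakwani's large-sample machinery.

```python
# Minimal Gini coefficient sketch (standard formula, hypothetical incomes).
import numpy as np

def gini(incomes):
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    # mean-difference form: G = sum((2i - n - 1) * x_(i)) / (n^2 * mean(x))
    return ((2 * i - n - 1) @ x) / (n * n * x.mean())

print(f"Gini: {gini([10, 20, 30, 40, 100]):.3f}")  # 0 = equality, 1 = maximal
```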
This volume provides a general overview of the econometrics of panel data, both from a theoretical and from an applied viewpoint. Since the pioneering papers by Kuh (1959), Mundlak (1961), Hoch (1962), and Balestra and Nerlove (1966), the pooling of cross-section and time-series data has become an increasingly popular way of quantifying economic relationships. Each series provides information lacking in the other, so a combination of both leads to more accurate and reliable results than would be achievable by one type of series alone. Over the last 30 years much work has been done: investigation of the properties of the applied estimators and test statistics, analysis of dynamic models and the effects of possible measurement errors, etc. These are just some of the problems addressed by this work. In addition, some specific difficulties associated with the use of panel data, such as attrition, heterogeneity, selectivity bias, pseudo panels, etc., have also been explored. The first objective of this book, which takes up Parts I and II, is to give as complete and up-to-date a presentation of these theoretical developments as possible. Part I is concerned with classical linear models and their extensions; Part II deals with nonlinear models and related issues: logit and probit models, latent variable models, incomplete panels and selectivity bias, and point processes. The second objective is to provide insights into the use of panel data in empirical studies. Since the beginning, interest in panel data has been empirically based, and over time it has become increasingly important in applied economic studies, as demonstrated by the growing number of conferences and special issues of economic journals devoted to the subject. Part III deals with studies in several major fields of applied economics, such as labour and investment demand, labour supply, consumption, transitions on the labour market, and finance. The double emphasis of this book (theoretical and applied), together with the fact that all the chapters have been written by well-known specialists in the field, ensures that it will become a standard textbook for all those who are concerned with the use of panel data in econometrics, whether they are advanced students, professional economists or researchers.
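To make the pooling idea concrete, here is a minimal within (fixed-effects) sketch on simulated data: demeaning each unit's observations sweeps out the time-invariant heterogeneity that biases a naive pooled estimate. This is a generic illustration, not code from the volume.

```python
# Within (fixed-effects) estimator on a simulated panel; illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n_units, n_periods = 100, 8
unit = np.repeat(np.arange(n_units), n_periods)
alpha = rng.normal(size=n_units)[unit]            # unobserved unit effect
x = alpha + rng.normal(size=unit.size)            # regressor correlated with it
y = 2.0 * x + alpha + rng.normal(size=unit.size)  # true slope is 2

df = pd.DataFrame({"unit": unit, "x": x, "y": y})
# demean within each unit: removes the time-invariant effect
dm = df[["x", "y"]] - df.groupby("unit")[["x", "y"]].transform("mean")
beta_within = (dm["x"] @ dm["y"]) / (dm["x"] @ dm["x"])
xc, yc = df["x"] - df["x"].mean(), df["y"] - df["y"].mean()
beta_pooled = (xc @ yc) / (xc @ xc)               # biased by the unit effect
print(f"within: {beta_within:.3f}  pooled: {beta_pooled:.3f}  (true: 2.0)")
```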
This completely revised and enhanced second edition of the volume first published in 1992 provides a general overview of the econometrics of panel data, both from a theoretical and from an applied viewpoint. Since the pioneering papers by Kuh (1959), Mundlak (1961), Hoch (1962), and Balestra and Nerlove (1966), the pooling of cross-section and time-series data has become an increasingly popular way of quantifying economic relationships. Each series provides information lacking in the other, so a combination of both leads to more accurate and reliable results than would be achievable by one type of series alone. Much work has been done over the last three decades: investigation of the properties of the applied estimators and test statistics, analysis of dynamic models and the effects of possible measurement errors, etc. These are just some of the problems addressed by this work. In addition, some specific difficulties associated with the use of panel data are also explored, such as attrition, heterogeneity, selectivity bias, pseudo-panels, etc. The second, enhanced edition provides a complete and up-to-date presentation of these theoretical developments. Part I is concerned with classical linear models and their extensions; Part II deals with nonlinear models and related issues: logit and probit models, latent variable models, incomplete panels and selectivity bias, point processes, etc. Nine additional chapters about instrumental variables and generalized method of moments estimators, duration models, count data models, simulation methods, etc. have been included. This volume also provides insights into the use of panel data in empirical studies. Part III deals with surveys in several major fields of applied economics, such as labour and investment demand, labour supply, consumption, transitions on the labour market, and finance. Two new chapters about foreign investment and production frontiers have been included. Audience: The double emphasis of the book (theoretical and applied), together with the fact that all the chapters have been written by well-known specialists in the field, means that it will become a standard reference for all those concerned with the use of panel data in econometrics: advanced students, professional economists or researchers.
Giovanni Castellani, Rector of the University of Venice. This book contains the Proceedings of the Conference on "Economic Policy and Control Theory", which was held at the University of Venice (Italy) from 27 January to 1 February 1985. The goal of the Conference was to survey the main developments of control theory in economics, emphasizing in particular new achievements in the analysis of dynamic economic models by control methods. The development of control theory is closely related to the development of science and technology over the last forty years. Control theory was indeed applied mainly in engineering, and only in the sixties did economists start using control methods for analysing economic problems, even if some preliminary economic applications of the calculus of variations, from which control theory was later developed, date back to the twenties. Applications of control theory in economics also had to solve new, complicated problems, like those encountered in optimal growth models, or the determination of the appropriate intertemporal social welfare function, of the policy horizon and the corresponding final state of the system, and of the appropriate discount factor. Furthermore, the uncertainty characterizing economic models had to be taken into account, thus giving rise to the development of stochastic control theory in economics.
This book arises from the Fourth European Colloquium on Theoretical and Quantitative Geography, which was held in Veldhoven, The Netherlands, in September 1985. It contains a series of papers on spatial choice dynamics and dynamical spatial systems which were presented at the colloquium, together with a few other solicited ones. The book is intended primarily as a state-of-the-art review of mainly European research on these two fast-growing problem areas. As a consequence of this decision, the book contains a selection of papers that differ in terms of focus, level of sophistication and conceptual background. Evidently, the dissemination of ideas and computer software is a time-related phenomenon, which in the European context is amplified by differences in language, the profile of geography and the formal training of geographers. The book reflects such differences. It would have been impossible to produce this book without the support of the various European study groups on theoretical and quantitative geography. Without their help the meetings from which this volume originates would not have been held in the first place. We are also indebted to the Royal Dutch Academy of Science for partly funding the colloquium, and to SISWO and TNO/PSC for providing general support in the organisation of the conference.
The origins of this volume can be traced back to a conference on "Ethics, Economics and Business" organized by Columbia Business School in March of 1993 and held in the splendid facilities of Columbia's Casa Italiana. Preliminary versions of several of the papers were presented at that meeting. In July 1994 the Fields Institute of Mathematical Sciences sponsored a workshop on "Geometry, Topology and Markets", where additional papers and more refined versions of the original papers were presented. They were published in their present versions in Social Choice and Welfare, volume 14, number 2, 1997. The common aim of these workshops and this volume is to crystallize research in an area which has emerged rapidly in the last fifteen years: topological approaches to social choice and the theory of games. The area is attracting increasing interest from social choice theorists, game theorists, mathematical economists and mathematicians, yet there is no authoritative collection of papers in the area, nor any survey or book to give a perspective on and act as a guide to the issues in and contributions to this new area. One of the two aims of this volume is in some measure to play this role; the other aim is of course to present interesting and surprising new results.
The field of econometrics has gone through remarkable changes during the last thirty-five years. Widening its earlier focus on testing macroeconomic theories, it has become a rather comprehensive discipline concerned with the development of statistical methods and their application to the whole spectrum of economic data. This development becomes apparent when looking at the biography of an econometrician whose illustrious research and teaching career started about thirty-five years ago and who will retire very soon after his 65th birthday: Gerd Hansen, professor of econometrics at the Christian Albrechts University at Kiel, to whom this volume with contributions from colleagues and students is dedicated. He has shaped the econometric landscape in and beyond Germany throughout these thirty-five years. At the end of the 1960s he developed one of the first econometric models of the German economy, which adhered closely to the traditions put forth by the Cowles Commission.
Measuring productive efficiency for nonprofit organizations has posed a great challenge to applied researchers today. The problem has many facets and diverse implications for a number of disciplines such as economics, applied statistics, management science and information theory. This monograph discusses four major areas, which emphasize the applied economic and econometric aspects of production frontier analysis: A. stochastic frontier theory; B. data envelopment analysis; C. clustering and estimation theory; D. economic and managerial applications. Besides containing an up-to-date survey of the most recent developments in the field, the monograph presents several new results and theorems from my own research. These include, but are not limited to, the following: (1) interface with parametric theory, (2) minimax and robust concepts of the production frontier, (3) game-theoretic extension of the Farrell and Johansen models, (4) optimal clustering techniques for data envelopment analysis and (5) dynamic and stochastic generalizations of the efficiency frontier at the micro and macro levels. In my research work in this field I have received great support and inspiration from Professor Abraham Charnes of the University of Texas at Austin, who essentially founded the technique of data envelopment analysis, developed it and is still expanding it. My interactions with him have been most fruitful and productive. I am deeply grateful to him. Finally, I must record my deep appreciation to my wife and two children for their loving and enduring support; without it this work would not have been completed.
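Of the four areas listed, data envelopment analysis is the easiest to show compactly. The sketch below solves the standard input-oriented CCR envelopment linear program for each unit with scipy; the numbers are invented, and this is the generic textbook formulation rather than anything specific to the monograph.

```python
# Input-oriented CCR DEA via linear programming; hypothetical toy data.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0, 9.0],     # inputs: one column per unit
              [3.0, 2.0, 7.0, 8.0]])
Y = np.array([[3.0, 4.0, 5.0, 6.0]])    # single output

n = X.shape[1]
for o in range(n):
    c = np.r_[1.0, np.zeros(n)]         # variables (theta, lambda); min theta
    # constraints: X @ lam <= theta * x_o   and   Y @ lam >= y_o
    A_ub = np.block([[-X[:, [o]], X],
                     [np.zeros((Y.shape[0], 1)), -Y]])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    print(f"unit {o}: technical efficiency = {res.x[0]:.3f}")
```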
The determinants of yield curve dynamics have been thoroughly discussed in finance models. However, little can be said about the macroeconomic factors behind the movements of short- and long-term interest rates, or about the risk compensation demanded by financial investors. By taking a macro-finance perspective, the book's approach explicitly acknowledges the close feedback between monetary policy, the macroeconomy and financial conditions. Both theoretical and empirical models are applied in order to gain a profound understanding of the interlinkages between economic activity, the conduct of monetary policy and the underlying macroeconomic factors of bond price movements. Moreover, the book identifies a broad risk-taking channel of monetary transmission which allows a reassessment of the role of financial constraints; it enables policy makers to develop new guidelines for monetary policy and for financial supervision on how to cope with evolving financial imbalances.
O. Guvenen, University of Paris IX-Dauphine. The aim of this publication is to present recent developments in international commodity market model building and policy analysis. This book is based mainly on the research presented at the XIIth International Conference organised by the Applied Econometric Association (AEA), which was held at the University of Zaragoza in Spain. This conference would not have been possible without the cooperation of the Department of Econometrics of the University of Zaragoza and its Chairman, A. A. Grasa. I would like to express my thanks to all contributors. I am grateful to J. H. P. Paelinck, J. P. Ancot, A. J. Hughes Hallett and H. Serbat for their constructive contributions and comments concerning the structure of the book. From the introduction (O. Guvenen): The challenge of increasing complexity and global interdependence at the world level necessitates new modelling approaches and policy analysis at the macroeconomic level, and for commodities. The evaluation of economic modelling follows the evolution of international economic phenomena. In that interdependent context there is a growing need for forecasting and simulation tools in the analysis of international primary commodity markets.
Spatial Microeconometrics introduces the reader to the basic concepts of spatial statistics, spatial econometrics and the spatial behavior of economic agents at the microeconomic level. Incorporating useful examples and presenting real data and datasets on real firms, the book takes the reader through the key topics in a systematic way. The book outlines what distinguishes data representing a set of interacting individuals from the setting of traditional econometrics, which treats locational choices as exogenous and economic behavior as independent. In particular, the authors address the consequences of neglecting such important sources of information for statistical inference, and show how exploiting them can improve models' predictive performance. The book presents the theory, clarifies the concepts and instructs the readers on how to perform their own analyses, describing in detail the code needed when using the statistical language R. The book is written by leading figures in the field and is completely up to date with the very latest research. It will be invaluable for graduate students and researchers in economic geography, regional science, spatial econometrics, spatial statistics and urban economics.
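The book's own examples are written in R; as a language-neutral flavour of what spatial dependence between individual units means statistically, here is a hand-rolled Moran's I on simulated firm locations. All names and numbers below are invented for illustration.

```python
# Hand-rolled Moran's I on simulated firm locations; purely illustrative.
import numpy as np

rng = np.random.default_rng(2)
n = 50
loc = rng.uniform(0, 10, size=(n, 2))       # hypothetical firm coordinates
dist = np.linalg.norm(loc[:, None] - loc[None, :], axis=2)
np.fill_diagonal(dist, np.inf)
W = np.zeros((n, n))
for i in range(n):
    W[i, np.argsort(dist[i])[:3]] = 1.0     # 3 nearest neighbours
W /= W.sum(axis=1, keepdims=True)           # row-standardise

y = rng.normal(size=n)                      # attribute with no spatial pattern
z = y - y.mean()
I = (n / W.sum()) * (z @ W @ z) / (z @ z)
print(f"Moran's I: {I:.3f} (expected about {-1/(n-1):.3f} under independence)")
```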
Statistical Methods in Econometrics is appropriate for beginning graduate courses in mathematical statistics and econometrics in which the foundations of probability and statistical theory are developed for application to econometric methodology. Because econometrics generally requires the study of several unknown parameters, emphasis is placed on estimation and hypothesis testing involving several parameters. Accordingly, special attention is paid to the multivariate normal distribution and the distribution of quadratic forms. Lagrange multiplier tests are discussed in considerable detail, along with the traditional likelihood ratio and Wald tests. Characteristic functions and their properties are fully exploited. Asymptotic distribution theory, usually given only cursory treatment, is also discussed in detail.
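As a taste of the distribution-of-quadratic-forms material, the simulation below checks the textbook fact that for z ~ N(0, I_k) the quadratic form z'z is chi-square with k degrees of freedom. It is a generic illustration, not an exercise from the book.

```python
# Simulated check that z'z ~ chi-square(k) for z ~ N(0, I_k); illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
k, reps = 5, 100_000
q = (rng.normal(size=(reps, k)) ** 2).sum(axis=1)  # z'z for each draw
crit = stats.chi2.ppf(0.95, df=k)                  # 95% critical value
print(f"P(z'z > {crit:.2f}) ~ {(q > crit).mean():.4f} (theory: 0.0500)")
```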
This work grew out of a series of investigations begun by the authors in 1980 and 1981. Specifically, the authors pursued two lines of inquiry. First, to advance the state of the theoretical literature to better explain the crises of liberalization which seemed to be afflicting the third world in general and Latin America in particular. To do this, several different kinds of models were investigated and adapted. These are presented in Chapters 2, 3 and 5. Secondly, an analysis of the empirical evidence was conducted in order to gain insight into the processes that were thought to be occurring and the theoretical models that were being developed. Some of this work appears in Chapters 3, 4, 5 and 6. Other work by the authors on these issues has been published elsewhere and is referenced herein. There are a great many people whose work and whose comments have influenced this work. We would like to especially thank Guillermo Calvo, Michael Connolly, Sebastian Edwards, Roque Fernandez, Michael Darby, Robert Clower, Neil Wallace, John Kareken, Paul McNelis, Jeffrey Nugent, Jaime Marquez, Lee Ohanian, Leroy Laney, Jorge Braga de Macedo, Dale Henderson, Matthew Canzoneri, Arthur Laffer, Marc Miles, and George Von Furstenberg, whose ideas and comments gave rise to much of our work. We would like to thank Suh Lee for his assistance with the computations in Chapter 5.
Max-Min problems are two-step allocation problems in which one side must make his move knowing that the other side will then learn what the move is and optimally counter. They are fundamental in particular to military weapons-selection problems involving large systems such as Minuteman or Polaris, where the systems in the mix are so large that they cannot be concealed from an opponent. One must then expect the opponent to decide on an optimal mixture of, in the case mentioned above, anti-Minuteman and anti-submarine effort. The author's first introduction to a problem of Max-Min type occurred at The RAND Corporation about 1951. One side allocates anti-missile defenses to various cities. The other side observes this allocation and then allocates missiles to those cities. If F(x, y) denotes the total residual value of the cities after the attack, with x denoting the defender's strategy and y the attacker's, the problem is then to find Max_x Min_y F(x, y).
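The two-step structure described here (the defender commits, the attacker observes and counters) can be made concrete with a tiny brute-force enumeration. The city values, force sizes, and the survival rule below are all invented for illustration; the book's actual models are far richer.

```python
# Brute-force max-min allocation on a toy defender/attacker game; all numbers
# and the survival rule are hypothetical.
import itertools

values = [8.0, 5.0, 3.0]       # hypothetical city values
D, A = 4, 3                    # defender units, attacker units

def allocations(total, bins):
    # every way to split `total` identical units over `bins` targets
    for cuts in itertools.combinations(range(total + bins - 1), bins - 1):
        parts, prev = [], -1
        for c in cuts + (total + bins - 1,):
            parts.append(c - prev - 1)
            prev = c
        yield tuple(parts)

def residual(defense, attack):
    # toy rule: city i retains the fraction d_i / (d_i + a_i) of its value
    return sum(v * (d / (d + a) if d + a else 1.0)
               for v, d, a in zip(values, defense, attack))

best = max(
    (min(residual(x, y) for y in allocations(A, len(values))), x)
    for x in allocations(D, len(values))
)
print(f"max-min residual value {best[0]:.2f} with defense {best[1]}")
```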
"Mathematical Optimization and Economic Analysis" is a self-contained introduction to various optimization techniques used in economic modeling and analysis such as geometric, linear, and convex programming and data envelopment analysis. Through a systematic approach, this book demonstrates the usefulness of these mathematical tools in quantitative and qualitative economic analysis. The book presents specific examples to demonstrate each technique's advantages and applicability as well as numerous applications of these techniques to industrial economics, regulatory economics, trade policy, economic sustainability, production planning, and environmental policy. Key Features include: - A detailed presentation of both single-objective and multiobjective optimization; - An in-depth exposition of various applied optimization problems; - Implementation of optimization tools to improve the accuracy of various economic models; - Extensive resources suggested for further reading. This book is intended for graduate and postgraduate students studying quantitative economics, as well as economics researchers and applied mathematicians. Requirements include a basic knowledge of calculus and linear algebra, and a familiarity with economic modeling.
This book explains in simple settings the fundamental ideas of financial market modelling and derivative pricing, using the no-arbitrage principle. Relatively elementary mathematics leads to powerful notions and techniques - such as viability, completeness, self-financing and replicating strategies, arbitrage and equivalent martingale measures - which are directly applicable in practice. The general methods are applied in detail to pricing and hedging European and American options within the Cox-Ross-Rubinstein (CRR) binomial tree model. A simple approach to discrete interest rate models is included, which, though elementary, has some novel features. All proofs are written in a user-friendly manner, with each step carefully explained and following a natural flow of thought. In this way the student learns how to tackle new problems.
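The CRR model itself fits in a dozen lines. Below is a standard textbook backward-induction pricer for a European call; it follows the usual parameterisation rather than the book's exact notation.

```python
# Standard CRR binomial pricer for a European call (textbook form).
import math

def crr_call(S0, K, r, sigma, T, n):
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))     # up factor
    d = 1.0 / u                             # down factor
    q = (math.exp(r * dt) - d) / (u - d)    # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs, then discount backwards under the martingale measure
    v = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for _ in range(n):
        v = [disc * (q * v[j + 1] + (1 - q) * v[j]) for j in range(len(v) - 1)]
    return v[0]

print(f"CRR call: {crr_call(100, 100, 0.05, 0.2, 1.0, 200):.4f}")  # ~10.45
```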
The book deals with collusion between firms on both sides of a market that is immune to deviations by coalitions. We study this issue using an infinitely repeated game with discounting of future single-period payoffs. A strict strong perfect equilibrium is the main solution concept that we apply. It requires that no coalition of players, in any subgame, can weakly Pareto-improve the vector of continuation average discounted payoffs of its members by deviating. If the sum of firms' average discounted profits is maximized along the equilibrium path, then the equilibrium output of each type of good is produced at the lowest possible cost. If, in addition, all buyers are retailers (i.e., they resell the goods purchased in the analyzed market in a retail market), then the equilibrium vector of quantities sold in the retail market is sold at the lowest possible selling cost. We specify sufficient conditions under which collusion increases consumer welfare.
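For reference, the average discounted payoff criterion referred to here is the standard normalisation (a textbook definition, not a quotation from the book): a stream of stage payoffs u_i(a_t) is evaluated as

$$\bar{u}_i = (1-\delta)\sum_{t=0}^{\infty} \delta^{t}\, u_i(a_t), \qquad 0 < \delta < 1,$$

so that a constant stage payoff of u yields an average discounted payoff of exactly u.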
Financial globalization has increased the significance of methods used in the evaluation of country risk, one of the major research topics in economics and finance. Written by experts in the fields of multicriteria methodology, credit risk assessment, operations research, and financial management, this book develops a comprehensive framework for evaluating models based on several classification techniques that emerge from different theoretical directions. This book compares different statistical and data mining techniques, noting the advantages of each method, and introduces new multicriteria methodologies that are important to country risk modeling. Key topics include: (1) A review of country risk definitions and an overview of the most recent tools in country risk management, (2) In-depth analysis of statistical, econometric and non-parametric classification techniques, (3) Several real-world applications of the methodologies described throughout the text, (4) Future research directions for country risk assessment problems. This work is a useful toolkit for economists, financial managers, bank managers, operations researchers, management scientists, and risk analysts. Moreover, the book can also be used as a supplementary text for graduate courses in finance and financial risk management.
The aim of the book is to provide an overview of risk management in life insurance companies. The focus is twofold: (1) to provide a broad view of the different topics needed for risk management and (2) to provide the necessary tools and techniques to concretely apply them in practice. Much emphasis has been put into the presentation of the book so that it presents the theory in a simple but sound manner. The first chapters deal with valuation concepts, which are defined and analysed; the emphasis is on understanding the risks in corresponding assets and liabilities such as bonds, shares and also insurance liabilities. In the following chapters, risk appetite and key insurance processes and their risks are presented and analysed. This more general treatment is followed by chapters describing asset risks, insurance risks and operational risks; the application of models and the reporting of the corresponding risks are central. Next, the risks of insurance companies and of special insurance products are looked at. The aim is to show the intrinsic risks in some particular products and the way they can be analysed. The book finishes with emerging risks and risk management from a regulatory point of view; the standard model of Solvency II and the Swiss Solvency Test are analysed and explained. The book has several mathematical appendices which deal with the basic mathematical tools, e.g. probability theory, stochastic processes, Markov chains and a stochastic life insurance model based on Markov chains. Moreover, the appendices look at the mathematical formulation of abstract valuation concepts such as replicating portfolios, state space deflators, arbitrage-free pricing and the valuation of unit-linked products with guarantees. The various concepts in the book are supported by tables and figures.
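To give a flavour of the Markov-chain machinery in the appendices, here is a toy three-state (alive/disabled/dead) chain with invented annual transition probabilities; a real life table would replace the matrix below.

```python
# Toy alive/disabled/dead Markov chain with hypothetical transition rates.
import numpy as np

P = np.array([[0.95, 0.03, 0.02],   # alive    -> alive / disabled / dead
              [0.10, 0.80, 0.10],   # disabled -> alive / disabled / dead
              [0.00, 0.00, 1.00]])  # dead is absorbing
state = np.array([1.0, 0.0, 0.0])   # start alive
for year in range(1, 6):
    state = state @ P               # one-year transition
    print(f"year {year}: P(dead) = {state[2]:.3f}")
```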
You may like...
- Handbook of Experimental Game Theory, by C. M. Capra, Rachel T. A. Croson, … (Hardcover, R6,432)
- Design and Analysis of Time Series…, by Richard McCleary, David McDowall, … (Hardcover, R3,326)
- The Economics of the Super Bowl…, by Yvan J Kelly, David Berri, … (Hardcover, R1,875)
- Tax Policy and Uncertainty - Modelling…, by Christopher Ball, John Creedy, … (Hardcover, R2,656)
- Pricing Decisions in the Euro Area - How…, by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,179)
- Spatial Analysis Using Big Data…, by Yoshiki Yamagata, Hajime Seya (Paperback, R3,055)
- Handbook of Research Methods and…, by Nigar Hashimzade, Michael A. Thornton (Hardcover, R7,916)