The finite-dimensional nonlinear complementarity problem (NCP) is a system of finitely many nonlinear inequalities in finitely many nonnegative variables along with a special equation that expresses the complementary relationship between the variables and corresponding inequalities. This complementarity condition is the key feature distinguishing the NCP from a general inequality system, lies at the heart of all constrained optimization problems in finite dimensions, provides a powerful framework for the modeling of equilibria of many kinds, and exhibits a natural link between smooth and nonsmooth mathematics. The finite-dimensional variational inequality (VI), which is a generalization of the NCP, provides a broad unifying setting for the study of optimization and equilibrium problems and serves as the main computational framework for the practical solution of a host of continuum problems in the mathematical sciences. The systematic study of the finite-dimensional NCP and VI began in the mid-1960s; in a span of four decades, the subject has developed into a very fruitful discipline in the field of mathematical programming. The developments include a rich mathematical theory, a host of effective solution algorithms, a multitude of interesting connections to numerous disciplines, and a wide range of important applications in engineering and economics. As a result of their broad associations, the literature of the VI/CP has benefited from contributions made by mathematicians (pure, applied, and computational), computer scientists, engineers of many kinds (civil, chemical, electrical, mechanical, and systems), and economists of diverse expertise (agricultural, computational, energy, financial, and spatial).
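For orientation, the two concepts described here have compact standard formulations (textbook notation, not a quotation from the blurb). Given a map F from R^n to R^n, the NCP asks for a vector satisfying the inequalities together with the complementarity equation,

\[
\text{NCP}(F):\quad \text{find } x \ge 0 \text{ such that } F(x) \ge 0 \text{ and } x^{\top} F(x) = 0,
\]

while the VI generalizes this from the nonnegative orthant to an arbitrary closed convex set K:

\[
\text{VI}(K,F):\quad \text{find } x \in K \text{ such that } (y - x)^{\top} F(x) \ge 0 \text{ for all } y \in K.
\]

Taking K = R^n_+ in the VI recovers the NCP, which is the sense in which the VI is the broader setting.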
The basic characteristic of Modern Linear and Nonlinear Econometrics is that it presents a unified approach to modern linear and nonlinear econometrics in a concise and intuitive way. It covers four major parts of modern econometrics: linear and nonlinear estimation and testing, time series analysis, models with categorical and limited dependent variables, and, finally, a thorough analysis of linear and nonlinear panel data modeling. Distinctive features of this handbook are:
- A unified approach to both linear and nonlinear econometrics, with an integration of theory and practice in modern econometrics.
- Emphasis on sound theoretical and empirical relevance and intuition.
- Focus on econometric and statistical methods for the analysis of linear and nonlinear processes in economics and finance, including computational methods and numerical tools.
- Completely worked-out empirical illustrations provided throughout; the macroeconomic and microeconomic (household- and firm-level) data sets used are available from the internet. These illustrations are taken from finance (e.g. CAPM and derivatives), international economics (e.g. exchange rates), innovation economics (e.g. patenting), business cycle analysis, monetary economics, housing economics, labor and educational economics (e.g. demand for teachers by gender) and many others.
- Exercises added to the chapters, with a focus on the interpretation of results; several of these exercises involve the use of actual data that are typical of current empirical work and that are made available on the internet.
What also distinguishes Modern Linear and Nonlinear Econometrics is that every major topic is accompanied by examples, exercises or case studies. This 'learning by doing' method is intended to prepare readers to design, develop and successfully complete their own research and to solve real-world problems.
Sample data alone never suffice to draw conclusions about populations. Inference always requires assumptions about the population and the sampling process. Statistical theory has revealed much about how the strength of assumptions affects the precision of point estimates, but it has had much less to say about how it affects the identification of population parameters. Indeed, it has been commonplace to think of identification as a binary event - a parameter is either identified or not - and to view point identification as a precondition for inference. Yet there is enormous scope for fruitful inference using data and assumptions that only partially identify population parameters. This book explains why and shows how. The book presents in a rigorous and thorough manner the main elements of Charles Manski's research on partial identification of probability distributions. One focus is prediction with missing outcome or covariate data. Another is decomposition of finite mixtures, with application to the analysis of contaminated sampling and ecological inference. A third major focus is the analysis of treatment response. Whatever the particular subject under study, the presentation follows a common path. The author first specifies the sampling process generating the available data and asks what may be learned about population parameters using the empirical evidence alone. He then asks how the (typically) set-valued identification regions for these parameters shrink if various assumptions are imposed. The approach to inference that runs throughout the book is deliberately conservative and thoroughly nonparametric. Conservative nonparametric analysis enables researchers to learn from the available data without imposing untenable assumptions. It enables the establishment of a domain of consensus among researchers who may hold disparate beliefs about what assumptions are appropriate. Charles F. Manski is Board of Trustees Professor at Northwestern University. He is author of Identification Problems in the Social Sciences and Analog Estimation Methods in Econometrics. He is a Fellow of the American Academy of Arts and Sciences, the American Association for the Advancement of Science, and the Econometric Society.
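A standard worst-case bound from this literature illustrates the idea (a textbook instance of the approach, not a passage from the book). Suppose the outcome y lies in [0, 1] and is observed only when an indicator z equals 1. By the law of total expectation,

\[
E[y] = E[y \mid z=1]\,P(z=1) + E[y \mid z=0]\,P(z=0),
\]

and the data reveal everything on the right-hand side except E[y | z = 0], which can be anywhere in [0, 1]. The empirical evidence alone therefore identifies only the interval

\[
E[y] \in \bigl[\,E[y \mid z=1]\,P(z=1),\; E[y \mid z=1]\,P(z=1) + P(z=0)\,\bigr],
\]

an identification region of width P(z = 0) that shrinks only as assumptions about the missing outcomes are added.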
To derive rational and convincing solutions to practical decision making problems in complex and hierarchical human organizations, the decision making problems are formulated as relevant mathematical programming problems, which are solved by developing optimization techniques that exploit characteristics or structural features of the formulated problems. In particular, for resolving conflict in decision making in hierarchical managerial or public organizations, the multi-level formulation of mathematical programming problems has often been employed together with the solution concept of Stackelberg equilibrium. However, we conceive that the pair of the conventional formulation and this solution concept is not always sufficient to cope with the large variety of decision making situations in actual hierarchical organizations. The following issues should be taken into consideration in the expression and formulation of decision making problems. In the formulation of mathematical programming problems, it is tacitly supposed that decisions are made by a single person, while game theory deals with the economic behavior of multiple decision makers with fully rational judgment. Because two-level mathematical programming problems are interpreted as static Stackelberg games, multi-level mathematical programming is relevant to noncooperative game theory; in conventional multi-level mathematical programming models employing the solution concept of Stackelberg equilibrium, it is assumed that there is no communication among decision makers, or that they do not make any binding agreement even if such communication exists. However, for decision making problems in, for example, decentralized large firms with divisional independence, it is quite natural to suppose that there exist communication and some cooperative relationship among the decision makers.
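The two-level (Stackelberg) structure mentioned here has a standard generic form (a textbook formulation, not one quoted from this book): an upper-level decision maker (the leader) chooses x anticipating the optimal reaction of a lower-level decision maker (the follower),

\[
\min_{x \in X} \; F\bigl(x, y^{*}(x)\bigr) \quad \text{where} \quad y^{*}(x) \in \arg\min_{y \in Y(x)} f(x, y),
\]

and a Stackelberg equilibrium is a pair (x, y*(x)) solving this nested problem. The noncooperative reading discussed above is built into the nesting: each level optimizes its own objective, with no binding agreement between the two.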
In this testament to the distinguished career of H.S. Houthakker, a number of Professor Houthakker's friends, former colleagues and former students offer essays which build upon and extend his many contributions to economics in aggregation, consumption, growth and trade. Among the many distinguished contributors are Paul Samuelson, Werner Hildenbrand, John Muellbauer and Lester Telser. The book also includes four previously unpublished papers and notes by its distinguished dedicatee.
This conference brought together an international group of fisheries economists from academia, business, government, and inter-governmental agencies, to consider a coordinated project to build an econometric model of the world trade in groundfish. A number of the conference participants had just spent up to six weeks at Memorial University of Newfoundland working and preparing papers on the project. This volume presents the papers that these scholars produced, plus additional papers prepared by other conference participants. In addition, various lectures and discussions which were transcribed from tapes made of the proceedings are included. The introductory essay explains the genesis of the conference, describes the approach taken to modelling the groundfish trade, very briefly summarizes the technical papers, and describes future plans. The project is continuing as planned, and a second conference was held in St. John's in August 1990. The conference was a NATO Advanced Research Workshop and we wish to thank the Scientific Affairs Division of NATO for their financial support. Additional financial support was received from the Canadian Centre for Fisheries Innovation in St. John's, the Department of Fisheries and Oceans of the Government of Canada, the Department of Fisheries of the Government of Newfoundland and Labrador, Memorial University of Newfoundland and Air Nova; we acknowledge with appreciation their help.
Experiences with Financial Liberalization provides a broad spectrum of policy experiences relating to financial liberalization around the globe since the 1960s. There is a sizable body of theoretical and aggregative empirical literature in this area, but there is little work documenting and analyzing the experiences of individual countries and/or sets of countries. This book is divided into four parts by geographical region - Africa, Asia and Latin America, Central and Eastern Europe, and the Middle East. Aggregative econometric studies cannot substitute for country-wide studies in allowing the researcher to draw lessons for the future, and this volume adds to this relatively small body of literature.
Econometric theory, as presented in textbooks and the econometric literature generally, is a somewhat disparate collection of findings. It is in essence a set of demonstrated results that accumulates over time, each resting on a specific set of axioms or assumptions, yet at any given moment these results form an inevitably incomplete body of knowledge rather than a finished work. The practice of econometric theory consists of selecting from, applying, and evaluating this literature, so as to test its applicability and range. The creation, development, and use of computer software have led applied economic research into a new age. This book describes the history of econometric computation from 1950 to the present day, based upon an interactive survey involving the collaboration of the many econometricians who have designed and developed this software. It identifies each of the econometric software packages that are made available to and used by economists and econometricians worldwide.
This book was mainly written while I stayed at the Catholic University of Louvain. Professor Anton P. Barten not only gave me a warm welcome in Louvain, but also supported my research with most valuable comments and constructive criticisms. In addition I benefitted from discussions with Erik Schokkaert, Denis de Crombrugghe and Jo Baras on various subjects, such as the small-sample correction of Chapter 9. The arduous task of transferring my neat handwriting into a readable typescript was excellently taken care of by Mrs. E. Crabbe and notably Mrs. F. Duijsens, even after working hours. Mrs. A. Molders prevented me from making serious abuse of the English language. My admiration for Carien, finally, is an exponential function of the patience and enthusiasm with which she supported my research. Chapter 1 is a general introduction to the subject of linkage models, and it contains few mathematical elaborations. Chapters 2 to 4 use more, but elementary, mathematics, and treat several aspects related to the derivation, interpretation and estimation of linkage models. Chapter 2 deals with the theory of import allocation models, Chapter 3 treats the problem of defining and interpreting elasticities of substitution, while Chapter 4 is concerned with the econometric problems related to the estimation of multivariate models with linear restrictions, such as import allocation models.
This book covers recent advances in efficiency evaluations, most notably Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) methods. It introduces the underlying theories, shows how to make the relevant calculations and discusses applications. The aim is to make the reader aware of the pros and cons of the different methods and to show how to use these methods in both standard and non-standard cases. Several software packages have been developed to solve some of the most common DEA and SFA models. This book relies on R, a free, open source software environment for statistical computing and graphics. This enables the reader to solve not only standard problems, but also many other problem variants. Using R, one can focus on understanding the context and developing a good model. One is not restricted to predefined model variants and to a one-size-fits-all approach. To facilitate the use of R, the authors have developed an R package called Benchmarking, which implements the main methods within both DEA and SFA. The book uses mathematical formulations of models and assumptions, but it de-emphasizes the formal proofs, in part by placing them in appendices or by referring to the original sources. Moreover, the book emphasizes the usage of the theories and the interpretations of the mathematical formulations. It includes a series of small examples, graphical illustrations, simple extensions and questions to think about. It also combines the formal models with less formal economic and organizational thinking. Last but not least, it discusses some larger applications with significant practical impacts, including the design of benchmarking-based regulations of energy companies in different European countries, and the development of merger control programs for competition authorities.
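To give a sense of the kind of calculation DEA software performs, here is a minimal sketch of the input-oriented, constant-returns-to-scale efficiency score (the classic CCR model), solved as one linear program per unit. The book itself works in R with the authors' Benchmarking package; the Python code, function name and data below are illustrative assumptions, not material from the book.

```python
# Minimal sketch: input-oriented CCR (constant returns) DEA efficiency,
# one linear program per decision-making unit. Illustrative only.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of unit o, given inputs X (n units x m) and outputs Y (n x s)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)       # decision vector: [theta, lambda_1, ..., lambda_n]
    c[0] = 1.0                # minimise the input contraction factor theta
    # Composite inputs must fit within theta times unit o's inputs:
    #   sum_j lambda_j * x_ji - theta * x_oi <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Composite outputs must cover unit o's outputs:
    #   -sum_j lambda_j * y_jr <= -y_or
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0]])  # 3 units, 2 inputs
Y = np.array([[1.0], [1.0], [1.0]])                 # 1 output each
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```

A score of 1 marks a unit on the efficient frontier; a score below 1 says a scaled combination of the other units could produce the same outputs with proportionally less of every input.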
Many problems in statistics and econometrics offer themselves naturally to treatment as optimization problems. The book opens with an overview of optimization in statistics and econometrics, followed by a detailed discussion of a relatively new and very powerful optimization heuristic, threshold accepting. The final part consists of many applications of the methods described earlier, encompassing experimental design, model selection, aggregation of time series, and censored quantile regression models. Those researching and working in econometrics, statistics and operations research are given the tools to apply optimization heuristic methods in their work. Postgraduate students of statistics and econometrics will find that the book provides a good introduction to optimization heuristic methods.
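Threshold accepting itself is simple enough to sketch in a few lines. Unlike simulated annealing's probabilistic acceptance rule, a candidate move is accepted deterministically whenever it worsens the objective by no more than the current threshold, and the thresholds fall to zero over the run. The following minimal implementation is illustrative only; the function names, test objective and threshold sequence are assumptions for the example, not taken from the book.

```python
# Minimal sketch of threshold accepting for minimisation.
import math
import random

def threshold_accepting(f, x0, neighbour, thresholds, steps=200):
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for tau in thresholds:            # threshold sequence, decreasing to 0
        for _ in range(steps):
            cand = neighbour(x)
            fcand = f(cand)
            if fcand - fx <= tau:     # accept unless worse than the threshold
                x, fx = cand, fcand
                if fx < fbest:        # keep track of the best solution seen
                    best, fbest = x, fx
    return best, fbest

# Toy usage: a multimodal one-dimensional objective.
f = lambda x: (x - 2.0) ** 2 + 3.0 * math.sin(5.0 * x)
step = lambda x: x + random.uniform(-0.5, 0.5)
best, fbest = threshold_accepting(f, 0.0, step, [1.0, 0.5, 0.2, 0.1, 0.0])
print(round(best, 3), round(fbest, 3))
```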
Continuous-time econometrics is no longer an esoteric subject although most still regard it as such, so much so that it is hardly mentioned in standard textbooks on econometrics. Thanks to the work done in the last 20 years, both the theoretical and the applied side are by now well developed. Methods of estimation have been theoretically elaborated and practically implemented through computer programs. Continuous-time macroeconometric models for different countries have been constructed, estimated and used. Being myself involved in these developments, it was with great pleasure that I accepted the invitation to organize a session on continuous-time econometrics in the context of the International Symposium on Economic Modelling (jointly organized by the University of Urbino and the book series International Studies in Economic Modelling, and co-sponsored by the Consiglio Nazionale delle Ricerche). The reaction of 'continuists' from all over the world was so enthusiastic that I was able to arrange two sessions, one on the theory and the other on the applications. The symposium was held in Urbino on 23-25 July 1990. The papers presented in Urbino have been revised in the light of the discussion at the symposium and the referees' comments. Hence, what is published here should become another standard reference in the field of continuous-time econometrics.
Understanding the structure of a large econometric model is rather like the art of wine-tasting or like the art of playing a musical instrument. The quality of a wine results from a complex combination of various elements such as its colour, which should be clear and crystalline, its smell, which can be decomposed into a general aroma and a variety of particular characteristics, more or less persistent depending on the type and the age of the wine, and its taste, of course, which again is a complex system whose equilibrium and charm depend on the whole set of ingredients: alcohol, tannin, glycerine, sugar, acidity... Similarly, a clarinetist's musicianship depends on the quality of his instrument, on his embouchure, fingering, tonguing and articulation techniques, and on his sense for rhythm, phrasing and tone colour. However, the enchantment produced by a Romanee-Conti or by a brilliant performance of Brahms's F minor sonata for clarinet and piano arises from a process which is at the same time much simpler and much more complex than the straightforward juxtaposition of individual causal relations. In recent years econometricians and macro-economists have been challenged by the problem of keeping abreast of an ever increasing number of increasingly complex large econometric models. The necessity of developing systematic analytical tools to study the often implicit and hidden structure of these models has become more evident.
The purpose of this volume is to honour a pioneer in the field of econometrics, A. L. Nagar, on the occasion of his sixtieth birthday. Fourteen econometricians from six countries on four continents have contributed to this project. One of us was his teacher, some of us were his students, many of us were his colleagues, all of us are his friends. Our volume opens with a paper by L. R. Klein which discusses the meaning and role of exogenous variables in structural and vector-autoregressive econometric models. Several examples from recent macroeconomic history are presented and the notion of Granger-causality is discussed. This is followed by two papers dealing with an issue of considerable relevance to developing countries, such as India: the measurement of inequality in the distribution of income. The paper by C. T. West and H. Theil deals with the problem of measuring the inequality of all components of total income within a region, rather than just labour income. It applies its results to the regions of the United States. The second paper in this group, by N. Kakwani, derives the large-sample distributions of several popular inequality measures, thus providing a method for drawing large-sample inferences about the differences in inequality between regions. The techniques are applied to the regions of Cote d'Ivoire. The next group of papers is devoted to econometric theory in the context of the dynamic, simultaneous, linear equations model. The first, by P. J.
This book provides a game theoretic model of interaction among VoIP telecommunications providers regarding their willingness to enter peering agreements with one another. The author shows that the incentive to peer is generally based on savings from otherwise payable long distance fees. At the same time, termination fees can have a countering and dominant effect, resulting in an environment in which VoIP firms decide against peering. Various scenarios of peering and rules for allocation of the savings are considered. The first part covers the relevant aspects of game theory and network theory, trying to give an overview of the concepts required in the subsequent application. The second part of the book introduces first a model of how the savings from peering can be calculated and then turns to the actual formation of peering relationships between VoIP firms. The conditions under which firms are willing to peer are then described, considering the possible influence of a regulatory body.
Spatial Microeconometrics introduces the reader to the basic concepts of spatial statistics, spatial econometrics and the spatial behavior of economic agents at the microeconomic level. Incorporating useful examples and presenting real data and datasets on real firms, the book takes the reader through the key topics in a systematic way. The book outlines the specificities of data that represent a set of interacting individuals with respect to traditional econometrics that treat their locational choices as exogenous and their economic behavior as independent. In particular, the authors address the consequences of neglecting such important sources of information on statistical inference and how to improve the model predictive performances. The book presents the theory, clarifies the concepts and instructs the readers on how to perform their own analyses, describing in detail the codes which are necessary when using the statistical language R. The book is written by leading figures in the field and is completely up to date with the very latest research. It will be invaluable for graduate students and researchers in economic geography, regional science, spatial econometrics, spatial statistics and urban economics.
A function is convex if its epigraph is convex. This geometrical structure has very strong implications in terms of continuity and differentiability. Separation theorems lead to optimality conditions and duality for convex problems. A function is quasiconvex if its lower level sets are convex. Here again, the geometrical structure of the level sets implies some continuity and differentiability properties for quasiconvex functions. Optimality conditions and duality can be derived for optimization problems involving such functions as well. Over a period of about fifty years, quasiconvex and other generalized convex functions have been considered in a variety of fields including economics, management science, engineering, probability and applied sciences, in accordance with the needs of particular applications. During the last twenty-five years, an increase of research activities in this field has been witnessed. More recently, generalized monotonicity of maps has been studied. It relates to generalized convexity of functions as monotonicity relates to convexity. Generalized monotonicity plays a role in variational inequality problems, complementarity problems and, more generally, in equilibrium problems.
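In symbols, the definitions behind this description read (standard notation, added here for orientation):

\[
f \text{ convex} \iff \operatorname{epi} f = \{(x, t) : t \ge f(x)\} \text{ is convex},
\qquad
f \text{ quasiconvex} \iff \{x : f(x) \le \alpha\} \text{ is convex for every } \alpha,
\]
\[
F \text{ monotone} \iff (F(x) - F(y))^{\top}(x - y) \ge 0 \text{ for all } x, y.
\]

Generalized monotonicity weakens the last condition in the same way quasiconvexity weakens convexity; for instance, F is pseudomonotone when \(F(y)^{\top}(x - y) \ge 0\) implies \(F(x)^{\top}(x - y) \ge 0\).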
Unlike uncertain dynamical systems in the physical sciences, where models for prediction are largely given to us by physical laws, uncertain dynamical systems in economics need statistical models. In this context, modeling and optimization surface as basic ingredients for fruitful applications. This volume concentrates on the current methodology of copulas and maximum entropy optimization. It contains the main research presentations at the Sixth International Conference of the Thailand Econometrics Society, held at the Faculty of Economics, Chiang Mai University, Thailand, during January 10-11, 2013, and consists of keynote addresses and theoretical and applied contributions. These contributions to econometrics are centered on the theme of Copulas and Maximum Entropy Econometrics. The method of copulas is applied to a variety of economic problems where multivariate model building and correlation analysis are needed. As for the art of choosing copulas in practical problems, the principle of maximum entropy surfaces as a potential way to do so. The state of the art of Maximum Entropy Econometrics is presented in the first keynote address, while the second keynote address focuses on testing stationarity in economic time series data.
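To make the copula idea concrete, here is a minimal sketch (an illustrative construction, not an example from the proceedings; the correlation value and marginals are assumptions) of building a bivariate distribution from a Gaussian copula, so that the dependence structure is specified separately from the marginals:

```python
# Minimal sketch: sample from a Gaussian copula, then impose marginals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
R = np.array([[1.0, 0.6],
              [0.6, 1.0]])               # copula correlation matrix
z = rng.multivariate_normal([0.0, 0.0], R, size=10_000)
u = stats.norm.cdf(z)                    # uniform margins, coupled through R
x = stats.expon(scale=2.0).ppf(u[:, 0])  # exponential marginal
y = stats.t(df=4).ppf(u[:, 1])           # heavy-tailed Student-t marginal
print(np.corrcoef(x, y)[0, 1])           # dependence survives the transforms
```

The same recipe extends to other copula families, which is what makes the choice of copula (and hence a guiding principle such as maximum entropy) a substantive modeling decision.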
This book grew out of a 'Doctorat D'Etat' thesis presented at the University of Dijon - Institut Mathematique Economiques (IME). It aims to show that quantity rationing theory provides the means of improving macroeconometric modelling in the study of structural changes. The empirical results presented in the last chapter (concerning the Portuguese economy) and in the last Appendix (concerning the French economy), although preliminary, suggest that the effort is rewarding and should be continued. My debts are many. An important part of the research work was accomplished during my visit to the Institut National de la Statistique et des Etudes Economiques (INSEE, Paris), where I benefited from stimulating discussions (particularly with P. Villa) and computing support. I have also received comments and suggestions from R. Quandt, J.-J. Laffont, P. Kooiman and P.-Y. Henin. I am especially indebted to P. Balestra for encouraging and valuable discussions, particularly in the field of econometric methods. My thanks go also to an anonymous referee. His constructive criticism and suggestions resulted in a number of improvements to an earlier version of this book. I cannot forget my friend A. Costa from BPA (Porto) who has helped me in the preparation of this work. Last but not least, I would like to thank my wife for her encouragement and patience throughout these years. Of course, I am the only one responsible for any remaining errors.
The optimisation of economic systems over time, and in an uncertain environment, is central to the study of economic behaviour. The behaviour of rational decision makers, whether they are market agents, firms, or governments and their agencies, is governed by decisions designed to secure the best outcomes subject to the perceived information and economic responses (including those of other agents). Economic behaviour has therefore to be analysed in terms of the outcomes of a multiperiod stochastic optimisation process containing four main components: the economic responses (the dynamic constraints, represented by an economic model); the objective function (the goals and their priorities); the conditioning information (expected exogenous events and the expected future state of the economy); and risk management (how uncertainties are accommodated). The papers presented in this book all analyse some aspect of economic behaviour related to the objectives, information, or risk components of the decision process. While the construction of economic models obviously also has a vital role to play, that component has received much greater (or almost exclusive) attention elsewhere. These papers examine optimising behaviour in a wide range of economic problems, both theoretical and applied. They reflect a variety of concerns: economic responses under rational expectations; the Lucas critique and optimal fiscal or monetary policies; market management; partly endogenous goals; evaluating government reactions; locational decisions; uncertainty and information structures; and forecasting with endogenous reactions.
National income estimates date back to the late 17th century, but only in the half-century since the Second World War have economic accounts developed in their present form, becoming an indispensable tool for macroeconomic analysis, projections and policy formulation. Furthermore, it was in this period that the United Nations issued several versions of a system of national accounts (SNA) to make possible economic comparisons on a consistent basis. The latest version, SNA 1993, published in early 1994, occasioned this collection of essays and commentaries. The three chief objectives of the volume are: to enhance understanding of socioeconomic accounts generally and of SNA 1993 in particular; to offer a critique of SNA 1993, including constructive suggestions for future revisions of the system, making it even more useful for its national and international purposes; and to serve as a textbook, or book of readings in conjunction with SNA 1993, for courses in economic accounts.
In the autumn of 1961 Jan Salomon ('Mars') Cramer was appointed to the newly established chair of econometrics at the University of Amsterdam. This volume is published to commemorate this event. It is well known how much econometrics has developed over the period under consideration, the 25 years that elapsed between 1961 and 1986. This is specifically true for the areas in which Cramer has been actively interested. We mention the theory and measurement of consumer behaviour; money and income; regression, correlation and forecasting. In the present volume this development will be highlighted. Sixteen contributions have been solicited from scholars all over the world who have belonged to the circle of academic friends of Cramer for a shorter or longer part of the period of 25 years. The contributions fall, broadly speaking, into the four areas mentioned above. Theory and measurement of consumer behaviour is represented by four papers, whereas a fifth paper deals with a related area. Richard Blundell and Costas Meghir devote a paper to the estimation of Engel curves. They apply a discrete choice model to British (individual) data from the Family Expenditure Survey 1981. Their aim is to assess the impact of individual characteristics such as income, demographic structure, location, wages and prices on commodity expenditure.
You may like...
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover) R2,970 (Discovery Miles 29 700)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover) R3,286 (Discovery Miles 32 860)
Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover) R2,160 (Discovery Miles 21 600)
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover) R4,258 (Discovery Miles 42 580)
Spatial Analysis Using Big Data… by Yoshiki Yamagata, Hajime Seya (Paperback) R3,021 (Discovery Miles 30 210)