Accessible to a general audience with some background in statistics and computing. Many examples and extended case studies. Illustrations using R and RStudio. A true blend of statistics and computer science, not just a grab bag of topics from each.
The finite-dimensional nonlinear complementarity problem (NCP) is a system of finitely many nonlinear inequalities in finitely many nonnegative variables, along with a special equation that expresses the complementary relationship between the variables and the corresponding inequalities. This complementarity condition is the key feature distinguishing the NCP from a general inequality system; it lies at the heart of all constrained optimization problems in finite dimensions, provides a powerful framework for the modeling of equilibria of many kinds, and exhibits a natural link between smooth and nonsmooth mathematics. The finite-dimensional variational inequality (VI), which is a generalization of the NCP, provides a broad unifying setting for the study of optimization and equilibrium problems and serves as the main computational framework for the practical solution of a host of continuum problems in the mathematical sciences. The systematic study of the finite-dimensional NCP and VI began in the mid-1960s; in a span of four decades, the subject has developed into a very fruitful discipline in the field of mathematical programming. The developments include a rich mathematical theory, a host of effective solution algorithms, a multitude of interesting connections to numerous disciplines, and a wide range of important applications in engineering and economics. As a result of their broad associations, the literature of the VI/CP has benefited from contributions made by mathematicians (pure, applied, and computational), computer scientists, engineers of many kinds (civil, chemical, electrical, mechanical, and systems), and economists of diverse expertise (agricultural, computational, energy, financial, and spatial).
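For readers unfamiliar with the terminology, the complementarity condition this blurb describes has a compact standard statement; the notation below is the conventional one, not drawn from the book itself:

```latex
% Standard formulations (conventional notation, not from the text):
\mathrm{NCP}(F):\quad \text{find } x \in \mathbb{R}^n \text{ with } x \ge 0,\; F(x) \ge 0,\; x^{\mathsf{T}} F(x) = 0.
% Since both vectors are nonnegative, x^T F(x) = 0 forces x_i F_i(x) = 0
% for every i: each variable or its corresponding inequality must vanish.
% The VI generalization over a set K \subseteq \mathbb{R}^n:
\mathrm{VI}(K, F):\quad \text{find } x \in K \text{ with } F(x)^{\mathsf{T}}(y - x) \ge 0 \ \text{for all } y \in K.
% NCP(F) is recovered as VI(K, F) with K = \mathbb{R}^n_+.
```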
The basic characteristic of Modern Linear and Nonlinear Econometrics is that it presents a unified approach to modern linear and nonlinear econometrics in a concise and intuitive way. It covers four major parts of modern econometrics: linear and nonlinear estimation and testing, time series analysis, models with categorical and limited dependent variables, and, finally, a thorough analysis of linear and nonlinear panel data modeling. Distinctive features of this handbook are: a unified approach to both linear and nonlinear econometrics, with an integration of theory and practice in modern econometrics; an emphasis on sound theoretical and empirical relevance and intuition; a focus on econometric and statistical methods for the analysis of linear and nonlinear processes in economics and finance, including computational methods and numerical tools; completely worked-out empirical illustrations throughout, whose macroeconomic and microeconomic (household- and firm-level) data sets are available from the internet; these illustrations are taken from finance (e.g. the CAPM and derivatives), international economics (e.g. exchange rates), innovation economics (e.g. patenting), business cycle analysis, monetary economics, housing economics, labor and educational economics (e.g. demand for teachers by gender) and many others; and exercises added to each chapter, with a focus on the interpretation of results, several of which involve the use of actual data typical of current empirical work and made available on the internet. Also distinctive of Modern Linear and Nonlinear Econometrics is that every major topic has a number of examples, exercises or case studies. Through this 'learning by doing' method, the intention is to prepare readers to design, develop and successfully complete their own research and/or solve real-world problems.
Sample data alone never suffice to draw conclusions about populations. Inference always requires assumptions about the population and sampling process. Statistical theory has revealed much about how the strength of assumptions affects the precision of point estimates, but has had much less to say about how it affects the identification of population parameters. Indeed, it has been commonplace to think of identification as a binary event - a parameter is either identified or not - and to view point identification as a precondition for inference. Yet there is enormous scope for fruitful inference using data and assumptions that partially identify population parameters. This book explains why and shows how. The book presents in a rigorous and thorough manner the main elements of Charles Manski's research on partial identification of probability distributions. One focus is prediction with missing outcome or covariate data. Another is decomposition of finite mixtures, with application to the analysis of contaminated sampling and ecological inference. A third major focus is the analysis of treatment response. Whatever the particular subject under study, the presentation follows a common path. The author first specifies the sampling process generating the available data and asks what may be learned about population parameters using the empirical evidence alone. He then asks how the (typically) set-valued identification regions for these parameters shrink if various assumptions are imposed. The approach to inference that runs throughout the book is deliberately conservative and thoroughly nonparametric. Conservative nonparametric analysis enables researchers to learn from the available data without imposing untenable assumptions. It enables the establishment of a domain of consensus among researchers who may hold disparate beliefs about what assumptions are appropriate. Charles F. Manski is Board of Trustees Professor at Northwestern University.
He is author of Identification Problems in the Social Sciences and Analog Estimation Methods in Econometrics. He is a Fellow of the American Academy of Arts and Sciences, the American Association for the Advancement of Science, and the Econometric Society.
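The "empirical evidence alone" step of this program can be illustrated with its simplest instance: worst-case bounds on a population mean when some outcomes are missing. The toy numbers below are hypothetical, and the bound formula is the standard worst-case one for an outcome known to lie in [0, 1]:

```python
# Worst-case (no-assumption) bounds on E[y] with missing outcomes.
# Hypothetical toy data; y is known a priori to lie in [0, 1].
observed = [0.2, 0.9, 0.5, 0.7]   # outcomes we see
n_missing = 2                      # outcomes we do not see
n = len(observed) + n_missing
p_obs = len(observed) / n          # fraction of the sample observed
mean_obs = sum(observed) / len(observed)

# The missing outcomes could be anywhere in [0, 1], so E[y] is only
# partially identified: plug in the extremes for the missing part.
lower = mean_obs * p_obs + 0.0 * (1 - p_obs)
upper = mean_obs * p_obs + 1.0 * (1 - p_obs)
print(lower, upper)
```

The width of the interval, (1 - p_obs), depends only on how much data is missing, which is why stronger assumptions are needed to shrink it further.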
This conference brought together an international group of fisheries economists from academia, business, government, and inter-governmental agencies, to consider a coordinated project to build an econometric model of the world trade in groundfish. A number of the conference participants had just spent up to six weeks at Memorial University of Newfoundland working and preparing papers on the project. This volume presents the papers that these scholars produced, plus additional papers prepared by other conference participants. In addition, various lectures and discussions which were transcribed from tapes made of the proceedings are included. The introductory essay explains the genesis of the conference, describes the approach taken to modelling the groundfish trade, very briefly summarizes the technical papers, and describes future plans. The project is continuing as planned, and a second conference was held in St. John's in August 1990. The conference was a NATO Advanced Research Workshop and we wish to thank the Scientific Affairs Division of NATO for their financial support. Additional financial support was received from the Canadian Centre for Fisheries Innovation in St. John's, the Department of Fisheries and Oceans of the Government of Canada, the Department of Fisheries of the Government of Newfoundland and Labrador, Memorial University of Newfoundland and Air Nova; we acknowledge with appreciation their help.
Experiences with Financial Liberalization provides a broad spectrum of policy experiences relating to financial liberalization around the globe since the 1960s. There is a sizable body of theoretical and aggregative empirical literature in this area, but there is little work documenting and analyzing the experiences of individual countries and/or sets of countries. This book is divided into four parts by geographical region - Africa, Asia and Latin America, Central and Eastern Europe, and the Middle East. Aggregative econometric studies cannot substitute for country-wide studies in allowing the researcher to draw lessons for the future, and this volume adds to this relatively small body of literature.
Louis Phlips The stabilisation of primary commodity prices, and the related issue of the stabilisation of export earnings of developing countries, have traditionally been studied without reference to the futures markets (that exist or could exist) for these commodities. These futures markets have in turn been studied in isolation. The same is true for the new developments on financial markets. Over the last few years, in particular since the 1985 tin crisis and the October 1987 stock exchange crisis, it has become evident that there are interactions between commodity, futures, and financial markets and that these interactions are very important. All the more so as trade on futures and financial markets has shown a spectacular increase. This volume brings together a number of recent and unpublished papers on these interactions by leading specialists (and their students). A first set of papers examines how the use of futures markets could help stabilise export earnings of developing countries and how this compares to the rather unsuccessful UNCTAD-type interventions via buffer stocks, pegged prices and cartels. A second set of papers faces the fact, largely ignored in the literature, that commodity prices are determined in foreign currencies, with the result that developing countries suffer from the volatility of exchange rates of these currencies (even in cases where commodity prices are relatively stable). Financial markets are thus explicitly linked to futures and commodity markets.
Many problems in statistics and econometrics lend themselves naturally to solution by optimization. The book begins with a discussion of optimization in statistics and econometrics, followed by a detailed discussion of a relatively new and very powerful optimization heuristic, threshold accepting. The final part consists of many applications of the methods described earlier, encompassing experimental design, model selection, aggregation of time series, and censored quantile regression models. Those researching and working in econometrics, statistics and operations research are given the tools to apply optimization heuristic methods in their work. Postgraduate students of statistics and econometrics will find the book provides a good introduction to optimization heuristic methods.
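Threshold accepting, the heuristic this blurb highlights, is simple enough to sketch in a few lines: it is local search that also accepts moves which worsen the objective by no more than a shrinking threshold. This is a minimal generic version, not the book's own code; the function and parameter names are my own:

```python
import random

def threshold_accepting(objective, neighbor, x0, thresholds, steps_per_round=100, seed=0):
    """Minimize `objective` by local search, accepting any move whose
    deterioration does not exceed the current threshold tau."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    for tau in thresholds:            # thresholds decrease toward 0
        for _ in range(steps_per_round):
            y = neighbor(x, rng)
            fy = objective(y)
            if fy - fx <= tau:        # accept improvements and small deteriorations
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
    return best, fbest

# Toy usage: minimize (x - 3)^2 starting far from the optimum.
f = lambda x: (x - 3.0) ** 2
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
xbest, fbest = threshold_accepting(f, step, x0=10.0, thresholds=[1.0, 0.1, 0.01])
```

Unlike simulated annealing, acceptance here is deterministic given the threshold, which makes the method easy to tune and analyse.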
This book was mainly written while I stayed at the Catholic University of Louvain. Professor Anton P. Barten was the one who not only gave me a warm welcome in Louvain, but also supported my research with most valuable comments and constructive criticisms. In addition I benefitted from discussions with Erik Schokkaert, Denis de Crombrugghe and Jo Baras on various subjects, such as the small-sample correction of Chapter 9. The arduous task of transferring my neat handwriting into a readable typescript was excellently taken care of by Mrs. E. Crabbe and notably Mrs. F. Duijsens, even after working hours. Mrs. A. Molders prevented me from making serious abuse of the English language. My admiration for Carien, finally, is an exponential function of the patience and enthusiasm with which she supported my research. Chapter 1 is a general introduction to the subject of linkage models, and it contains few mathematical elaborations. Chapters 2 to 4 use more, but elementary, mathematics, and treat several aspects related to the derivation, interpretation and estimation of linkage models. Chapter 2 deals with the theory of import allocation models, Chapter 3 treats the problem of defining and interpreting elasticities of substitution, while Chapter 4 is concerned with the econometric problems related to the estimation of multivariate models with linear restrictions, such as import allocation models.
Continuous-time econometrics is no longer an esoteric subject although most still regard it as such, so much so that it is hardly mentioned in standard textbooks on econometrics. Thanks to the work done in the last 20 years, both the theoretical and the applied side are by now well developed. Methods of estimation have been theoretically elaborated and practically implemented through computer programs. Continuous-time macroeconometric models for different countries have been constructed, estimated and used. Being myself involved in these developments, it was with great pleasure that I accepted the invitation to organize a session on continuous-time econometrics in the context of the International Symposium on Economic Modelling (jointly organized by the University of Urbino and the book series International Studies in Economic Modelling, and co-sponsored by the Consiglio Nazionale delle Ricerche). The reaction of 'continuists' from all over the world was so enthusiastic that I was able to arrange two sessions, one on the theory and the other on the applications. The symposium was held in Urbino on 23-25 July 1990. The papers presented in Urbino have been revised in the light of the discussion at the symposium and the referees' comments. Hence, what is published here should become another standard reference in the field of continuous-time econometrics.
Understanding the structure of a large econometric model is rather like the art of winetasting or the art of playing a musical instrument. The quality of a wine results from a complex combination of various elements such as its colour, which should be clear and crystalline; its smell, which can be decomposed into a general aroma and a variety of particular characteristics, more or less persistent depending on the type and the age of the wine; and its taste, of course, which again is a complex system whose equilibrium and charm depend on the whole set of ingredients: alcohol, tannin, glycerine, sugar, acidity . . . Similarly, a clarinetist's musicianship depends on the quality of his instrument, on his embouchure, fingering, tonguing and articulation techniques, on his sense for rhythm, phrasing and tone colour. However, the enchantment produced by a Romanee-Conti or by a brilliant performance of Brahms's F minor sonata for clarinet and piano arises from a process which is at the same time much simpler and much more complex than the straightforward juxtaposition of individual causal relations. In recent years econometricians and macro-economists have been challenged by the problem of keeping abreast of an ever-increasing number of increasingly complex large econometric models. The necessity of developing systematic analytical tools to study the often implicit and hidden structure of these models has become more evident.
This book is the first volume of the International Series in Economic Modeling, a series designed to summarize current issues and procedures in applied modeling within various fields of economics and to offer new or alternative approaches to prevailing problems. In selecting the subject area for the first volume, we were attracted by the area to which applied modeling efforts are increasingly being drawn: regional economics and its associated subfields. Applied modeling is a broad rubric even when the focus is restricted to econometric modeling issues. Regional econometric modeling has posted a record of rapid growth during the last two decades and has become an established field of research and application. Econometric models of states and large urban areas have become commonplace, but the existence of such models does not signal an end to further development of regional econometric methods and models. Many issues such as structural specification, level of geographic detail, data constraints, forecasting integrity, and synthesis with other regional modeling techniques will continue to be sources of concern and will prompt further research efforts. The chapters of this volume reflect many of these issues. A brief synopsis of each contribution is provided below: Richard Weber offers an overview of regional econometric models by discussing theoretical specification, nature of variables, and ultimate usefulness of such models. For an illustration, Weber describes the specification of the econometric model of New Jersey.
This book deals with the methods and practical uses of regression and factor analysis. An exposition is given of ordinary, generalized, two- and three-stage estimates for regression analysis, the method of principal components being applied for factor analysis. When establishing an econometric model, the two ways of analysis complement each other. The model was realized as part of the 'Interplay' research project concerning the economies of the European Common Market countries at the Econometrics Department of the Tilburg School of Economics. The Interplay project aims at: a. elaborating more or less uniformly defined and estimated models; b. clarifying the economic structure and the economic policy possible with the linked models of the European Community countries. Besides the model for the Netherlands published here, the models for Belgium, Italy, West Germany and the United Kingdom are ready for linking and for publishing later on. The econometric model presented in this book and upon which the Interplay model is based comprises eleven structural and twenty-one definitional equations; it is estimated with ordinary, two- and three-stage least squares. The analysis of the model is directed at eliminating multicollinearity, according to D.E. Farrar's and R. Glauber's method. In practice, however, complete elimination of multicollinearity leads to an exclusion of certain relations which is not entirely satisfactory. Economic relations can be dealt with more fully by analyzing the variables involved in detail by factor analysis. In this study factor analysis is also a suitable method for a comparative analysis of different periods.
Econometric theory, as presented in textbooks and the econometric literature generally, is a somewhat disparate collection of findings. In essence it is a set of demonstrated results that accumulates over time, each logically based on a specific set of axioms or assumptions; yet at any moment these results form an incomplete body of knowledge rather than a finished work. The practice of econometric theory consists of selecting from, applying, and evaluating this literature, so as to test its applicability and range. The creation, development, and use of computer software has led applied economic research into a new age. This book describes the history of econometric computation from 1950 to the present day, based upon an interactive survey involving the collaboration of the many econometricians who have designed and developed this software. It identifies each of the econometric software packages that are made available to and used by economists and econometricians worldwide.
Spatial Microeconometrics introduces the reader to the basic concepts of spatial statistics, spatial econometrics and the spatial behavior of economic agents at the microeconomic level. Incorporating useful examples and presenting real data and datasets on real firms, the book takes the reader through the key topics in a systematic way. The book outlines the specificities of data that represent a set of interacting individuals with respect to traditional econometrics that treat their locational choices as exogenous and their economic behavior as independent. In particular, the authors address the consequences of neglecting such important sources of information on statistical inference and how to improve the model predictive performances. The book presents the theory, clarifies the concepts and instructs the readers on how to perform their own analyses, describing in detail the codes which are necessary when using the statistical language R. The book is written by leading figures in the field and is completely up to date with the very latest research. It will be invaluable for graduate students and researchers in economic geography, regional science, spatial econometrics, spatial statistics and urban economics.
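The book's own examples are written in R; as a language-neutral illustration of the kind of quantity spatial statistics starts from, here is a plain implementation of Moran's I, the classic measure of spatial autocorrelation. The toy data and adjacency weights below are invented for illustration:

```python
def morans_i(x, w):
    """Moran's I spatial autocorrelation for values x and an n-by-n
    spatial weight matrix w (w[i][j] > 0 when i and j are neighbours)."""
    n = len(x)
    xbar = sum(x) / n
    dev = [xi - xbar for xi in x]
    s0 = sum(sum(row) for row in w)                       # total weight
    num = sum(w[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))        # cross-products of deviations
    den = sum(d * d for d in dev)
    return (n / s0) * num / den

# Toy example: four locations on a line with rook (nearest-neighbour) adjacency.
x = [1.0, 2.0, 3.0, 4.0]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i(x, w))
```

Positive values indicate that nearby locations carry similar values, which is exactly the dependence that ignoring location would mistake for independent observations.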
This book has taken form over several years as a result of a number of courses taught at the University of Pennsylvania and at Columbia University and a series of lectures I have given at the International Monetary Fund. Indeed, I began writing down my notes systematically during the academic year 1972-1973 while at the University of California, Los Angeles. The diverse character of the audience, as well as my own conception of what an introductory and often terminal acquaintance with formal econometrics ought to encompass, have determined the style and content of this volume. The selection of topics and the level of discourse give sufficient variety so that the book can serve as the basis for several types of courses. As an example, a relatively elementary one-semester course can be based on Chapters one through five, omitting the appendices to these chapters and a few sections in some of the chapters so indicated. This would acquaint the student with the basic theory of the general linear model, some of the problems often encountered in empirical research, and some proposed solutions. For such a course, I should also recommend a brief excursion into Chapter seven (logit and probit analysis) in view of the increasing availability of data sets for which this type of analysis is more suitable than that based on the general linear model.
A function is convex if its epigraph is convex. This geometrical structure has very strong implications in terms of continuity and differentiability. Separation theorems lead to optimality conditions and duality for convex problems. A function is quasiconvex if its lower level sets are convex. Here again, the geometrical structure of the level sets implies some continuity and differentiability properties for quasiconvex functions. Optimality conditions and duality can be derived for optimization problems involving such functions as well. Over a period of about fifty years, quasiconvex and other generalized convex functions have been considered in a variety of fields including economics, management science, engineering, probability and applied sciences, in accordance with the needs of particular applications. During the last twenty-five years, an increase of research activities in this field has been witnessed. More recently, generalized monotonicity of maps has been studied. It relates to generalized convexity of functions as monotonicity relates to convexity. Generalized monotonicity plays a role in variational inequality problems, complementarity problems and, more generally, in equilibrium problems.
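The definitions in this blurb can be stated precisely; the notation below is conventional and not taken from the book:

```latex
% Epigraph and lower level sets of f : \mathbb{R}^n \to \mathbb{R}:
\operatorname{epi} f = \{(x, t) : f(x) \le t\}, \qquad
L_\alpha(f) = \{x : f(x) \le \alpha\}.
% f is convex      iff  \operatorname{epi} f is a convex set;
% f is quasiconvex iff  L_\alpha(f) is convex for every \alpha \in \mathbb{R}.
% A map F is monotone iff (F(x) - F(y))^{\mathsf{T}}(x - y) \ge 0 for all x, y;
% generalized monotonicity weakens this inequality in parallel with the way
% quasiconvexity weakens convexity.
```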
National income estimates date back to the late 17th century, but only in the half-century since the Second World War have economic accounts developed in their present form, becoming an indispensable tool for macroeconomic analysis, projections and policy formulation. Furthermore, it was in this period that the United Nations issued several versions of a system of national accounts (SNA) to make possible economic comparisons on a consistent basis. The latest version, SNA 1993, published in early 1994, occasioned this collection of essays and commentaries. The three chief objectives of the volume are: to enhance understanding of socioeconomic accounts generally and of SNA 1993 in particular; to offer a critique of SNA 1993, including constructive suggestions for future revisions of the system, making it even more useful for its national and international purposes; and to serve as a textbook, or book of readings in conjunction with SNA 1993, for courses in economic accounts.
Drawing on the author's extensive and varied research, this book provides readers with a firm grounding in the concepts and issues across several disciplines including economics, nutrition, psychology and public health in the hope of improving the design of food policies in the developed and developing world. Using longitudinal (panel) data from India, Bangladesh, Kenya, the Philippines, Vietnam, and Pakistan and extending the analytical framework used in economics and biomedical sciences to include multi-disciplinary analyses, Alok Bhargava shows how rigorous and thoughtful econometric and statistical analysis can improve our understanding of the relationships between a number of socioeconomic, nutritional, and behavioural variables on a number of issues like cognitive development in children and labour productivity in the developing world. These unique insights combined with a multi-disciplinary approach forge the way for a more refined and effective approach to food policy formation going forward. A chapter on the growing obesity epidemic is also included, highlighting the new set of problems facing not only developed but developing countries. The book also includes a glossary of technical terms to assist readers coming from a variety of disciplines.
The purpose of this volume is to honour a pioneer in the field of econometrics, A. L. Nagar, on the occasion of his sixtieth birthday. Fourteen econometricians from six countries on four continents have contributed to this project. One of us was his teacher, some of us were his students, many of us were his colleagues, all of us are his friends. Our volume opens with a paper by L. R. Klein which discusses the meaning and role of exogenous variables in structural and vector-autoregressive econometric models. Several examples from recent macroeconomic history are presented and the notion of Granger-causality is discussed. This is followed by two papers dealing with an issue of considerable relevance to developing countries, such as India: the measurement of the inequality in the distribution of income. The paper by C. T. West and H. Theil deals with the problem of measuring inequality of all components of total income within a region, rather than just labour income. It applies its results to the regions of the United States. The second paper in this group, by N. Kakwani, derives the large-sample distributions of several popular inequality measures, thus providing a method for drawing large-sample inferences about the differences in inequality between regions. The techniques are applied to the regions of Cote d'Ivoire. The next group of papers is devoted to econometric theory in the context of the dynamic, simultaneous, linear equations model. The first, by P. J.
This book describes the classical axiomatic theories of decision under uncertainty, as well as critiques thereof and alternative theories. It focuses on the meaning of probability, discussing some definitions and surveying their scope of applicability. The behavioral definition of subjective probability serves as a way to present the classical theories, culminating in Savage's theorem. The limitations of this result as a definition of probability lead to two directions - first, similar behavioral definitions of more general theories, such as non-additive probabilities and multiple priors, and second, cognitive derivations based on case-based techniques.
In plying their trade, social scientists often are confronted with significant phenomena that appear incapable of measurement. Past practice would suggest that the way to deal with these cases is to work harder at finding appropriate measures so that standard quantitative analysis can still be applied. Professor Katzner's approach, however, is quite different. Rather than concentrating on the construction of measures, he raises the question of how such phenomena can be investigated and understood in the absence of numerical gauges to represent them.
At the time of this volume's publication in 1985, general equilibrium modelling had become a significant area of applied economic research. Its focus was to develop techniques to facilitate economy-wide quantitative assessment of allocative and distributional impacts of policy changes. UK Tax Policy and Applied General Equilibrium Analysis was the first book-length treatment of the development and application of an applied general equilibrium model of the Walrasian type, constructed to analyse UK taxation and subsidy policy. As a whole, UK Tax Policy and Applied General Equilibrium Analysis offers the reader two things. First, it gives a detailed account of the development of an applied general equilibrium model of the UK. Second, it provides results of model experiments which have been designed to inform the policy debate, not only in the UK but also in other countries. It should thus be of interest to both researchers and students undertaking research in the applied general equilibrium area and to policy makers concerned with tax reform.