O. Guvenen, University of Paris IX-Dauphine. The aim of this publication is to present recent developments in international commodity market model building and policy analysis. This book is based mainly on the research presented at the XIIth International Conference organised by the Applied Econometric Association (AEA), which was held at the University of Zaragoza in Spain. This conference would not have been possible without the cooperation of the Department of Econometrics of the University of Zaragoza and its Chairman A.A. Grasa. I would like to express my thanks to all contributors. I am grateful to J.H.P. Paelinck, J.P. Ancot, A.J. Hughes Hallett and H. Serbat for their constructive contributions and comments concerning the structure of the book. INTRODUCTION. O. Guvenen. The challenge of increasing complexity and global interdependence at the world level necessitates new modelling approaches and policy analysis at the macroeconomic level, and for commodities. The evaluation of economic modelling follows the evolution of international economic phenomena. In that interdependent context there is a growing need for forecasting and simulation tools in the analysis of international primary commodity markets.
Econometric models are made up of assumptions which never exactly match reality. Among the most contested is the requirement that the coefficients of an econometric model remain stable over time. Recent years have therefore seen numerous attempts to test for stability or to model possible structural change when it can no longer be ignored. This collection of papers from Empirical Economics mirrors part of this development. The point of departure of most studies in this volume is the standard linear regression model y_t = x_t'β_t + u_t (t = 1, ..., T), where the notation is obvious and where the index t emphasises the fact that structural change is mostly discussed and encountered in a time series context. It is much less of a problem for cross-section data, although many tests apply there as well. The null hypothesis of most tests for structural change is that β_t = β_0 for all t, i.e. that the same regression applies to all time periods in the sample and that the disturbances u_t are well behaved. The well-known Chow test, for instance, assumes that there is a single structural shift at a known point in time, i.e. that β_t = β_0 (t < t*) and β_t = β_0 + Δβ (t ≥ t*), where t* is known.
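The Chow test described in this blurb can be sketched in a few lines: fit the regression on the full sample and on the two subsamples split at the known break date t*, and compare residual sums of squares with an F-statistic. The helper name `chow_test` and the simulated data are illustrative, not from the book.

```python
import numpy as np

def chow_test(y, X, t_star):
    """F-statistic for a single structural break at a known time t_star.

    Compares the pooled OLS residual sum of squares with the sum from
    separate fits before and after the break (uniform-variance case).
    """
    def rss(y_part, X_part):
        beta, *_ = np.linalg.lstsq(X_part, y_part, rcond=None)
        return float(((y_part - X_part @ beta) ** 2).sum())

    n, k = X.shape
    rss_pooled = rss(y, X)
    rss_split = rss(y[:t_star], X[:t_star]) + rss(y[t_star:], X[t_star:])
    # Under the null of stable coefficients this is F(k, n - 2k)
    return ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))

# Simulated series whose slope jumps at t* = 50
rng = np.random.default_rng(0)
x = rng.normal(size=100)
X = np.column_stack([np.ones(100), x])
slope = np.where(np.arange(100) < 50, 1.0, 3.0)  # beta_t shifts by 2
y = 0.5 + slope * x + rng.normal(scale=0.1, size=100)
print(chow_test(y, X, 50))  # a large F rejects coefficient stability
```

With such a pronounced slope shift the statistic is far above any conventional critical value; with stable coefficients it would hover near 1.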
These proceedings, from a conference held at the Federal Reserve Bank of St. Louis on October 17-18, 1991, attempted to lay out what we currently know about aggregate economic fluctuations. Identifying what we know inevitably reveals what we do not know about such fluctuations as well. From the vantage point of where the conference's participants view our current understanding to be, these proceedings can be seen as suggesting an agenda for further research. The conference was divided into five sections. It began with the formulation of an empirical definition of the "business cycle" and a recitation of the stylized facts that must be explained by any theory that purports to capture the business cycle's essence. After outlining the historical development and key features of the current "theories" of business cycles, the conference evaluated these theories on the basis of their ability to explain the facts. Included in this evaluation was a discussion of whether (and how) the competing theories could be distinguished empirically. The conference then examined the implications for policy of what is known and not known about business cycles. A panel discussion closed the conference, highlighting important unresolved theoretical and empirical issues that should be taken up in future business cycle research. What Is a Business Cycle? Before gaining a genuine understanding of business cycles, economists must agree and be clear about what they mean when they refer to the cycle.
In this Element and its accompanying second Element, A Practical Introduction to Regression Discontinuity Designs: Extensions, Matias Cattaneo, Nicolas Idrobo, and Rocío Titiunik provide an accessible and practical guide for the analysis and interpretation of regression discontinuity (RD) designs that encourages the use of a common set of practices and facilitates the accumulation of RD-based empirical evidence. In this Element, the authors discuss the foundations of the canonical Sharp RD design, which has the following features: (i) the score is continuously distributed and has only one dimension, (ii) there is only one cutoff, and (iii) compliance with the treatment assignment is perfect. In the second Element, the authors discuss practical and conceptual extensions to this basic RD setup.
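The Sharp RD setup the blurb describes (one-dimensional score, single cutoff, perfect compliance) can be illustrated with a minimal local-linear estimator: fit a line on each side of the cutoff within a bandwidth and take the jump in fitted values at the cutoff. This is a sketch with a uniform kernel and an arbitrary bandwidth, not the authors' recommended implementation; the function name and simulated data are assumptions for illustration.

```python
import numpy as np

def sharp_rd_estimate(score, outcome, cutoff=0.0, bandwidth=0.5):
    """Local-linear sharp RD estimate of the jump at the cutoff."""
    def fit_at_cutoff(mask):
        # Intercept of a line in (score - cutoff) = fitted value at the cutoff
        Z = np.column_stack([np.ones(mask.sum()), score[mask] - cutoff])
        beta, *_ = np.linalg.lstsq(Z, outcome[mask], rcond=None)
        return beta[0]

    left = (score < cutoff) & (score >= cutoff - bandwidth)
    right = (score >= cutoff) & (score <= cutoff + bandwidth)
    return fit_at_cutoff(right) - fit_at_cutoff(left)

rng = np.random.default_rng(1)
score = rng.uniform(-1, 1, 2000)
treated = score >= 0                     # perfect compliance at the cutoff
outcome = 1.0 + 0.8 * score + 2.0 * treated + rng.normal(scale=0.3, size=2000)
print(sharp_rd_estimate(score, outcome))  # close to the true jump of 2.0
```

In practice bandwidth choice and kernel weighting matter a great deal, which is exactly the kind of guidance the Element provides.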
Macroeconomic Policy in the Canadian Economy investigates developments in Canada over the last forty years, using recent advances in the field of applied econometrics. In particular, the book analyzes the theoretical foundations of public sector activities and evaluates the several theories of government growth. Issues of convergence are also investigated as they manifest themselves in per capita income across Canadian provinces, and as to how successful government income equalization policies have been in furthering such convergence. Moreover, the openness of the Canadian economy is investigated in terms of the importance of exports on GDP growth and of its participation in the world of an internationally integrated capital market. The book also analyzes monetary policy issues and investigates the role of monetary aggregates and the effectiveness of monetary policy. Finally, it addresses the issue of the existence or not of electoral and partisan cycles in Canada, by incorporating both fiscal and monetary principles and applying them to the lively world of Canadian politics.
Parallel Algorithms for Linear Models provides a complete and detailed account of the design, analysis and implementation of parallel algorithms for solving large-scale linear models. It investigates and presents efficient, numerically stable algorithms for computing the least-squares estimators and other quantities of interest on massively parallel systems. The monograph is in two parts. The first part consists of four chapters and deals with the computational aspects for solving linear models that have applicability in diverse areas. The remaining two chapters form the second part, which concentrates on numerical and computational methods for solving various problems associated with seemingly unrelated regression equations (SURE) and simultaneous equations models. The practical issues of the parallel algorithms and the theoretical aspects of the numerical methods will be of interest to a broad range of researchers working in the areas of numerical and computational methods in statistics and econometrics, parallel numerical algorithms, parallel computing and numerical linear algebra. The aim of this monograph is to promote research in the interface of econometrics, computational statistics, numerical linear algebra and parallelism.
Investment is crucial to the development of a nation's economy and welfare. In contrast to the situation in the United States, investment activity in Europe has been quite modest over the past few years. This volume gathers together a number of papers by prominent researchers in the field of investment. It provides an overview of recent developments in this area and presents new empirical findings on the determinants and implications of the investment process in European countries. Among the topics examined are the role played by taxation, uncertainty and the financial systems, as well as the relevance of corporate governance to the investment process. Two chapters are dedicated to infrastructure investment and foreign direct investment.
All former Soviet Union countries experience their past as a heavy burden. It led to the centralisation of scientific personnel, the separation of research from teaching at universities, and a concentration of certain branches of technology in different parts of the Union. This has given rise to a one-sided technology and science potential which frequently cannot be sufficiently supported due to a lack of adequate finance. Cooperation between the Baltic States themselves is often hampered by an exaggerated sense of national identity, and international cooperation can be made difficult by linguistic problems. A critical issue is finance. The Baltic States themselves are experiencing budgetary constraints, and the West is cutting back on funding. The analytical issues dealt with here include specific questions, such as in the sectors of energy policy, electrical equipment and electronics, and environmental considerations. The transfer of technology is also discussed, as is security: there is the possibility that science and scientific results can be obtained from the former Soviet Union at low cost by the criminal community.
Empirical Studies In Applied Economics presents nine previously unpublished analyses in monograph form. In this work, the topics are presented so that each chapter stands on its own. The emphasis is on the applications but attention is also given to the econometric and statistical issues for advanced readers. Econometric methods include multivariate regression analysis, limited dependent variable analysis, and other maximum likelihood techniques. The empirical topics include the measurement of competition and market power in natural gas transportation markets and in the pharmaceutical market for chemotherapy drugs. Additional topics include an empirical analysis of NFL football demand, the accuracy of an econometric model for mail demand, and the allocation of police services in rural Alaska. Other chapters consider the valuation of technology patents and the determination of patent scope, duration, and reasonable royalty, and the reaction of financial markets to health scares in the fast-food industry. Finally, two chapters are devoted to the theory and testing of synergistic health effects from the combined exposure to asbestos and cigarette smoking.
Commerce, Complexity, and Evolution is a significant contribution to the paradigm, straddling economics, finance, marketing, and management, which acknowledges that commercial systems are evolutionary, and must therefore be analysed using evolutionary tools. Evolutionary systems display complicated behaviours which are to a significant degree generated endogenously, rather than being solely the product of exogenous shocks, hence the conjunction of complexity with evolution. This volume considers a wide range of systems, from the entire economy at one extreme to the behaviour of single markets at the other. The papers are united by methodologies which at their core are evolutionary, though the techniques cover a wide range, from philosophical discourse to differential equations, genetic algorithms, multi-agent simulations and cellular automata. Issues considered include the dynamics of debt-deflation, stock management in a complex environment, interactions between consumers and their effect upon market behaviour, and nonlinear methods to profit from financial market volatility.
Written to complement the second edition of the best-selling textbook Introductory Econometrics for Finance, this book provides a comprehensive introduction to the use of the Regression Analysis of Time Series (RATS) software for modelling in finance and beyond. It provides numerous worked examples with carefully annotated code and detailed explanations of the outputs, giving readers the knowledge and confidence to use the software for their own research and to interpret their own results. A wide variety of important modelling approaches are covered, including such topics as time-series analysis and forecasting, volatility modelling, limited dependent variable and panel methods, switching models and simulation methods. The book is supported by an accompanying website containing freely downloadable data and RATS instructions.
Spatial Microeconometrics introduces the reader to the basic concepts of spatial statistics, spatial econometrics and the spatial behavior of economic agents at the microeconomic level. Incorporating useful examples and presenting real data and datasets on real firms, the book takes the reader through the key topics in a systematic way. The book outlines the specificities of data that represent a set of interacting individuals with respect to traditional econometrics that treat their locational choices as exogenous and their economic behavior as independent. In particular, the authors address the consequences of neglecting such important sources of information on statistical inference and how to improve the model predictive performances. The book presents the theory, clarifies the concepts and instructs the readers on how to perform their own analyses, describing in detail the codes which are necessary when using the statistical language R. The book is written by leading figures in the field and is completely up to date with the very latest research. It will be invaluable for graduate students and researchers in economic geography, regional science, spatial econometrics, spatial statistics and urban economics.
Agent-Based Computer Simulation of Dichotomous Economic Growth reports a project in agent-based computer simulation of processes of economic growth in a population of boundedly rational learning agents. The study is an exercise in comparative simulation. That is, the same family of growth models will be simulated under different assumptions about the nature of the learning process and details of the production and growth processes. The purpose of this procedure is to establish a relationship between the assumptions and the simulation results. The study brings together a number of theoretical and technical developments, only some of which may be familiar to any particular reader. In this first chapter, some issues in economic growth are reviewed and the objectives of the study are outlined. In the second chapter, the simulation techniques are introduced and illustrated with baseline simulations of boundedly rational learning processes that do not involve the complications of dealing with long-run economic growth. The third chapter sketches the consensus modern theory of economic growth which is the starting point for further study. In the fourth chapter, a family of steady growth models is simulated, bringing the simulation, growth and learning aspects of the study together. In subsequent chapters, variants on the growth model are explored in a similar way. The ninth chapter introduces trade, with a spatial trading model that is combined with the growth model in the tenth chapter. The book returns again and again to the key question: to what extent can the simulations `explain' the puzzles of economic growth, and particularly the key puzzle of dichotomization, by constructing growth and learning processes that produce the puzzling results? And just what assumptions of the simulations are most predictably associated with the puzzling results?
Structural Funds: Growth, Employment and the Environment is a book on the role of transfers designed for assisting sustainable development of less developed regions within the European Union. The book places special emphasis on the future path of the Greek economy and discusses likely outcomes, related directly to the impact of these transfers, in: * Growth and macroeconomic convergence * Employment in key sectors of the economy * Energy demand and its environmental aspect. The book uses macroeconomic modelling and modern applied econometric techniques to analyze these issues, thus offering a coherent methodological framework for their presentation. To this extent, Structural Funds: Growth, Employment and the Environment can be of use to: * Academic researchers and economists in recipient countries, who can gain a better understanding of how national authorities can best design and implement the strategic allocation and utilization of these funds to maximize the benefits for the domestic economy * Policymakers in the European Union, by offering a sound and rigorously elaborated treatment which can be applied as an estimation and comparison tool for the effects of Structural Funds both at the national and the international level * Economists in Eastern European countries which are at the pre-accession stage and will be eligible for this type of transfer in the near future.
Financial globalization has increased the significance of methods used in the evaluation of country risk, one of the major research topics in economics and finance. Written by experts in the fields of multicriteria methodology, credit risk assessment, operations research, and financial management, this book develops a comprehensive framework for evaluating models based on several classification techniques that emerge from different theoretical directions. This book compares different statistical and data mining techniques, noting the advantages of each method, and introduces new multicriteria methodologies that are important to country risk modeling. Key topics include: (1) A review of country risk definitions and an overview of the most recent tools in country risk management, (2) In-depth analysis of statistical, econometric and non-parametric classification techniques, (3) Several real-world applications of the methodologies described throughout the text, (4) Future research directions for country risk assessment problems. This work is a useful toolkit for economists, financial managers, bank managers, operations researchers, management scientists, and risk analysts. Moreover, the book can also be used as a supplementary text for graduate courses in finance and financial risk management.
An observational study is an empirical investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. Design of Observational Studies is divided into four parts. Chapters 2, 3, and 5 of Part I cover concisely, in about one hundred pages, many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates. Part II includes a chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies, "make your theories elaborate." The second edition of his book, Observational Studies, was published by Springer in 2002.
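The matched comparisons described in Part II can be given a toy illustration: greedy 1:1 nearest-neighbour matching on an estimated propensity score. This is a deliberately simplified sketch (the book treats optimal matching and covariate balance in far more depth); the helper `propensity_match` and the toy scores are invented for illustration.

```python
import numpy as np

def propensity_match(ps_treated, ps_control):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.

    Each treated unit, in order, is paired with the closest control
    unit not yet used; returns (treated_index, control_index) pairs.
    """
    available = list(range(len(ps_control)))
    pairs = []
    for i, p in enumerate(ps_treated):
        j = min(available, key=lambda k: abs(ps_control[k] - p))
        pairs.append((i, j))
        available.remove(j)  # match without replacement
    return pairs

# Toy propensity scores: each treated unit claims its nearest control
treated_scores = np.array([0.8, 0.3])
control_scores = np.array([0.25, 0.5, 0.78, 0.1])
print(propensity_match(treated_scores, control_scores))  # [(0, 2), (1, 0)]
```

Greedy matching is order-dependent, which is one reason the book emphasizes checking covariate balance after matching rather than trusting the algorithm.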
"Mathematical Optimization and Economic Analysis" is a self-contained introduction to various optimization techniques used in economic modeling and analysis such as geometric, linear, and convex programming and data envelopment analysis. Through a systematic approach, this book demonstrates the usefulness of these mathematical tools in quantitative and qualitative economic analysis. The book presents specific examples to demonstrate each technique's advantages and applicability as well as numerous applications of these techniques to industrial economics, regulatory economics, trade policy, economic sustainability, production planning, and environmental policy. Key Features include: - A detailed presentation of both single-objective and multiobjective optimization; - An in-depth exposition of various applied optimization problems; - Implementation of optimization tools to improve the accuracy of various economic models; - Extensive resources suggested for further reading. This book is intended for graduate and postgraduate students studying quantitative economics, as well as economics researchers and applied mathematicians. Requirements include a basic knowledge of calculus and linear algebra, and a familiarity with economic modeling.
This book covers recent advances in efficiency evaluations, most notably Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) methods. It introduces the underlying theories, shows how to make the relevant calculations and discusses applications. The aim is to make the reader aware of the pros and cons of the different methods and to show how to use these methods in both standard and non-standard cases. Several software packages have been developed to solve some of the most common DEA and SFA models. This book relies on R, a free, open source software environment for statistical computing and graphics. This enables the reader to solve not only standard problems, but also many other problem variants. Using R, one can focus on understanding the context and developing a good model. One is not restricted to predefined model variants and to a one-size-fits-all approach. To facilitate the use of R, the authors have developed an R package called Benchmarking, which implements the main methods within both DEA and SFA. The book uses mathematical formulations of models and assumptions, but it de-emphasizes the formal proofs, in part by placing them in appendices or by referring to the original sources. Moreover, the book emphasizes the use of the theories and the interpretations of the mathematical formulations. It includes a series of small examples, graphical illustrations, simple extensions and questions to think about. Also, it combines the formal models with less formal economic and organizational thinking. Last but not least, it discusses some larger applications with significant practical impacts, including the design of benchmarking-based regulations of energy companies in different European countries, and the development of merger control programs for competition authorities.
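To give a flavour of the DEA side, the input-oriented CCR model can be solved as one small linear program per decision-making unit: minimise the input contraction factor theta subject to a convex-cone reference technology. This sketch uses Python and scipy rather than the authors' Benchmarking R package; the function name and toy data are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR (constant returns) efficiency scores.

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). For each unit o,
    solve: min theta  s.t.  X'lam <= theta * x_o,  Y'lam >= y_o,  lam >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]                 # minimise theta
        A_inputs = np.c_[-X[o].reshape(m, 1), X.T]  # sum lam_j x_j <= theta x_o
        A_outputs = np.c_[np.zeros((s, 1)), -Y.T]   # sum lam_j y_j >= y_o
        A_ub = np.vstack([A_inputs, A_outputs])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.fun)
    return np.array(scores)

# Three firms, one input, one output: firm 0 defines the frontier
X = np.array([[2.0], [4.0], [4.0]])
Y = np.array([[2.0], [2.0], [1.0]])
print(dea_ccr_input(X, Y))  # [1.0, 0.5, 0.25]
```

Firm 1 produces the same output as firm 0 with twice the input, hence a score of 0.5; packages like Benchmarking add variable returns to scale, slacks, and bootstrapped confidence intervals on top of this core LP.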
This book links the questions people ask about why things exist, why the world is the way it is, and whether and how it is possible to change their society or world with the societal myths they develop and teach to answer those questions and organize and bring order to their communal lives. It also is about the need for change in western societies' current organizing concept, classical (Lockean) liberalism. Despite the attempts of numerous insightful political thinkers, the myth of classical liberalism has developed so many cracks that it cannot be put back together again. If not entirely failed, it is at this point unsalvageable in its present form. Never the thought of just one person, the liberal model of individual religious, political, and economic freedom developed over hundreds of years, starting with Martin Luther's dictum that every man should be his own priest. Although classical liberalism means different things to different people, at its most basic level this model sees human beings as individuals who exist prior to government and have rights over government and the social good. That is, the individual right always trumps the moral and social good, and individuals have few obligations to one another unless they actively choose to undertake them. Possibility's Parents argues that Lockean liberalism has reached the end of its logic in ways that make it unable to handle the western world's most pressing problems, and that novelists whose writing includes the form and texture of myth have important insights to offer on the way forward.
Getting Started with a SIMPLIS Approach is particularly appropriate for those users who are not experts in statistics, but have a basic understanding of multivariate analysis that would allow them to use this handbook as a good first foray into LISREL. Part I introduces the topic, presents the study that serves as the background for the explanation of matters, and provides the basis for Parts II and III, which, in turn, explain the process of estimation of the measurement model and the structural model, respectively. In each section, we also suggest essential literature to support the utilization of the handbook. After having read the book, readers will have acquired a basic knowledge of structural equation modeling, namely using the LISREL program, and will be prepared to continue with the learning process.
Articles on econometric methodology with special reference to the quantification of poverty and economic inequality are presented in this book. Poverty and inequality measurement present special problems to the econometrician, and most of these papers analyze how to attack those problems.
Formal decision and evaluation models are so widespread that almost no one can pretend not to have used or suffered the consequences of one of them. This book is a guide aimed at helping the analyst to choose a model and use it consistently. A sound analysis of techniques is proposed and the presentation can be extended to most decision and evaluation models as a "decision aiding methodology."
This book contains some of the results from the research project "Demand for Food in the Nordic Countries," which was initiated in 1988 by Professor Olof Bolin of the Agricultural University in Ultuna, Sweden and by Professor Karl Johan Weckman of the University of Helsinki, Finland. A pilot study was carried out by Bengt Assarsson, which in 1989 led to a successful application for a research grant from the NKJ (The Nordic Contact Body for Agricultural Research) through the national research councils for agricultural research in Denmark, Finland, Norway and Sweden. We are very grateful to Olof Bolin and Karl Johan Weckman, without whom this project would not have come about, and to the national research councils in the Nordic countries for the generous financial support we have received for this project. We have received comments and suggestions from many colleagues, and this has improved our work substantially. At the start of the project a reference group was formed, consisting of Professor Olof Bolin, Professor Anders Klevmarken, Agr. lic. Gert Aage Nielsen, Professor Karl Johan Weckman and Cand. oecon. Per Halvor Vale. Gert Aage Nielsen left the group early in the project for a position in Landbanken, and was replaced by Professor Lars Otto, while Per Halvor Vale soon joined the research staff. The reference group has given us useful suggestions and encouraged us in our work. We are very grateful to them.