O. Guvenen, University of Paris IX-Dauphine. The aim of this publication is to present recent developments in international commodity market model building and policy analysis. This book is based mainly on the research presented at the XIIth International Conference organised by the Applied Econometric Association (AEA), which was held at the University of Zaragoza in Spain. This conference would not have been possible without the cooperation of the Department of Econometrics of the University of Zaragoza and its Chairman A.A. Grasa. I would like to express my thanks to all contributors. I am grateful to J.H.P. Paelinck, J.P. Ancot, A.J. Hughes Hallett and H. Serbat for their constructive contributions and comments concerning the structure of the book. INTRODUCTION O. Guvenen The challenge of increasing complexity and global interdependence at the world level necessitates new modelling approaches and policy analysis at the macroeconomic level, and for commodities. The evolution of economic modelling follows the evolution of international economic phenomena. In that interdependent context there is a growing need for forecasting and simulation tools in the analysis of international primary commodity markets.
These proceedings, from a conference held at the Federal Reserve Bank of St. Louis on October 17-18, 1991, attempted to lay out what we currently know about aggregate economic fluctuations. Identifying what we know inevitably reveals what we do not know about such fluctuations as well. From the vantage point of where the conference's participants view our current understanding to be, these proceedings can be seen as suggesting an agenda for further research. The conference was divided into five sections. It began with the formulation of an empirical definition of the "business cycle" and a recitation of the stylized facts that must be explained by any theory that purports to capture the business cycle's essence. After outlining the historical development and key features of the current "theories" of business cycles, the conference evaluated these theories on the basis of their ability to explain the facts. Included in this evaluation was a discussion of whether (and how) the competing theories could be distinguished empirically. The conference then examined the implications for policy of what is known and not known about business cycles. A panel discussion closed the conference, highlighting important unresolved theoretical and empirical issues that should be taken up in future business cycle research. What Is a Business Cycle? Before gaining a genuine understanding of business cycles, economists must agree and be clear about what they mean when they refer to the cycle.
Max-Min problems are two-step allocation problems in which one side must make his move knowing that the other side will then learn what the move is and optimally counter. They are fundamental in particular to military weapons-selection problems involving large systems such as Minuteman or Polaris, where the systems in the mix are so large that they cannot be concealed from an opponent. One must then expect the opponent to determine on an optimal mixture of, in the case mentioned above, anti-Minuteman and anti-submarine effort. The author's first introduction to a problem of Max-Min type occurred at The RAND Corporation about 1951. One side allocates anti-missile defenses to various cities. The other side observes this allocation and then allocates missiles to those cities. If F(x, y) denotes the total residual value of the cities after the attack, with x denoting the defender's strategy and y the attacker's, the problem is then to find Max_x Min_y F(x, y).
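The two-step structure of the city-defense problem can be sketched by brute force: the defender commits first, the attacker observes and minimizes F, and the defender picks the allocation whose worst case is best. The city values, force sizes, and the rule that a city is destroyed only when attacking missiles outnumber its interceptors are all toy assumptions for illustration, not from the book.

```python
def allocations(total, n):
    """Every way to split `total` identical units among n sites."""
    if n == 1:
        yield (total,)
        return
    for first in range(total + 1):
        for rest in allocations(total - first, n - 1):
            yield (first,) + rest

def residual_value(values, defense, attack):
    """F(x, y): a city survives unless attacking missiles outnumber its interceptors."""
    return sum(v for v, d, a in zip(values, defense, attack) if a <= d)

def max_min(values, defenders, attackers):
    """Defender commits to x; attacker observes x and minimizes F; the defender
    therefore solves Max_x Min_y F(x, y) by enumeration."""
    n = len(values)
    best = None
    for x in allocations(defenders, n):
        worst = min(residual_value(values, x, y) for y in allocations(attackers, n))
        if best is None or worst > best[0]:
            best = (worst, x)
    return best

# Three cities worth 5, 3, 2; 3 interceptors against 2 missiles.
print(max_min([5, 3, 2], defenders=3, attackers=2))   # → (7, (2, 0, 1))
```

Enumeration is only viable for tiny instances like this one, but it makes the order of moves explicit: the inner `min` is the attacker's best response to an observed defense.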
In this Element and its accompanying second Element, A Practical Introduction to Regression Discontinuity Designs: Extensions, Matias Cattaneo, Nicolas Idrobo, and Rocío Titiunik provide an accessible and practical guide for the analysis and interpretation of regression discontinuity (RD) designs that encourages the use of a common set of practices and facilitates the accumulation of RD-based empirical evidence. In this Element, the authors discuss the foundations of the canonical Sharp RD design, which has the following features: (i) the score is continuously distributed and has only one dimension, (ii) there is only one cutoff, and (iii) compliance with the treatment assignment is perfect. In the second Element, the authors discuss practical and conceptual extensions to this basic RD setup.
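A minimal simulation can make the canonical Sharp RD setup concrete: a continuous one-dimensional score, a single cutoff, and perfect compliance. The data-generating process, the ad-hoc bandwidth, and the effect size below are all assumed for illustration, and the plain local linear fits are a bare-bones stand-in for the robust, data-driven procedures the authors actually recommend.

```python
import numpy as np

rng = np.random.default_rng(0)
n, cutoff, tau = 1000, 0.0, 2.0           # tau: true treatment effect (assumed)
score = rng.uniform(-1, 1, n)             # continuous, one-dimensional score
treated = score >= cutoff                 # perfect compliance at the cutoff
y = 1.5 * score + tau * treated + rng.normal(0, 0.5, n)

h = 0.25                                  # ad-hoc bandwidth, not a data-driven choice
left = (score < cutoff) & (score >= cutoff - h)
right = treated & (score <= cutoff + h)

# Local linear fit on each side; the intercepts approximate the two limits
# of E[y | score] at the cutoff, and their difference estimates tau.
slope_l, intercept_l = np.polyfit(score[left] - cutoff, y[left], 1)
slope_r, intercept_r = np.polyfit(score[right] - cutoff, y[right], 1)
rd_estimate = intercept_r - intercept_l
print(round(rd_estimate, 2))
```

With this sample size and noise level the estimate lands close to the assumed effect of 2; in practice bandwidth selection and bias correction matter, which is exactly what the Element covers.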
Macroeconomic Policy in the Canadian Economy investigates developments in Canada over the last forty years, using recent advances in the field of applied econometrics. In particular, the book analyzes the theoretical foundations of public sector activities and evaluates the several theories of government growth. Issues of convergence are also investigated as they manifest themselves in per capita income across Canadian provinces, and as to how successful government income equalization policies have been in furthering such convergence. Moreover, the openness of the Canadian economy is investigated in terms of the importance of exports to GDP growth and of its participation in an internationally integrated world capital market. The book also analyzes monetary policy issues and investigates the role of monetary aggregates and the effectiveness of monetary policy. Finally, it addresses the issue of the existence or not of electoral and partisan cycles in Canada, by incorporating both fiscal and monetary principles and applying them to the lively world of Canadian politics.
Empirical Studies In Applied Economics presents nine previously unpublished analyses in monograph form. In this work, the topics are presented so that each chapter stands on its own. The emphasis is on the applications but attention is also given to the econometric and statistical issues for advanced readers. Econometric methods include multivariate regression analysis, limited dependent variable analysis, and other maximum likelihood techniques. The empirical topics include the measurement of competition and market power in natural gas transportation markets and in the pharmaceutical market for chemotherapy drugs. Additional topics include an empirical analysis of NFL football demand, the accuracy of an econometric model for mail demand, and the allocation of police services in rural Alaska. Other chapters consider the valuation of technology patents and the determination of patent scope, duration, and reasonable royalty, and the reaction of financial markets to health scares in the fast-food industry. Finally, two chapters are devoted to the theory and testing of synergistic health effects from the combined exposure to asbestos and cigarette smoking.
Written to complement the second edition of best-selling textbook Introductory Econometrics for Finance, this book provides a comprehensive introduction to the use of the Regression Analysis of Time Series (RATS) software for modelling in finance and beyond. It provides numerous worked examples with carefully annotated code and detailed explanations of the outputs, giving readers the knowledge and confidence to use the software for their own research and to interpret their own results. A wide variety of important modelling approaches are covered, including such topics as time-series analysis and forecasting, volatility modelling, limited dependent variable and panel methods, switching models and simulations methods. The book is supported by an accompanying website containing freely downloadable data and RATS instructions.
Commerce, Complexity, and Evolution is a significant contribution to the paradigm - straddling economics, finance, marketing, and management - which acknowledges that commercial systems are evolutionary, and must therefore be analysed using evolutionary tools. Evolutionary systems display complicated behaviours which are to a significant degree generated endogenously, rather than being solely the product of exogenous shocks, hence the conjunction of complexity with evolution. This volume considers a wide range of systems, from the entire economy at one extreme to the behaviour of single markets at the other. The papers are united by methodologies which at their core are evolutionary, though the techniques cover a wide range, from philosophical discourse to differential equations, genetic algorithms, multi-agent simulations and cellular automata. Issues considered include the dynamics of debt-deflation, stock management in a complex environment, interactions between consumers and their effect upon market behaviour, and nonlinear methods to profit from financial market volatility.
Parallel Algorithms for Linear Models provides a complete and detailed account of the design, analysis and implementation of parallel algorithms for solving large-scale linear models. It investigates and presents efficient, numerically stable algorithms for computing the least-squares estimators and other quantities of interest on massively parallel systems. The monograph is in two parts. The first part consists of four chapters and deals with the computational aspects for solving linear models that have applicability in diverse areas. The remaining two chapters form the second part, which concentrates on numerical and computational methods for solving various problems associated with seemingly unrelated regression equations (SURE) and simultaneous equations models. The practical issues of the parallel algorithms and the theoretical aspects of the numerical methods will be of interest to a broad range of researchers working in the areas of numerical and computational methods in statistics and econometrics, parallel numerical algorithms, parallel computing and numerical linear algebra. The aim of this monograph is to promote research in the interface of econometrics, computational statistics, numerical linear algebra and parallelism.
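Although the monograph's focus is parallel implementation, the underlying numerical point can be illustrated serially: computing the least-squares estimator through an orthogonal (QR) factorization is more stable than forming and solving the normal equations, whose condition number is squared. The simulated data below are an illustrative assumption, not an example from the book.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))             # design matrix (simulated)
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.01 * rng.normal(size=200)

# QR-based solve: X = QR, so the least-squares estimator satisfies
# R beta = Q'y, avoiding the squared conditioning of X'X beta = X'y.
Q, R = np.linalg.qr(X)
beta_hat = np.linalg.solve(R, Q.T @ y)
print(np.round(beta_hat, 2))
```

The parallel algorithms the book studies distribute factorizations like this one across processors; the numerical motivation is the same.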
The three decades which have followed the publication of Heinz Neudecker's seminal paper 'Some Theorems on Matrix Differentiation with Special Reference to Kronecker Products' in the Journal of the American Statistical Association (1969) have witnessed the growing influence of matrix analysis in many scientific disciplines. Amongst these are the disciplines to which Neudecker has contributed directly - namely econometrics, economics, psychometrics and multivariate analysis. This book aims to illustrate how powerful the tools of matrix analysis have become as weapons in the statistician's armoury. The majority of its chapters are concerned primarily with theoretical innovations, but all of them have applications in view, and some of them contain extensive illustrations of the applied techniques. This book will provide research workers and graduate students with a cross-section of innovative work in the fields of matrix methods and multivariate statistical analysis. It should be of interest to students and practitioners in a wide range of subjects which rely upon modern methods of statistical analysis. The contributors to the book are themselves practitioners of a wide range of subjects including econometrics, psychometrics, educational statistics, computation methods and electrical engineering, but they find a common ground in the methods which are represented in the book. It is envisaged that the book will serve as an important work of reference and as a source of inspiration for some years to come.
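A workhorse result in this Kronecker-product tradition is the identity vec(AXB) = (B' ⊗ A) vec(X), which turns matrix equations into ordinary linear systems and underpins many matrix-differentiation rules. A quick numerical check with arbitrary random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 3))
X = rng.normal(size=(3, 4))
B = rng.normal(size=(4, 2))

# vec() stacks columns; NumPy flattens row-major by default, so use order="F".
def vec(M):
    return M.flatten(order="F")

lhs = vec(A @ X @ B)
rhs = np.kron(B.T, A) @ vec(X)
print(np.allclose(lhs, rhs))   # checks vec(AXB) = (B' ⊗ A) vec(X)
```

The same identity is what lets derivatives with respect to a matrix argument be written compactly as Kronecker products of the other factors.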
Structural Funds: Growth, Employment and the Environment is a book on the role of transfers designed for assisting sustainable development of less developed regions within the European Union. The book places special emphasis on the future path of the Greek economy and discusses likely outcomes, related directly to the impact of these transfers, in: * Growth and macroeconomic convergence * Employment in key sectors of the economy * Energy demand and its environmental aspect. The book uses macroeconomic modelling and modern applied econometric techniques to analyze these issues, thus offering a coherent methodological framework for their presentation. To this extent, Structural Funds: Growth, Employment and the Environment can be of use to: * Academic researchers and economists in recipient countries, who can gain a better understanding of how national authorities can best design and implement the strategic allocation and utilization of these funds to maximize the benefits for the domestic economy * Policymakers in the European Union, by offering a sound and rigorously elaborated treatment which can be applied as an estimation and comparison tool for the effects of Structural Funds at both the national and the international level * Economists in Eastern European countries which are at the pre-accession stage and will be eligible for this type of transfer in the near future.
Investment is crucial to the development of a nation's economy and welfare. In contrast to the situation in the United States, investment activity in Europe has been quite modest over the past few years. This volume gathers together a number of papers by prominent researchers in the field of investment. It provides an overview of recent developments in this area and presents new empirical findings on the determinants and implications of the investment process in European countries. Among the topics examined are the roles played by taxation, uncertainty and the financial systems, as well as the relevance of corporate governance to the investment process. Two chapters are dedicated to infrastructure investment and foreign direct investment.
Financial globalization has increased the significance of methods used in the evaluation of country risk, one of the major research topics in economics and finance. Written by experts in the fields of multicriteria methodology, credit risk assessment, operations research, and financial management, this book develops a comprehensive framework for evaluating models based on several classification techniques that emerge from different theoretical directions. This book compares different statistical and data mining techniques, noting the advantages of each method, and introduces new multicriteria methodologies that are important to country risk modeling. Key topics include: (1) A review of country risk definitions and an overview of the most recent tools in country risk management, (2) In-depth analysis of statistical, econometric and non-parametric classification techniques, (3) Several real-world applications of the methodologies described throughout the text, (4) Future research directions for country risk assessment problems. This work is a useful toolkit for economists, financial managers, bank managers, operations researchers, management scientists, and risk analysts. Moreover, the book can also be used as a supplementary text for graduate courses in finance and financial risk management.
An observational study is an empiric investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. Design of Observational Studies is divided into four parts. Chapters 2, 3, and 5 of Part I cover concisely, in about one hundred pages, many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates. Part II includes a chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies, "make your theories elaborate." The second edition of his book, Observational Studies, was published by Springer in 2002.
"Mathematical Optimization and Economic Analysis" is a self-contained introduction to various optimization techniques used in economic modeling and analysis such as geometric, linear, and convex programming and data envelopment analysis. Through a systematic approach, this book demonstrates the usefulness of these mathematical tools in quantitative and qualitative economic analysis. The book presents specific examples to demonstrate each technique's advantages and applicability as well as numerous applications of these techniques to industrial economics, regulatory economics, trade policy, economic sustainability, production planning, and environmental policy. Key Features include: - A detailed presentation of both single-objective and multiobjective optimization; - An in-depth exposition of various applied optimization problems; - Implementation of optimization tools to improve the accuracy of various economic models; - Extensive resources suggested for further reading. This book is intended for graduate and postgraduate students studying quantitative economics, as well as economics researchers and applied mathematicians. Requirements include a basic knowledge of calculus and linear algebra, and a familiarity with economic modeling.
In this testament to the distinguished career of H.S. Houthakker a number of Professor Houthakker's friends, former colleagues and former students offer essays which build upon and extend his many contributions to economics in aggregation, consumption, growth and trade. Among the many distinguished contributors are Paul Samuelson, Werner Hildenbrand, John Muellbauer and Lester Telser. The book also includes four previously unpublished papers and notes by its distinguished dedicatee.
To derive rational and convincing solutions to practical decision making problems in complex and hierarchical human organizations, the decision making problems are formulated as relevant mathematical programming problems which are solved by developing optimization techniques so as to exploit characteristics or structural features of the formulated problems. In particular, for resolving conflict in decision making in hierarchical managerial or public organizations, the multilevel formulation of the mathematical programming problems has often been employed together with the solution concept of Stackelberg equilibrium. However, we conceive that a pair of the conventional formulation and the solution concept is not always sufficient to cope with a large variety of decision making situations in actual hierarchical organizations. The following issues should be taken into consideration in expression and formulation of decision making problems. In formulation of mathematical programming problems, it is tacitly supposed that decisions are made by a single person, while game theory deals with economic behavior of multiple decision makers with fully rational judgment. Because two-level mathematical programming problems are interpreted as static Stackelberg games, multilevel mathematical programming is relevant to noncooperative game theory; in conventional multilevel mathematical programming models employing the solution concept of Stackelberg equilibrium, it is assumed that there is no communication among decision makers, or that they do not make any binding agreement even if such communication exists. However, for decision making problems in, for example, decentralized large firms with divisional independence, it is quite natural to suppose that there exists communication and some cooperative relationship among the decision makers.
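The static Stackelberg game behind two-level programming can be sketched by brute force: the leader commits to a decision, the follower observes it and best-responds with no binding agreement, and the leader optimizes while anticipating that response. The payoff functions and decision grids below are hypothetical illustrations, not taken from the book.

```python
import numpy as np

xs = np.linspace(0, 10, 101)   # leader's feasible decisions (illustrative grid)
ys = np.linspace(0, 10, 101)   # follower's feasible decisions

def follower_payoff(x, y):
    # Hypothetical objective whose best response is y = 8 - x.
    return -(y - (8 - x)) ** 2

def leader_payoff(x, y):
    # Hypothetical leader objective; along y = 8 - x it reduces to x * (4 - x).
    return x * (12 - 2 * x - y)

def stackelberg(xs, ys):
    best = None
    for x in xs:
        # Follower observes x and reacts optimally; no communication assumed.
        y_star = max(ys, key=lambda y: follower_payoff(x, y))
        val = leader_payoff(x, y_star)
        if best is None or val > best[0]:
            best = (val, x, y_star)
    return best

val, x, y = stackelberg(xs, ys)
print(round(x, 1), round(y, 1), round(val, 1))   # leader 2.0, follower 6.0, payoff 4.0
```

Replacing the inner best response with a jointly chosen (x, y) would model the cooperative, communicating divisions the passage argues the Stackelberg concept cannot capture.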
The optimisation of economic systems over time, and in an uncertain environment, is central to the study of economic behaviour. The behaviour of rational decision makers, whether they are market agents, firms, or governments and their agencies, is governed by decisions designed to secure the best outcomes subject to the perceived information and economic responses (including those of other agents). Economic behaviour has therefore to be analysed in terms of the outcomes of a multiperiod stochastic optimisation process containing four main components: the economic responses (the dynamic constraints, represented by an economic model); the objective function (the goals and their priorities); the conditioning information (expected exogenous events and the expected future state of the economy); and risk management (how uncertainties are accommodated). The papers presented in this book all analyse some aspect of economic behaviour related to the objectives, information, or risk components of the decision process. While the construction of economic models obviously also has a vital role to play, that component has received much greater (or almost exclusive) attention elsewhere. These papers examine optimising behaviour in a wide range of economic problems, both theoretical and applied. They reflect a variety of concerns: economic responses under rational expectations; the Lucas critique and optimal fiscal or monetary policies; market management; partly endogenous goals; evaluating government reactions; locational decisions; uncertainty and information structures; and forecasting with endogenous reactions.
This book will interest and assist people who are dealing with the problems of prediction of time series in higher education and research. It will greatly assist people who apply time series theory to practical problems in their work and also serve as a textbook for postgraduate students in statistics, economics and related subjects.
This book links the questions people ask about why things exist, why the world is the way it is, and whether and how it is possible to change their society or world with the societal myths they develop and teach to answer those questions and organize and bring order to their communal lives. It also is about the need for change in western societies’ current organizing concept, classical (Lockean) liberalism. Despite the attempts of numerous insightful political thinkers, the myth of classical liberalism has developed so many cracks that it cannot be put back together again. If not entirely failed, it is at this point unsalvageable in its present form. Never the thought of just one person, the liberal model of individual religious, political, and economic freedom developed over hundreds of years, starting with Martin Luther’s dictum that every man should be his own priest. Although classical liberalism means different things to different people, at its most basic level this model sees human beings as individuals who exist prior to government and have rights over government and the social good. That is, the individual right always trumps the moral and social good, and individuals have few obligations to one another unless they actively choose to undertake them. Possibility’s Parents argues that Lockean liberalism has reached the end of its logic in ways that make it unable to handle the western world’s most pressing problems and that novelists whose writing includes the form and texture of myth have important insights to offer on the way forward.
This book provides a game theoretic model of interaction among VoIP telecommunications providers regarding their willingness to enter peering agreements with one another. The author shows that the incentive to peer is generally based on savings from otherwise payable long distance fees. At the same time, termination fees can have a countering and dominant effect, resulting in an environment in which VoIP firms decide against peering. Various scenarios of peering and rules for allocation of the savings are considered. The first part covers the relevant aspects of game theory and network theory, trying to give an overview of the concepts required in the subsequent application. The second part of the book introduces first a model of how the savings from peering can be calculated and then turns to the actual formation of peering relationships between VoIP firms. The conditions under which firms are willing to peer are then described, considering the possible influence of a regulatory body.
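The core trade-off described above can be put in back-of-the-envelope form: peering is attractive to a firm only when its share of the long-distance savings exceeds the termination-fee revenue it forgoes. The function, the even split, and the numbers below are hypothetical illustrations, not the author's model.

```python
# Hypothetical two-firm peering check: each firm compares its allocated share
# of the long-distance savings against the termination revenue it gives up.
def peering_is_stable(savings, split, lost_termination):
    """savings: total long-distance fees avoided by peering;
    split: (share_a, share_b) allocation of those savings, summing to 1;
    lost_termination: termination-fee revenue each firm forgoes by peering."""
    share_a, share_b = split
    lost_a, lost_b = lost_termination
    return savings * share_a >= lost_a and savings * share_b >= lost_b

print(peering_is_stable(100, (0.5, 0.5), (30, 40)))   # True: both firms gain
print(peering_is_stable(100, (0.5, 0.5), (30, 60)))   # False: firm B loses
```

The second case illustrates the book's point that termination fees can dominate the savings motive and block peering; a different allocation rule for the savings can restore the incentive.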
Getting Started with a SIMPLIS Approach is particularly appropriate for those users who are not experts in statistics, but have a basic understanding of multivariate analysis that would allow them to use this handbook as a good first foray into LISREL. Part I introduces the topic, presents the study that serves as the background for the explanation of matters, and provides the basis for Parts II and III, which, in turn, explain the process of estimation of the measurement model and the structural model, respectively. In each section, we also suggest essential literature to support the utilization of the handbook. After having read the book, readers will have acquired a basic knowledge of structural equation modeling, namely using the LISREL program, and will be prepared to continue with the learning process.