The book deals with collusion between firms on both sides of a market that is immune to deviations by coalitions. We study this issue using an infinitely repeated game with discounting of future single-period payoffs. The main solution concept we apply is a strict strong perfect equilibrium: no coalition of players in any subgame can weakly Pareto-improve the vector of continuation average discounted payoffs of its members by deviating. If the sum of firms' average discounted profits is maximized along the equilibrium path, then the equilibrium output of each type of good is produced at the lowest possible cost. If, in addition, all buyers are retailers (i.e., they resell the goods purchased in the analyzed market in a retail market), then the equilibrium vector of quantities sold in the retail market is sold at the lowest possible selling cost. We specify sufficient conditions under which collusion increases consumer welfare.
Financial globalization has increased the significance of methods used in the evaluation of country risk, one of the major research topics in economics and finance. Written by experts in the fields of multicriteria methodology, credit risk assessment, operations research, and financial management, this book develops a comprehensive framework for evaluating models based on several classification techniques that emerge from different theoretical directions. This book compares different statistical and data mining techniques, noting the advantages of each method, and introduces new multicriteria methodologies that are important to country risk modeling. Key topics include: (1) A review of country risk definitions and an overview of the most recent tools in country risk management, (2) In-depth analysis of statistical, econometric and non-parametric classification techniques, (3) Several real-world applications of the methodologies described throughout the text, (4) Future research directions for country risk assessment problems. This work is a useful toolkit for economists, financial managers, bank managers, operations researchers, management scientists, and risk analysts. Moreover, the book can also be used as a supplementary text for graduate courses in finance and financial risk management.
An observational study is an empirical investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. Design of Observational Studies is divided into four parts. Chapters 2, 3, and 5 of Part I cover concisely, in about one hundred pages, many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates. Part II includes a chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies, "make your theories elaborate." The second edition of Rosenbaum's earlier book, Observational Studies, was published by Springer in 2002.
"Mathematical Optimization and Economic Analysis" is a self-contained introduction to various optimization techniques used in economic modeling and analysis such as geometric, linear, and convex programming and data envelopment analysis. Through a systematic approach, this book demonstrates the usefulness of these mathematical tools in quantitative and qualitative economic analysis. The book presents specific examples to demonstrate each technique's advantages and applicability as well as numerous applications of these techniques to industrial economics, regulatory economics, trade policy, economic sustainability, production planning, and environmental policy. Key Features include: - A detailed presentation of both single-objective and multiobjective optimization; - An in-depth exposition of various applied optimization problems; - Implementation of optimization tools to improve the accuracy of various economic models; - Extensive resources suggested for further reading. This book is intended for graduate and postgraduate students studying quantitative economics, as well as economics researchers and applied mathematicians. Requirements include a basic knowledge of calculus and linear algebra, and a familiarity with economic modeling.
This book provides cutting-edge research results and application experiences from researchers and practitioners in multiple criteria decision making areas. It consists of three parts: MCDM Foundation and Theory, MCDM Methodology, and MCDM Applications. Part I covers the historical MCDM development, the influence of MCDM on technology, society and policy, Pareto optimization, and the analytic hierarchy process. In Part II, the book presents different MCDM algorithms based on techniques of robust estimation, evolutionary multiobjective optimization, Choquet integrals, and genetic search. In Part III, the book demonstrates a variety of MCDM applications, including project management, financial investment, credit risk analysis, railway transportation, online advertising, transport infrastructure, environmental pollution, the chemical industry, and regional economy. The 17 papers of the book have been selected from the 121 accepted papers at the 20th International Conference on Multiple Criteria Decision Making, "New State of MCDM in 21st Century," held at Chengdu, China, in 2009. The 35 contributors of these papers come from 10 countries.
The estimation and validation of the Basel II risk parameters PD (probability of default), LGD (loss given default), and EAD (exposure at default) is an important problem in banking practice. These parameters are used as inputs to credit portfolio models and loan pricing frameworks on the one hand, and to compute regulatory capital according to the new Basel rules on the other. This book covers the state of the art in designing and validating rating systems and default probability estimations. Furthermore, it presents techniques to estimate LGD and EAD and includes a chapter on stress testing of the Basel II risk parameters. The second edition is extended by three chapters explaining how the Basel II risk parameters can be used for building a framework for risk-adjusted pricing and risk management of loans.
To derive rational and convincing solutions to practical decision making problems in complex and hierarchical human organizations, the decision making problems are formulated as relevant mathematical programming problems which are solved by developing optimization techniques so as to exploit characteristics or structural features of the formulated problems. In particular, for resolving conflict in decision making in hierarchical managerial or public organizations, the multi-level formulation of the mathematical programming problems has often been employed together with the solution concept of Stackelberg equilibrium. However, we conceive that the pair of the conventional formulation and the solution concept is not always sufficient to cope with the large variety of decision making situations in actual hierarchical organizations. The following issues should be taken into consideration in the expression and formulation of decision making problems. In the formulation of mathematical programming problems, it is tacitly supposed that decisions are made by a single person, while game theory deals with the economic behavior of multiple decision makers with fully rational judgment. Because two-level mathematical programming problems are interpreted as static Stackelberg games, multi-level mathematical programming is relevant to noncooperative game theory; in conventional multi-level mathematical programming models employing the solution concept of Stackelberg equilibrium, it is assumed that there is no communication among decision makers, or that they do not make any binding agreement even if such communication exists. However, for decision making problems in, for example, decentralized large firms with divisional independence, it is quite natural to suppose that there exists communication and some cooperative relationship among the decision makers.
This book explains in simple settings the fundamental ideas of financial market modelling and derivative pricing, using the no-arbitrage principle. Relatively elementary mathematics leads to powerful notions and techniques - such as viability, completeness, self-financing and replicating strategies, arbitrage and equivalent martingale measures - which are directly applicable in practice. The general methods are applied in detail to pricing and hedging European and American options within the Cox-Ross-Rubinstein (CRR) binomial tree model. A simple approach to discrete interest rate models is included, which, though elementary, has some novel features. All proofs are written in a user-friendly manner, with each step carefully explained and following a natural flow of thought. In this way the student learns how to tackle new problems.
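The CRR recipe the blurb refers to can be sketched in a few lines: build up and down factors from the volatility, compute the risk-neutral probability, and roll the option value backwards through the tree. The parameter values below are illustrative, not taken from the book; as the number of steps grows, the price converges to the Black-Scholes value (about 10.45 for these inputs).

```python
import math

def crr_call(S0, K, r, sigma, T, n):
    """European call price under the Cox-Ross-Rubinstein binomial model."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1 / u                             # down factor
    disc = math.exp(-r * dt)              # one-period discount
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    # Terminal payoffs at the n+1 leaves of the tree.
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    # Backward induction: discounted risk-neutral expectation at each node.
    for step in range(n, 0, -1):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(step)]
    return values[0]

print(round(crr_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200), 2))
```

The same backward-induction loop, with an early-exercise comparison at each node, prices American options; that extension is one of the book's main worked applications.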
This book will interest and assist people who are dealing with the problems of prediction of time series in higher education and research. It will greatly assist people who apply time series theory to practical problems in their work and also serve as a textbook for postgraduate students in statistics, economics and related subjects.
The optimisation of economic systems over time, and in an uncertain environment, is central to the study of economic behaviour. The behaviour of rational decision makers, whether they are market agents, firms, or governments and their agencies, is governed by decisions designed to secure the best outcomes subject to the perceived information and economic responses (including those of other agents). Economic behaviour has therefore to be analysed in terms of the outcomes of a multiperiod stochastic optimisation process containing four main components: the economic responses (the dynamic constraints, represented by an economic model); the objective function (the goals and their priorities); the conditioning information (expected exogenous events and the expected future state of the economy); and risk management (how uncertainties are accommodated). The papers presented in this book all analyse some aspect of economic behaviour related to the objectives, information, or risk components of the decision process. While the construction of economic models obviously also has a vital role to play, that component has received much greater (or almost exclusive) attention elsewhere. These papers examine optimising behaviour in a wide range of economic problems, both theoretical and applied. They reflect a variety of concerns: economic responses under rational expectations; the Lucas critique and optimal fiscal or monetary policies; market management; partly endogenous goals; evaluating government reactions; locational decisions; uncertainty and information structures; and forecasting with endogenous reactions.
Getting Started with a SIMPLIS Approach is particularly appropriate for those users who are not experts in statistics but have a basic understanding of multivariate analysis that would allow them to use this handbook as a good first foray into LISREL. Part I introduces the topic, presents the study that serves as the background for the explanations that follow, and provides the basis for Parts II and III, which, in turn, explain the process of estimating the measurement model and the structural model, respectively. In each section, we also suggest essential literature to support the use of the handbook. After having read the book, readers will have acquired a basic knowledge of structural equation modeling, namely using the LISREL program, and will be prepared to continue with the learning process.
This book provides a game theoretic model of interaction among VoIP telecommunications providers regarding their willingness to enter peering agreements with one another. The author shows that the incentive to peer is generally based on savings from otherwise payable long distance fees. At the same time, termination fees can have a countering and dominant effect, resulting in an environment in which VoIP firms decide against peering. Various scenarios of peering and rules for allocation of the savings are considered. The first part covers the relevant aspects of game theory and network theory, trying to give an overview of the concepts required in the subsequent application. The second part of the book introduces first a model of how the savings from peering can be calculated and then turns to the actual formation of peering relationships between VoIP firms. The conditions under which firms are willing to peer are then described, considering the possible influence of a regulatory body.
Since its establishment in the 1950s the American Economic Association's Committee on Economic Education has sought to promote improved instruction in economics and to facilitate this objective by stimulating research on the teaching of economics. These efforts are most apparent in the sessions on economic education that the Committee organizes at the Association's annual meetings. At these sessions economists interested in economic education have opportunities to present new ideas on teaching and research and also to report the findings of their research. The record of this activity can be found in the Proceedings of the American Economic Review. The Committee on Economic Education and its members have been actively involved in a variety of other projects. In the early 1960s it organized the National Task Force on Economic Education that spurred the development of economics teaching at the precollege level. This in turn led to the development of a standardized research instrument, a high school test of economic understanding. This was followed later in the 1960s by the preparation of a similar test of understanding of college economics. The development of these two instruments greatly facilitated research on the impact of economics instruction, opened the way for application of increasingly sophisticated statistical methods in measuring the impact of economic education, and initiated a steady stream of research papers on a subject that previously had not been explored.
This book presents articles on econometric methodology with special reference to the quantification of poverty and economic inequality. Poverty and inequality measurement present special problems to the econometrician, and most of these papers analyze how to attack those problems.
This work grew out of a series of investigations begun by the authors in 1980 and 1981. Specifically, the authors pursued two lines of inquiry. First, to advance the state of the theoretical literature to better explain the crises of liberalization which seemed to be afflicting the third world in general and Latin America in particular. To do this, several different kinds of models were investigated and adapted. These are presented in Chapters 2, 3 and 5. Secondly, an analysis of the empirical evidence was conducted in order to gain insight into the processes that were thought to be occurring and the theoretical models that were being developed. Some of this work appears in Chapters 3, 4, 5 and 6. Other work by the authors on these issues has been published elsewhere and is referenced herein. There are a great many people whose work and whose comments have influenced this work. We would like to especially thank Guillermo Calvo, Michael Connolly, Sebastian Edwards, Roque Fernandez, Michael Darby, Robert Clower, Neil Wallace, John Kareken, Paul McNelis, Jeffrey Nugent, Jaime Marquez, Lee Ohanian, Leroy Laney, Jorge Braga de Macedo, Dale Henderson, Matthew Canzoneiri, Arthur Laffer, Marc Miles, and George Von Furstenberg, whose ideas and comments gave rise to much of our work. We would like to thank Suh Lee for his assistance with the computations in Chapter 5.
This book contains some of the results from the research project "Demand for Food in the Nordic Countries," which was initiated in 1988 by Professor Olof Bolin of the Agricultural University in Ultuna, Sweden and by Professor Karl Iohan Weckman of the University of Helsinki, Finland. A pilot study was carried out by Bengt Assarsson, which in 1989 led to a successful application for a research grant from the NKJ (The Nordic Contact Body for Agricultural Research) through the national research councils for agricultural research in Denmark, Finland, Norway and Sweden. We are very grateful to Olof Bolin and Karl Iohan Weckman, without whom this project would not have come about, and to the national research councils in the Nordic countries for the generous financial support we have received for this project. We have received comments and suggestions from many colleagues, and this has improved our work substantially. At the start of the project a reference group was formed, consisting of Professor Olof Bolin, Professor Anders Klevmarken, Agr. lic. Gert Aage Nielsen, Professor Karl Iohan Weckman and Cand. oecon. Per Halvor Vale. Gert Aage Nielsen left the group early in the project for a position in Landbanken, and was replaced by Professor Lars Otto, while Per Halvor Vale soon joined the research staff. The reference group has given us useful suggestions and encouraged us in our work. We are very grateful to them.
A unique blend of asymptotic theory and small-sample practice through simulation experiments and data analysis. Novel reproducing kernel Hilbert space methods for the analysis of smoothing splines and local polynomials, leading to uniform error bounds and honest confidence bands for the mean function using smoothing splines. An exhaustive exposition of algorithms, including the Kalman filter, for the computation of smoothing splines of arbitrary order.
This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.
Both in insurance and in finance applications, questions involving extremal events (such as large insurance claims, large fluctuations in financial data, stock market shocks, risk management, ...) play an increasingly important role. This book sets out to bridge the gap between the existing theory and practical applications, both from a probabilistic as well as from a statistical point of view. Whatever new theory is presented is always motivated by relevant real-life examples. The numerous illustrations and examples, and the extensive bibliography, make this book an ideal reference text for students, teachers, and industry users of extremal event methodology.
Formal decision and evaluation models are so widespread that almost no one can pretend not to have used or suffered the consequences of one of them. This book is a guide aimed at helping the analyst to choose a model and use it consistently. A sound analysis of techniques is proposed and the presentation can be extended to most decision and evaluation models as a "decision aiding methodology."
The origins of this volume can be traced back to a conference on "Ethics, Economic and Business" organized by Columbia Business School in March of 1993, and held in the splendid facilities of Columbia's Casa Italiana. Preliminary versions of several of the papers were presented at that meeting. In July 1994 the Fields Institute of Mathematical Sciences sponsored a workshop on "Geometry, Topology and Markets"; additional papers and more refined versions of the original papers were presented there. They were published in their present versions in Social Choice and Welfare, volume 14, number 2, 1997. The common aim of these workshops and this volume is to crystallize research in an area which has emerged rapidly in the last fifteen years, the area of topological approaches to social choice and the theory of games. The area is attracting increasing interest from social choice theorists, game theorists, mathematical economists and mathematicians, yet there is no authoritative collection of papers in the area. Nor is there any survey or book to give a perspective and act as a guide to the issues in and contributions to this new area. One of the two aims of this volume is in some measure to play this role; the other aim is of course to present interesting and surprising new results.
The Econometric Analysis of Network Data serves as an entry point for advanced students, researchers, and data scientists seeking to perform effective analyses of networks, especially inference problems. It introduces the key results and ideas in an accessible, yet rigorous way. While a multi-contributor reference, the work is tightly focused and disciplined, providing latitude for varied specialties in one authorial voice.
Financial econometrics combines mathematical and statistical theory and techniques to understand and solve problems in financial economics. Modeling and forecasting financial time series, such as prices, returns, interest rates, financial ratios, and defaults, are important parts of this field. In Financial Econometrics, you'll be introduced to this growing discipline and the concepts associated with it, from background material on probability theory and statistics to information regarding the properties of specific models and their estimation procedures. With this book as your guide, you'll become familiar with: autoregressive conditional heteroskedasticity (ARCH) and GARCH modeling; principal components analysis (PCA) and factor analysis; stable processes and ARMA and GARCH models with fat-tailed errors; robust estimation methods; vector autoregressive and cointegrated processes, including advanced estimation methods for cointegrated systems; and much more. The experienced author team of Svetlozar Rachev, Stefan Mittnik, Frank Fabozzi, Sergio Focardi, and Teo Jasic not only presents you with an abundant amount of information on financial econometrics, but they also walk you through a wide array of examples to solidify your understanding of the issues discussed. Filled with in-depth insights and expert advice, Financial Econometrics provides comprehensive coverage of this discipline and clear explanations of how the models associated with it fit into today's investment management process.
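The volatility clustering that ARCH/GARCH models capture can be illustrated with a simulation. In a GARCH(1,1) process, today's conditional variance is a weighted combination of a constant, yesterday's squared return, and yesterday's variance. The parameter values below are our own illustration, not drawn from the book:

```python
import math
import random

def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=42):
    """Simulate n returns r_t = sigma_t * z_t, z_t ~ N(0, 1), with
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = random.Random(seed)
    # Start at the unconditional variance omega / (1 - alpha - beta).
    var = omega / (1 - alpha - beta)
    returns, r_prev = [], 0.0
    for _ in range(n):
        var = omega + alpha * r_prev**2 + beta * var
        r_prev = math.sqrt(var) * rng.gauss(0, 1)
        returns.append(r_prev)
    return returns

rets = simulate_garch11(5000)
print(len(rets))  # 5000
```

With alpha + beta = 0.95 < 1 the process is covariance-stationary, and the sample variance of a long simulated path hovers around the unconditional variance of 1.0; fitting such models to real return series is where the book's estimation chapters come in.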
Survival analysis is a highly active area of research with applications spanning the physical, engineering, biological, and social sciences. In addition to statisticians and biostatisticians, researchers in this area include epidemiologists, reliability engineers, demographers and economists. Economists know survival analysis under the names of duration analysis and the analysis of transition data. We attempted to bring together leading researchers, with a common interest in developing methodology in survival analysis, at the NATO Advanced Research Workshop. The research works collected in this volume are based on the presentations at the Workshop. Analysis of survival experiments is complicated by issues of censoring, where only partial observation of an individual's life length is available, and left truncation, where individuals enter the study group only if their life lengths exceed a given threshold time. Application of the theory of counting processes to survival analysis, as developed by the Scandinavian School, has allowed for substantial advances in the procedures for analyzing such experiments. The increased use of computer-intensive solutions to inference problems in survival analysis, in both the classical and Bayesian settings, is also evident throughout the volume. Several areas of research have received special attention in the volume.