The book deals with collusion between firms on both sides of a market that is immune to deviations by coalitions. We study this issue using an infinitely repeated game with discounting of future single-period payoffs. A strict strong perfect equilibrium is the main solution concept that we apply. It requires that no coalition of players in any subgame can weakly Pareto-improve the vector of continuation average discounted payoffs of its members by a deviation. If the sum of the firms' average discounted profits is maximized along the equilibrium path, then the equilibrium output of each type of good is produced at the lowest possible cost. If, in addition, all buyers are retailers (i.e., they resell the goods purchased in the analyzed market in a retail market), then the equilibrium vector of quantities sold in the retail market is sold at the lowest possible selling cost. We specify sufficient conditions under which collusion increases consumer welfare.
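The average discounted payoffs that this equilibrium concept compares can be made concrete with a small sketch. This is purely illustrative and not from the book; the function name and the truncation of the infinite stream at finitely many periods are assumptions of the example.

```python
# Illustrative only: the average discounted value
# (1 - delta) * sum_t delta^t * pi_t of a payoff stream,
# truncated at finitely many periods for computation.

def average_discounted_payoff(payoffs, delta):
    """Average discounted value of a (finite) stream of per-period payoffs."""
    total = sum((delta ** t) * p for t, p in enumerate(payoffs))
    return (1 - delta) * total

# A constant stream of 5.0 per period has average discounted value
# approaching 5.0 as the horizon grows.
print(round(average_discounted_payoff([5.0] * 1000, 0.9), 6))  # 5.0
```

The (1 - delta) normalization puts streams of different lengths on the same per-period scale, which is what allows deviation payoffs to be compared across subgames.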
Financial globalization has increased the significance of methods used in the evaluation of country risk, one of the major research topics in economics and finance. Written by experts in the fields of multicriteria methodology, credit risk assessment, operations research, and financial management, this book develops a comprehensive framework for evaluating models based on several classification techniques that emerge from different theoretical directions. This book compares different statistical and data mining techniques, noting the advantages of each method, and introduces new multicriteria methodologies that are important to country risk modeling. Key topics include: (1) A review of country risk definitions and an overview of the most recent tools in country risk management, (2) In-depth analysis of statistical, econometric and non-parametric classification techniques, (3) Several real-world applications of the methodologies described throughout the text, (4) Future research directions for country risk assessment problems. This work is a useful toolkit for economists, financial managers, bank managers, operations researchers, management scientists, and risk analysts. Moreover, the book can also be used as a supplementary text for graduate courses in finance and financial risk management.
The aim of the book is to provide an overview of risk management in life insurance companies. The focus is twofold: (1) to provide a broad view of the different topics needed for risk management and (2) to provide the necessary tools and techniques to concretely apply them in practice. Much emphasis has been put on the presentation of the book so that it presents the theory in a simple but sound manner. The first chapters deal with valuation concepts, which are defined and analysed; the emphasis is on understanding the risks in the corresponding assets and liabilities, such as bonds, shares and also insurance liabilities. In the following chapters, risk appetite and key insurance processes and their risks are presented and analysed. This more general treatment is followed by chapters describing asset risks, insurance risks and operational risks; the application of models and the reporting of the corresponding risks are central. Next, the risks of insurance companies and of special insurance products are examined. The aim is to show the intrinsic risks in some particular products and the way they can be analysed. The book finishes with emerging risks and risk management from a regulatory point of view; the standard model of Solvency II and the Swiss Solvency Test are analysed and explained. The book has several mathematical appendices which deal with the basic mathematical tools, e.g. probability theory, stochastic processes, Markov chains and a stochastic life insurance model based on Markov chains. Moreover, the appendices look at the mathematical formulation of abstract valuation concepts such as replicating portfolios, state space deflators, arbitrage-free pricing and the valuation of unit-linked products with guarantees. The various concepts in the book are supported by tables and figures.
- Unique blend of asymptotic theory and small-sample practice through simulation experiments and data analysis.
- Novel reproducing kernel Hilbert space methods for the analysis of smoothing splines and local polynomials, leading to uniform error bounds and honest confidence bands for the mean function using smoothing splines.
- Exhaustive exposition of algorithms, including the Kalman filter, for the computation of smoothing splines of arbitrary order.
The estimation and validation of the Basel II risk parameters PD (default probability), LGD (loss given default), and EAD (exposure at default) is an important problem in banking practice. These parameters are used on the one hand as inputs to credit portfolio models and loan pricing frameworks, and on the other to compute regulatory capital according to the new Basel rules. This book covers the state of the art in designing and validating rating systems and default probability estimations. Furthermore, it presents techniques to estimate LGD and EAD and includes a chapter on stress testing of the Basel II risk parameters. The second edition is extended by three chapters explaining how the Basel II risk parameters can be used for building a framework for risk-adjusted pricing and risk management of loans.
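The three parameters combine in the standard expected-loss identity EL = PD x LGD x EAD, which underlies both credit portfolio models and regulatory capital calculations. A hedged sketch follows; the function name and the figures are invented for the example, not taken from the book.

```python
# Hedged illustration (not the book's code): the standard expected-loss
# identity EL = PD * LGD * EAD tying the three Basel II risk
# parameters together.

def expected_loss(pd, lgd, ead):
    """Expected loss from default probability (PD), loss given default
    (LGD), and exposure at default (EAD)."""
    return pd * lgd * ead

# A loan with a 2% default probability, 45% loss given default,
# and an exposure of 1,000,000 currency units:
print(round(expected_loss(0.02, 0.45, 1_000_000), 2))  # 9000.0
```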
This book will interest and assist people who are dealing with the problems of prediction of time series in higher education and research. It will greatly assist people who apply time series theory to practical problems in their work and also serve as a textbook for postgraduate students in statistics, economics and related subjects.
This book is the first volume of the International Series in Economic Modeling, a series designed to summarize current issues and procedures in applied modeling within various fields of economics and to offer new or alternative approaches to prevailing problems. In selecting the subject area for the first volume, we were attracted by the area to which applied modeling efforts are increasingly being drawn, regional economics and its associated subfields. Applied modeling is a broad rubric even when the focus is restricted to econometric modeling issues. Regional econometric modeling has posted a record of rapid growth during the last two decades and has become an established field of research and application. Econometric models of states and large urban areas have become commonplace, but the existence of such models does not signal an end to further development of regional econometric methods and models. Many issues such as structural specification, level of geographic detail, data constraints, forecasting integrity, and synthesis with other regional modeling techniques will continue to be sources of concern and will prompt further research efforts. The chapters of this volume reflect many of these issues. A brief synopsis of each contribution is provided below: Richard Weber offers an overview of regional econometric models by discussing theoretical specification, nature of variables, and ultimate usefulness of such models. For an illustration, Weber describes the specification of the econometric model of New Jersey.
Getting Started with a SIMPLIS Approach is particularly appropriate for those users who are not experts in statistics, but have a basic understanding of multivariate analysis that would allow them to use this handbook as a good first foray into LISREL. Part I introduces the topic, presents the study that serves as the background for the explanation of the subject matter, and provides the basis for Parts II and III, which, in turn, explain the process of estimation of the measurement model and the structural model, respectively. In each section, we also suggest essential literature to support the utilization of the handbook. After having read the book, readers will have acquired a basic knowledge of structural equation modeling, namely using the LISREL program, and will be prepared to continue with the learning process.
The Analytic Network Process (ANP) developed by Thomas Saaty in his work on multicriteria decision making applies network structures with dependence and feedback to complex decision making. This book is a selection of applications of ANP to economic, social and political decisions, and also to technological design. The chapters comprise contributions of scholars, consultants and people concerned about the outcome of certain important decisions who applied the Analytic Network Process to determine the best outcome for each decision from among several potential outcomes. The ANP is a methodological tool that helps to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify those judgments and derive priorities from them, and finally synthesize these diverse priorities into a single mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The ANP offers economists a considerably different approach for dealing with economic problems than the usual quantitative models. The ANP approach is based on absolute scales used to represent pairwise comparison judgments in the context of dominance with respect to a property shared by the homogeneous elements being compared: how much, or how many times more, does A dominate B with respect to property P? People answer this question using words that indicate intensity of dominance (equal, moderate, strong, very strong, and extreme), something all of us are biologically equipped to do all the time; the conversion of these words to numbers, their validation, and their extension to inhomogeneous elements form the foundation of the AHP/ANP.
Numerous applications of the ANP have been made to economic problems, among them predictions of the turn-around dates for the US economy in the early 1990s and again in 2001, whose accuracy and validity were later confirmed in the news. They were based on comparisons of mostly intangible factors rather than on financial, employment and other data and statistics.
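The pairwise comparison judgments described above are typically collected in a reciprocal matrix from which priorities are derived. The following is an illustrative sketch only, not the book's code: the row geometric mean used here is a common approximation to Saaty's principal-eigenvector priorities, and the matrix entries are invented for the example.

```python
# Illustrative sketch: priorities from a Saaty-style pairwise comparison
# matrix via normalized row geometric means (an approximation to the
# principal-eigenvector method). Entry a[i][j] answers "how many times
# more does alternative i dominate alternative j?".

import math

def priorities(matrix):
    """Normalized row geometric means of a pairwise comparison matrix."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Three alternatives; A dominates B moderately (3) and C strongly (5).
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
print([round(p, 3) for p in priorities(A)])
```

The resulting weights sum to one and preserve the dominance ordering expressed by the judgments, which is what the ANP then synthesizes across the network's clusters.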
Formal decision and evaluation models are so widespread that almost no one can pretend not to have used or suffered the consequences of one of them. This book is a guide aimed at helping the analyst to choose a model and use it consistently. A sound analysis of techniques is proposed and the presentation can be extended to most decision and evaluation models as a "decision aiding methodology."
In recent years there has been a growing interest in and concern for the development of a sound spatial statistical body of theory. This work has been undertaken by geographers, statisticians, regional scientists, econometricians, and others (e.g., sociologists). It has led to the publication of a number of books, including Cliff and Ord's Spatial Processes (1981), Bartlett's The Statistical Analysis of Spatial Pattern (1975), Ripley's Spatial Statistics (1981), Paelinck and Klaassen's Spatial Econometrics (1979), Ahuja and Schachter's Pattern Models (1983), and Upton and Fingleton's Spatial Data Analysis by Example (1985). The first of these books presents a useful introduction to the topic of spatial autocorrelation, focusing on autocorrelation indices and their sampling distributions. The second of these books is quite brief, but nevertheless furnishes an eloquent introduction to the relationship between spatial autoregressive and two-dimensional spectral models. Ripley's book virtually ignores autoregressive and trend surface modelling, and focuses almost solely on point pattern analysis. Paelinck and Klaassen's book closely follows an econometric textbook format, and as a result overlooks much of the important material necessary for successful spatial data analysis. It almost exclusively addresses distance and gravity models, with some treatment of autoregressive modelling. Pattern Models supplements Cliff and Ord's book, which in combination provide a good introduction to spatial data analysis. Its basic limitation is a preoccupation with the geometry of planar patterns, and hence is very narrow in scope.
This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.
Both in insurance and in finance applications, questions involving extremal events (such as large insurance claims, large fluctuations in financial data, stock market shocks, risk management, ...) play an increasingly important role. This book sets out to bridge the gap between the existing theory and practical applications both from a probabilistic as well as from a statistical point of view. Whatever new theory is presented is always motivated by relevant real-life examples. The numerous illustrations and examples, and the extensive bibliography make this book an ideal reference text for students, teachers and users in the industry of extremal event methodology.
Survival analysis is a highly active area of research with applications spanning the physical, engineering, biological, and social sciences. In addition to statisticians and biostatisticians, researchers in this area include epidemiologists, reliability engineers, demographers and economists. Economists know survival analysis by the name of duration analysis and the analysis of transition data. We attempted to bring together leading researchers, with a common interest in developing methodology in survival analysis, at the NATO Advanced Research Workshop. The research works collected in this volume are based on the presentations at the Workshop. Analysis of survival experiments is complicated by issues of censoring, where only partial observation of an individual's life length is available, and left truncation, where individuals enter the study group only if their life lengths exceed a given threshold time. Application of the theory of counting processes to survival analysis, as developed by the Scandinavian School, has allowed for substantial advances in the procedures for analyzing such experiments. The increased use of computer-intensive solutions to inference problems in survival analysis, in both the classical and Bayesian settings, is also evident throughout the volume. Several areas of research have received special attention in the volume.
European Regional Growth is the result of three major influences. First, the ongoing integration of the European regional economies and the need to understand what this means for European economic and social cohesion. Second, the development of geo-economic theories. Third, the development of techniques of spatial data analysis, simulation, data visualization and spatial econometrics. The outcome is a collection of chapters that apply these methods, motivated by a variety of theoretical positions. The book provides powerful and detailed analyses of the causes of income, productivity and employment variations across Europe's regions, and insights into their future prospects.
Economists and psychologists have, on the whole, exhibited sharply different perspectives on the elicitation of preferences. Economists, who have made preference the central primitive in their thinking about human behavior, have for the most part rejected elicitation and have instead sought to infer preferences from observations of choice behavior. Psychologists, who have tended to think of preference as a context-determined subjective construct, have embraced elicitation as their dominant approach to measurement. This volume, based on a symposium organized by Daniel McFadden at the University of California at Berkeley, provides a provocative and constructive engagement between economists and psychologists on the elicitation of preferences.
Markov chains have increasingly become a useful way of capturing the stochastic nature of many economic and financial variables. Although hidden Markov processes have been widely employed for some time in many engineering applications, e.g. speech recognition, their effectiveness has now been recognized in areas of social science research as well. The main aim of Hidden Markov Models: Applications to Financial Economics is to make such techniques available to more researchers in financial economics. As such, we cover only the necessary theoretical aspects in each chapter while focusing on real-life applications using contemporary data, mainly from the OECD group of countries. The underlying assumption is that researchers in financial economics would be familiar with such applications, although the empirical techniques would be more traditional econometrics. Keeping the applications at this more familiar level, we focus on the methodology based on hidden Markov processes. This will, we believe, help the reader develop a more in-depth understanding of the modeling issues, thereby benefiting their future research.
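As a minimal sketch of the kind of machinery involved (an assumed example, not taken from the book), the forward algorithm computes the likelihood of an observation sequence under a hidden Markov model; the two-regime parameters below are invented for illustration.

```python
# Minimal sketch: the forward algorithm for a discrete hidden Markov
# model, computing the likelihood of an observation sequence.

def forward(pi, A, B, obs):
    """Likelihood of obs under an HMM with initial probabilities pi,
    transition matrix A, and emission matrix B."""
    n = len(pi)
    # Initialization: probability of starting in each state and
    # emitting the first observation.
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction: propagate state probabilities through transitions
    # and weight by each new observation's emission probability.
    for o in obs[1:]:
        alpha = [
            sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
            for j in range(n)
        ]
    return sum(alpha)

# Two hidden regimes (say, "calm"/"volatile") and two observed symbols.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
print(round(forward(pi, A, B, [0, 1, 0]), 6))  # 0.10893
```

In financial applications the hidden states would represent unobserved regimes and the emissions, for example, discretized returns; estimation of A and B is what the hidden Markov machinery adds over a plain Markov chain.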
After Karl Joreskog's first presentation in 1970, Structural Equation Modelling or SEM has become a main statistical tool in many fields of science. It is the standard approach of factor analytic and causal modelling in such diverse fields as sociology, education, psychology, economics, management and medical sciences. In addition to an extension of its application area, Structural Equation Modelling also features a continual renewal and extension of its theoretical background. The sixteen contributions to this book, written by experts from many countries, present important new developments and interesting applications in Structural Equation Modelling. The book addresses methodologists and statisticians professionally dealing with Structural Equation Modelling to enhance their knowledge of the type of models covered and the technical problems involved in their formulation. In addition, the book offers applied researchers new ideas about the use of Structural Equation Modelling in solving their problems. Finally, methodologists, mathematicians and applied researchers alike are addressed, who simply want to update their knowledge of recent approaches in data analysis and mathematical modelling.
Industrial Price, Quantity, and Productivity Indices: The Micro-Economic Theory and an Application gives a comprehensive account of the micro-economic foundations of industrial price, quantity, and productivity indices. The various results available from the literature have been brought together into a consistent framework, based upon modern duality theory. This integration also made it possible to generalize several of these results. Thus, this book will be an important resource for theoretically as well as empirically-oriented researchers who seek to analyse economic problems with the help of index numbers. Although this book's emphasis is on micro-economic theory, it is also intended as a practical guide. A full chapter is therefore devoted to an empirical application. Three different approaches are pursued: a straightforward empirical approach, a non-parametric estimation approach, and a parametric estimation approach. As well as illustrating some of the more important concepts explored in this book, and showing to what extent different computational approaches lead to different outcomes for the same measures, this chapter also makes a powerful case for the use of enterprise micro-data in economic research.
The motive force of human activity that propels the stream of progress is here caught at its source, in its most modest, material expressions. The mechanism of the passions acting as determinant in these low spheres is less complex and can therefore be observed with greater precision. All one need do is leave the picture its clear, calm colors and its simple design. Gradually, as that search for material well-being by which man is tormented grows and expands, it also tends to rise and pursue an ascendant course through the social classes. In 'I Malavoglia' it is still only the struggle for material needs. Once these needs are satisfied, the search turns into greed for riches and will be embedded in a bourgeois type . . . Giovanni Verga, from the Introduction to The House by the Medlar Tree (I Malavoglia) Motivation In the past decade, many less developed countries have undertaken structural adjustment programs with the hope of breaking the vicious circle of the depression that enveloped them during the 1980s and of loosening the suffocating grip of the debt crisis. Nearly always, macroeconomic stabilization implies a reduction of public spending and, consequently, a reduction of subsidies on wage goods and food production. Other macro policies, such as tariff elimination and exchange rate alignment, alter relative prices and may have significant effects on the level and distribution of income. Today, poverty and inequality are perceived as economic threats as a result of globalization and unbalanced market expansion.
Given the magnitude of currency speculation and sports gambling, it is surprising that the literature contains mostly negative forecasting results. Majority opinion still holds that short-term fluctuations in financial markets follow a random walk. In this non-random walk through financial and sports gambling markets, parallels are drawn between modeling short-term currency movements and modeling outcomes of athletic encounters. The forecasting concepts and methodologies are identical; only the variables change names. If, in fact, these markets are driven by non-random-walk mechanisms, there must be some explanation for the negative forecasting results. The Analysis of Sports Forecasting: Modeling Parallels Between Sports Gambling and Financial Markets examines this issue.
JEAN-FRANÇOIS MERTENS This book presents a systematic exposition of the use of game theoretic methods in general equilibrium analysis. Clearly the first such use was by Arrow and Debreu, with the "birth" of general equilibrium theory itself, in using Nash's existence theorem (or a generalization) to prove the existence of a competitive equilibrium. But this use appeared possibly to be merely technical, borrowing some tools for proving a theorem. This book stresses the later contributions, where game theoretic concepts were used as such, to explain various aspects of the general equilibrium model. But clearly, each of those later approaches also provides per se a game theoretic proof of the existence of competitive equilibrium. Part A deals with the first such approach: the equality between the set of competitive equilibria of a perfectly competitive (i.e., every trader has negligible market power) economy and the core of the corresponding cooperative game.
Non-Parametric Statistical Diagnosis
Coalition Formation and Social Choice provides a unified and comprehensive study of coalition formation and collective decision-making in committees. It discusses the main existing theories including the size principle, conflict of interest theory, dominant player theory, policy distance theory and power excess theory. In addition, the book offers new theories of coalition formation in which the endogenous formation of preferences for coalitions is basic. Both simple game theory and social choice theory are extensively applied in the treatment of the theories. This combined application not only leads to new theories but also offers a new and fresh perspective on coalition formation and collective decision-making in committees. The book covers the fundamental concepts and results of social choice theory including Arrow's Impossibility Theorem. Furthermore, it gives a coherent treatment of the theory of simple games. Besides more traditional topics in simple game theory like power indices, it also introduces new aspects of simple games such as the Chow parameter, the Chow vector and the notion of similar games.