This is the second of three volumes containing edited versions of papers and a commentary presented at invited symposium sessions of the Ninth World Congress of the Econometric Society, held in London in August 2005. The papers summarize and interpret key developments, and they discuss future directions for a wide variety of topics in economics and econometrics. The papers cover both theory and applications. Written by leading specialists in their fields, these volumes provide a unique survey of progress in the discipline.
"The purpose of models is not to fit the data but to sharpen the questions." (S. Karlin, 11th R. A. Fisher Memorial Lecture, Royal Society, 20 April 1983.) We are proud to offer this volume in honour of the remarkable career of the Father of Spatial Econometrics, Professor Jean Paelinck, presently of the Tinbergen Institute, Rotterdam. Paelinck is not one to model solely for the sake of modelling, and the above quotation nicely captures his unceasing quest for the best question for which an answer is needed. His FLEUR model has sharpened many spatial economics and spatial econometrics questions! Jean Paelinck is arguably the founder of modern spatial econometrics, penning the seminal introductory monograph on this topic, Spatial Econometrics, with Klaassen in 1979. In the General Address to the Dutch Statistical Association, on May 2, 1974, in Tilburg, "he coined the term [spatial econometrics] to designate a growing body of the regional science literature that dealt primarily with estimation and testing problems encountered in the implementation of multiregional econometric models" (Anselin, 1988, p. 7); he had already introduced this idea in his introductory report to the 1966 Annual Meeting of the Association de Science Régionale de Langue Française.
Globalization affects regional economies in a broad spectrum of aspects, from labor market conditions and development policies to climate change. This volume, written by an international cast of eminent regional scientists, provides new tools for analyzing the enormous changes in regional economies due to globalization. It offers timely conceptual refinements for regional analysis.
Nonlinear Econometric Modeling in Time Series presents the more recent literature on nonlinear time series. Specific topics covered with respect to nonlinearity include cointegration tests, risk-related asymmetries, structural breaks and outliers, Bayesian analysis with a threshold, consistency and asymptotic normality, asymptotic inference and error-correction models. With a world-class panel of contributors, this volume addresses topics with major applications for fields such as foreign-exchange markets and interest rate analysis. Eleventh in this series of international symposia, this volume is also part of the European Conference Series in Quantitative Economics and Econometrics (EC)2.
The aim of the book is to provide an overview of risk management in life insurance companies. The focus is twofold: (1) to provide a broad view of the different topics needed for risk management and (2) to provide the necessary tools and techniques to concretely apply them in practice. Much emphasis has been put on the presentation, so that the book conveys the theory in a simple but sound manner. The first chapters deal with valuation concepts, which are defined and analysed; the emphasis is on understanding the risks in corresponding assets and liabilities such as bonds, shares and also insurance liabilities. In the following chapters, risk appetite and key insurance processes and their risks are presented and analysed. This more general treatment is followed by chapters describing asset risks, insurance risks and operational risks; the application of models and the reporting of the corresponding risks are central. Next, the risks of insurance companies and of special insurance products are looked at. The aim is to show the intrinsic risks in some particular products and the way they can be analysed. The book finishes with emerging risks and risk management from a regulatory point of view; the standard model of Solvency II and the Swiss Solvency Test are analysed and explained. The book has several mathematical appendices which deal with the basic mathematical tools, e.g. probability theory, stochastic processes, Markov chains and a stochastic life insurance model based on Markov chains. Moreover, the appendices look at the mathematical formulation of abstract valuation concepts such as replicating portfolios, state space deflators, arbitrage-free pricing and the valuation of unit-linked products with guarantees. The various concepts in the book are supported by tables and figures.
E. Dijkgraaf and R. H. J. M. Gradus 1.1 Introduction In 2004 Elbert Dijkgraaf finished a PhD thesis, 'Regulating the Dutch waste market', at the Erasmus University Rotterdam. It was striking that little had been published about the waste market, although it is a very important sector from an economic and environmental viewpoint. In 2006 we participated in a very interesting conference on Local Government Reform: privatization and public-private collaboration, organized in Barcelona by Germà Bel. It was interesting to notice that researchers from Spain, the Scandinavian countries, the UK and the USA were studying this issue as well. From this came the idea of publishing a book about the waste market. Because of its legal framework, we focus on Europe. In this chapter we give an introduction to this book. In the next paragraph we present a short overview of the waste collection market. Since 1960 the importance of the waste sector has increased substantially, both in the waste streams and in the costs of waste collection and treatment. Furthermore, we discuss policy measures to deal with these increases and give an overview of the different measures in European countries. In the last paragraph we present the different chapters of our book. 1.2 Empirical Update of the Waste Collection Market The Dutch case provides a nice example of why studying the waste market is interesting from an economic point of view.
Spatial Econometrics is a rapidly evolving field born from the joint efforts of economists, statisticians, econometricians and regional scientists. The book provides the reader with a broad view of the topic by including both methodological and application papers. Indeed the application papers relate to a number of diverse scientific fields ranging from hedonic models of house pricing to demography, from health care to regional economics, from the analysis of R&D spillovers to the study of retail market spatial characteristics. Particular emphasis is given to regional economic applications of spatial econometrics methods with a number of contributions specifically focused on the spatial concentration of economic activities and agglomeration, regional paths of economic growth, regional convergence of income and productivity and the evolution of regional employment. Most of the papers appearing in this book were solicited from the International Workshop on Spatial Econometrics and Statistics held in Rome (Italy) in 2006.
This Festschrift is dedicated to Goetz Trenkler on the occasion of his 65th birthday. As can be seen from the long list of contributions, Goetz has had and still has an enormous range of interests, and colleagues to share these interests with. He is a leading expert in linear models with a particular focus on matrix algebra in its relation to statistics. He has published in almost all major statistics and matrix theory journals. His research activities also include other areas (like nonparametrics, statistics and sports, combination of forecasts and magic squares, just to mention a few). Goetz Trenkler was born in Dresden in 1943. After his school years in East Germany and West Berlin, he obtained a Diploma in Mathematics from the Free University of Berlin (1970), where he also discovered his interest in Mathematical Statistics. In 1973, he completed his Ph.D. with a thesis titled: On a distance-generating function of probability measures. He then moved on to the University of Hannover to become Lecturer and to write a habilitation thesis (submitted 1979) on alternatives to the Ordinary Least Squares estimator in the Linear Regression Model, a topic that would become his predominant field of research in the years to come.
Environmental risk directly affects the financial stability of banks, since they bear the financial consequences of the loss of liquidity of the entities to which they lend, as well as of the financial penalties imposed for failure to comply with regulations and for actions harmful to the natural environment. This book explores the impact of environmental risk on the banking sector and analyzes strategies to mitigate this risk with a special emphasis on the role of modelling. It argues that environmental risk modelling allows banks to estimate the patterns and consequences of environmental risk on their operations, and to take measures within the context of asset and liability management to minimize the likelihood of losses. An important role here is played by the environmental risk modelling methodology as well as the software and mathematical and econometric models used. It examines banks' responses to macroprudential risk, particularly from the point of view of their adaptation strategies; the mechanisms of its spread; risk management and modelling; and sustainable business models. It introduces the basic concepts, definitions, and regulations concerning this type of risk, within the context of its influence on the banking industry. The book is primarily based on a quantitative and qualitative approach and proposes the delivery of a new methodology of environmental risk management and modelling in the banking sector. As such, it will appeal to researchers, scholars, and students of environmental economics, finance and banking, sociology, law, and political sciences.
Complex dynamics constitute a growing and increasingly important area as they offer a strong potential to explain and formalize natural, physical, financial and economic phenomena. This book pursues the ambitious goal to bring together an extensive body of knowledge regarding complex dynamics from various academic disciplines. Beyond its focus on economics and finance, including for instance the evolution of macroeconomic growth models towards nonlinear structures as well as signal processing applications to stock markets, fundamental parts of the book are devoted to the use of nonlinear dynamics in mathematics, statistics, signal theory and processing. Numerous examples and applications, almost 700 illustrations and numerical simulations based on the use of Matlab make the book an essential reference for researchers and students from many different disciplines who are interested in the nonlinear field. An appendix recapitulates the basic mathematical concepts required to use the book.
In this book, Professor Thomson and Professor Lensberg extend the Nash (1950) treatment of the bargaining problem to the situation where the number of bargainers may vary. The authors formulate axioms to specify how solutions should respond to such changes, and provide new characterizations of all the major solutions as well as generalizations of these solutions. The book also contains several other comparative studies of solutions in the context of a variable number of agents. Much of the theory of bargaining can be rewritten within this context. The pre-eminence of the three solutions at the core of the classical theory is confirmed. These are the solution introduced by Nash (1950) and two solutions axiomatized in the 1970s (the Kalai-Smorodinsky and egalitarian solutions).
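For readers new to the classical setup, the benchmark the book generalizes can be stated in one line (standard textbook material, not the authors' notation): for a feasible set $S \subset \mathbb{R}^n$ and disagreement point $d$, the Nash (1950) solution selects the feasible point that maximizes the product of the agents' gains over $d$:

$$
N(S,d) = \arg\max_{x \in S,\; x \ge d} \; \prod_{i=1}^{n} (x_i - d_i).
$$

The Kalai-Smorodinsky solution replaces this product criterion with proportionality to the agents' ideal payoffs, and the egalitarian solution selects the maximal point of equal gains.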
Many optimization questions arise in economics and finance; an important example is society's choice of the optimum state of the economy (the social choice problem). Optimization in Economics and Finance extends and improves the usual optimization techniques, in a form that may be adopted for modeling social choice problems. Problems discussed include: when is an optimum reached; when is it unique; relaxation of the conventional convex (or concave) assumptions on an economic model; associated mathematical concepts such as invex and quasimax; multiobjective optimal control models; and related computational methods and programs. These techniques are applied to economic growth models (including small stochastic perturbations), finance and financial investment models (and the interaction between financial and production variables), modeling sustainability over long time horizons, boundary (transversality) conditions, and models with several conflicting objectives. Although the applications are general and illustrative, the models in this book provide examples of possible models for a society's social choice for an allocation that maximizes welfare and utilization of resources. As well as using existing computer programs for optimization of models, a new computer program, named SCOM, is presented in this book for computing social choice models by optimal control.
WINNER OF THE 2007 DEGROOT PRIZE The prominence of finite mixture modelling is greater than ever. Many important statistical topics like clustering data, outlier treatment, or dealing with unobserved heterogeneity involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models. For more than a hundred years, since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools have emerged for dealing with these models, which combine a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods. It is the first time that the Bayesian perspective of finite mixture modelling is systematically presented in book form. It is argued that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, the author reviews several frequentist techniques, especially for selecting the number of components of a finite mixture model, and discusses some of their shortcomings compared to the Bayesian approach. The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community. This includes not only statisticians, but also biologists, economists, engineers, financial agents, market researchers, medical researchers or any other frequent user of statistical models. This book should help newcomers to the field to understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they could be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading this book. The presentation is rather informal without abandoning mathematical correctness. Prior knowledge of Bayesian inference and Monte Carlo simulation is useful but not required.
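The book's machinery is Bayesian MCMC, but the flavor of the estimation problem Pearson posed can be conveyed with a classical EM fit of a two-component normal mixture. The following is a minimal sketch on synthetic data (not code from the book), recovering the same five parameters Pearson estimated by the method of moments:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
# Synthetic data: mixture of N(-2, 1) and N(3, 1.5^2), weight 0.4 on the first.
x = np.concatenate([rng.normal(-2, 1.0, 400), rng.normal(3, 1.5, 600)])

# Pearson's five parameters: two means, two standard deviations, one weight.
w, mu1, mu2, s1, s2 = 0.5, x.min(), x.max(), x.std(), x.std()
for _ in range(200):
    # E-step: posterior probability that each observation belongs to component 1.
    p1 = w * norm.pdf(x, mu1, s1)
    p2 = (1.0 - w) * norm.pdf(x, mu2, s2)
    r = p1 / (p1 + p2)
    # M-step: responsibility-weighted updates of all five parameters.
    w = r.mean()
    mu1, mu2 = np.average(x, weights=r), np.average(x, weights=1.0 - r)
    s1 = np.sqrt(np.average((x - mu1) ** 2, weights=r))
    s2 = np.sqrt(np.average((x - mu2) ** 2, weights=1.0 - r))

print(f"weight={w:.2f}, means=({mu1:.2f}, {mu2:.2f}), sds=({s1:.2f}, {s2:.2f})")
```

A Bayesian treatment of the same model, as developed in the book, would replace these point updates with draws from full conditional distributions (Gibbs sampling), treating the component labels as latent variables.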
"Introductory Econometrics: Intuition, Proof, and Practice"
attempts to distill econometrics into a form that preserves its
essence, but that is acceptable--and even appealing--to the
student's intellectual palate. This book insists on rigor when it
is essential, but it emphasizes intuition and seizes upon
entertainment wherever possible.
This book provides an introduction to the technical background of unit root testing, one of the most heavily researched areas in econometrics over the last twenty years. Starting from an elementary understanding of probability and time series, it develops the key concepts necessary to understand the structure of random walks and Brownian motion, and their role in tests for a unit root. The techniques are illustrated with worked examples, data and programs available on the book's website, which includes more numerical and theoretical examples.
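As a minimal illustration of the ingredients involved (a sketch, not material from the book or its website), one can simulate a random walk and compute the Dickey-Fuller t-statistic from the regression $\Delta y_t = \alpha + \gamma y_{t-1} + \varepsilon_t$, where a unit root corresponds to $\gamma = 0$:

```python
import numpy as np

rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(size=500))          # random walk: y_t = y_{t-1} + e_t

# Dickey-Fuller regression: dy_t = alpha + gamma * y_{t-1} + e_t
dy, ylag = np.diff(y), y[:-1]
X = np.column_stack([np.ones_like(ylag), ylag])
beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
resid = dy - X @ beta
s2 = resid @ resid / (len(dy) - X.shape[1])  # residual variance
se_gamma = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
t_gamma = beta[1] / se_gamma                 # the Dickey-Fuller t-statistic

# Under the unit-root null this statistic follows the Dickey-Fuller (not the
# Student-t) distribution; with a constant, the 5% critical value is about -2.86.
print(f"gamma = {beta[1]:.4f}, DF t-statistic = {t_gamma:.2f}")
```

Because y is a pure random walk here, the statistic will typically fall above the critical value, so the unit-root null is (correctly) not rejected.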
The approach to many problems in economic analysis has changed drastically with the development and dissemination of new and more efficient computational techniques. Computational Economic Systems: Models, Methods & Econometrics presents a selection of papers illustrating the use of new computational methods and computing techniques to solve economic problems. Part I of the volume consists of papers which focus on modelling economic systems, presenting computational methods to investigate the evolution of behavior of economic agents, techniques to solve complex inventory models on a parallel computer and an original approach for the construction and solution of multicriteria models involving logical conditions. Contributions to Part II concern new computational approaches to economic problems. We find an application of wavelets to outlier detection. New estimation algorithms are presented, one concerning seemingly unrelated regression models, a second one on nonlinear rational expectation models and a third one dealing with switching GARCH estimation. Three contributions contain original approaches for the solution of nonlinear rational expectation models.
Survival analysis is a highly active area of research with applications spanning the physical, engineering, biological, and social sciences. In addition to statisticians and biostatisticians, researchers in this area include epidemiologists, reliability engineers, demographers and economists. Economists know survival analysis by the names of duration analysis and the analysis of transition data. We attempted to bring together leading researchers, with a common interest in developing methodology in survival analysis, at the NATO Advanced Research Workshop. The research works collected in this volume are based on the presentations at the Workshop. Analysis of survival experiments is complicated by issues of censoring, where only partial observation of an individual's life length is available, and left truncation, where individuals enter the study group only if their life lengths exceed a given threshold time. Application of the theory of counting processes to survival analysis, as developed by the Scandinavian School, has allowed for substantial advances in the procedures for analyzing such experiments. The increased use of computer-intensive solutions to inference problems in survival analysis, in both the classical and Bayesian settings, is also evident throughout the volume. Several areas of research have received special attention in the volume.
New efficiency theory refers to the various parametric and semi-parametric methods of estimating production and cost frontiers, which include data envelopment analysis (DEA) with its diverse applications in management science and operations research. This monograph develops and generalizes the new efficiency theory by highlighting the interface between economic theory and operations research. Some of the outstanding features of this monograph are: (1) integrating the theory of firm efficiency and industry equilibrium, (2) emphasizing growth efficiency in a dynamic setting, (3) incorporating uncertainty of market demand and prices, and (4) analyzing the implications of group efficiency through shared investments. The applications discuss in some detail the growth and decline of the US computer industry, and the relative performance of mutual fund portfolios.
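For orientation, the input-oriented CCR envelopment program at the core of basic DEA can be written as follows (a standard formulation, not necessarily the monograph's notation): for decision-making unit $o$ with input vector $x_o$ and output vector $y_o$, its efficiency score $\theta^*$ solves

$$
\min_{\theta,\,\lambda}\; \theta
\quad \text{s.t.} \quad
\sum_{j} \lambda_j x_j \le \theta\, x_o,
\qquad
\sum_{j} \lambda_j y_j \ge y_o,
\qquad
\lambda_j \ge 0,
$$

and $\theta^* = 1$ indicates that unit $o$ lies on the estimated production frontier.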
This book proposes a new methodology for the selection of one (model) from among a set of alternative econometric models. Let us recall that a model is an abstract representation of reality which brings out what is relevant to a particular economic issue. An econometric model is also an analytical characterization of the joint probability distribution of some random variables of interest, which yields some information on how the actual economy works. This information will be useful only if it is accurate and precise; that is, the information must be far from ambiguous and close to what we observe in the real world. Thus, model selection should be performed on the basis of statistics which summarize the degree of accuracy and precision of each model. A model is accurate if it predicts right; it is precise if it produces tight confidence intervals. A first general approach to model selection includes those procedures based on both characteristics, precision and accuracy. A particularly interesting example of this approach is that of Hildebrand, Laing and Rosenthal (1980). See also Hendry and Richard (1982). A second general approach includes those procedures that use only one of the two dimensions to discriminate among models. In general, most of the tests we are going to examine correspond to this category.
It is unlikely that any frontier of economics/econometrics is being pushed faster and further than that of computational techniques. The computer has become a tool for performing, as well as an environment in which to perform, economics and econometrics, taking over where theory bogs down and allowing at least approximate answers to questions that defy closed mathematical or analytical solutions. Tasks may now be attempted that were hitherto beyond human potential, and all the forces available can now be marshalled efficiently, leading to the achievement of desired goals. Computational Techniques for Econometrics and Economic Analysis is a collection of recent studies which exemplify all these elements, demonstrating the power that the computer brings to economic analysts. The book is divided into four parts: 1 -- the computer and econometric methods; 2 -- the computer and economic analysis; 3 -- computational techniques for econometrics; and 4 -- the computer and econometric studies.
This book proposes a uniform logic and probabilistic (LP) approach to risk estimation and analysis in engineering and economics. It covers the methodological and theoretical basis of risk management at the design, test, and operation stages of economic, banking, and engineering systems with groups of incompatible events (GIE). This edition includes new chapters providing a detailed treatment of scenario logic and probabilistic models for revealing bribes. It also contains clear definitions and notations, revised sections and chapters, an extended list of references, and a new subject index, as well as more than a hundred illustrations and tables which motivate the presentation.
During 1985-86, the acquisitions editor for the humanities and social sciences division of Kluwer Academic Publishers in the Netherlands visited the University of Florida (where I was also visiting while on sabbatical leave from Wilfrid Laurier University as the McKethan-Matherly Senior Research Fellow) to discuss the publishing plans of the faculty. He expressed a keen interest in publishing the proceedings of the conference of the Canadian Econometric Study Group (CESG) that was to be held the following year at WLU. This volume is the end product of his interest, endurance, and persistence. But for his persistence I would have given up on the project. Most of the papers (though not all) included in this volume are based on presentations at CESG conferences. In some cases scholars were invited to contribute to this volume where their research complemented that presented at these conferences, even though they were not conference participants. Since papers selected for presentation at the CESG conferences are generally the finished product of scholarly research and often under submission to refereed journals, it was not possible to publish the conference proceedings in their entirety. Accordingly it was decided, in consultation with the publisher, to invite a select list of authors to submit significant extensions of the papers they presented at the CESG conferences for inclusion in this volume. The editor wishes to express gratitude to all those authors who submitted their papers for evaluation by anonymous referees and for making revisions to conform to our editorial process.
Covers applications to risky assets traded on the markets for funds, fixed-income products and electricity derivatives.
Econometrics as an applied discipline attempts to use information in a most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and the minimum discrepancy are useful in several areas of statistical inference, e.g., Bayesian estimation, expected maximum likelihood principle, the fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: A critical survey of the uses of information theory in economics and econometrics; An integration of applied information theory and economic efficiency analysis; The development of a new economic hypothesis relating information theory to economic growth models; New lines of research are emphasized.
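For reference, the two quantities on which most of this apparatus rests are standard (the definitions below are textbook material, not notation specific to this volume): the Shannon entropy of a distribution $p$ and the mutual information between random variables $X$ and $Y$,

$$
H(p) = -\sum_i p_i \log p_i,
\qquad
I(X;Y) = \sum_{x,y} p(x,y)\, \log \frac{p(x,y)}{p(x)\,p(y)}.
$$

The maximum entropy principle then selects, among all distributions consistent with given moment constraints, the one that maximizes $H(p)$.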
Gerald P. Dwyer, Jr. and R. W. Hafer The articles and commentaries included in this volume were presented at the Federal Reserve Bank of St. Louis' thirteenth annual economic policy conference, held on October 21-22, 1988. The conference focused on the behavior of asset market prices, a topic of increasing interest to both the popular press and academic journals as the bull market of the 1980s continued. The events that transpired during October 1987, both in the United States and abroad, provide an informative setting to test alternative theories. In assembling the papers presented during this conference, we asked the authors to explore the issue of asset pricing and financial market behavior from several vantages. Was the crash evidence of the bursting of a speculative bubble? Do we know enough about the workings of asset markets to hazard an intelligent guess why they dropped so dramatically in such a brief time? Do we know enough to propose regulatory changes that will prevent any such occurrence in the future, or do we want to even if we can? We think that the articles and commentaries contained in this volume provide significant insight to inform and to answer such questions. The article by Behzad Diba surveys existing theoretical and empirical research on rational bubbles in asset prices.
You may like...
The Oxford Handbook of the Economics of… by Yann Bramoulle, Andrea Galeotti, … (Hardcover): R5,455 (Discovery Miles 54,550)
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover): R4,258 (Discovery Miles 42,580)
Applied Econometric Analysis - Emerging… by Brian W Sloboda, Yaya Sissoko (Hardcover): R5,351 (Discovery Miles 53,510)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Handbook of Econometrics, Volume 6B by James J. Heckman, Edward Leamer (Hardcover): R3,274 (Discovery Miles 32,740)
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover): R3,286 (Discovery Miles 32,860)
The Oxford Handbook of Applied Bayesian… by Anthony O'Hagan, Mike West (Hardcover): R4,188 (Discovery Miles 41,880)