The Analytic Network Process (ANP), developed by Thomas Saaty in his work on multicriteria decision making, applies network structures with dependence and feedback to complex decision making. This book is a selection of applications of the ANP to economic, social and political decisions, and also to technological design. The chapters comprise contributions of scholars, consultants and people concerned about the outcome of certain important decisions, who applied the Analytic Network Process to determine the best outcome for each decision from among several potential outcomes. The ANP is a methodological tool that helps to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify the judgments and derive priorities from them, and finally synthesize these diverse priorities into a single mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The ANP offers economists a considerably different approach to economic problems than the usual quantitative models. The approach is based on absolute scales used to represent pairwise comparison judgments in the context of dominance with respect to a property shared by the homogeneous elements being compared: how much, or how many times more, does A dominate B with respect to property P? People are able to answer this question by using words to indicate intensity of dominance (equal, moderate, strong, very strong, and extreme), something all of us are biologically equipped to do; the conversion of these words to numbers, their validation, and their extension to inhomogeneous elements form the foundation of the AHP/ANP. Numerous applications of the ANP have been made to economic problems, among them predictions of the turn-around dates for the US economy in the early 1990s and again in 2001, whose accuracy and validity were later confirmed in the news. These predictions were based on comparisons of mostly intangible factors rather than on financial, employment and other data and statistics.
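To make the dominance question concrete, here is a minimal sketch (a hypothetical 3x3 comparison matrix on Saaty's 1-9 scale, not an example from the book) of how pairwise judgments are turned into priorities via the principal eigenvector, the basic step the AHP/ANP builds on:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three alternatives.
# Entry [i, j] answers "how many times more does i dominate j?" on the
# 1-9 scale (1 = equal, 3 = moderate, 5 = strong, 7 = very strong, 9 = extreme),
# with reciprocals below the diagonal.
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
])

# Priorities come from the principal right eigenvector, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), compared with the
# random index RI (0.58 for n = 3); a ratio below 0.1 is usually acceptable.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print("priorities:", np.round(w, 3), "consistency ratio:", round(ci / 0.58, 3))
```

In the full ANP such local priority vectors are assembled into a supermatrix whose limit yields the overall synthesis across clusters with dependence and feedback.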
The small sample properties of estimators and tests are frequently too complex to be useful or are unknown. Much econometric theory is therefore developed for very large or asymptotic samples where it is assumed that the behaviour of estimators and tests will adequately represent their properties in small samples. Refined asymptotic methods adopt an intermediate position by providing improved approximations to small sample behaviour using asymptotic expansions. Dedicated to the memory of Michael Magdalinos, whose work is a major contribution to this area, this book contains chapters directly concerned with refined asymptotic methods. In addition, there are chapters focussing on new asymptotic results; the exploration through simulation of the small sample behaviour of estimators and tests in panel data models; and improvements in methodology. With contributions from leading econometricians, this collection will be essential reading for researchers and graduate students concerned with the use of asymptotic methods in econometric analysis.
This book provides practical, research-based advice on how to conduct high-quality stated choice studies. It covers every aspect of the topic, from planning and writing the survey, to analyzing results, to evaluating quality. There is no other book on the market today that so thoroughly addresses the methodology of stated choice. Chapters are written by top-notch academics and practitioners in an accessible style, offering practical, tough advice.
One cannot exaggerate the importance of estimating how international trade responds to changes in income and prices. But there is a tension between whether one should use models that fit the data but contradict certain aspects of the underlying theory, or models that fit the theory but contradict certain aspects of the data. The essays in Estimating Trade Elasticities offer one practical approach to dealing with this tension. The analysis starts with the practical implications of optimising behaviour for estimation and follows with a re-examination of the puzzling income elasticity for US imports that three decades of studies have not resolved. The analysis then turns to the role of income and prices in determining the expansion in Asian trade, a study largely neglected in fifty years of research. With the new estimates of trade elasticities, the book examines how they assist in restoring the consistency between elasticity estimates and the world trade identity.
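As a rough illustration of what estimating such elasticities involves (a minimal sketch on simulated data, not the authors' specification), a standard import demand equation regresses log imports on log income and log relative prices, so the fitted slopes are read directly as income and price elasticities:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
T = 120  # hypothetical quarterly sample

# Simulated stand-ins for log real income, log relative import prices and
# log real imports, generated with an income elasticity of 1.8 and a price
# elasticity of -0.9 (values chosen purely for illustration).
log_y = np.cumsum(0.005 + 0.01 * rng.standard_normal(T))
log_p = np.cumsum(0.002 * rng.standard_normal(T))
log_m = 1.8 * log_y - 0.9 * log_p + 0.05 * rng.standard_normal(T)

# In a log-log specification the slope coefficients are the elasticities.
X = sm.add_constant(np.column_stack([log_y, log_p]))
fit = sm.OLS(log_m, X).fit()
print(fit.params)  # [constant, income elasticity, price elasticity]
```

The tension the book addresses arises because estimates from equations like this one often conflict with the restrictions that optimising theory and the world trade identity impose on them.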
Jean-Jacques Rousseau wrote in the Preface to his famous Discourse on Inequality that "I consider the subject of the following discourse as one of the most interesting questions philosophy can propose, and unhappily for us, one of the most thorny that philosophers can have to solve. For how shall we know the source of inequality between men, if we do not begin by knowing mankind?" (Rousseau, 1754). This citation of Rousseau appears in an article in Spanish where Dagum (2001), in whose memory this book is published, also cites Socrates, who said that the only useful knowledge is that which makes us better, and Seneca, who wrote that knowing what a straight line is, is not important if we do not know what rectitude is. These references are indeed a good illustration of Dagum's vast knowledge, which was clearly not limited to the field of Economics. For Camilo the first part of Rousseau's citation certainly justified his interest in the field of inequality, which was at the centre of his scientific preoccupations. It should however be stressed that for Camilo the second part of the citation represented a "solid argument in favor of giving macroeconomic foundations to microeconomic behavior" (Dagum, 2001). More precisely, "individualism and methodological holism complete each other in contributing to the explanation of individual and social behavior" (Dagum, 2001).
A new approach to explaining the existence of firms and markets, focusing on variability and coordination. It stands in contrast to the emphasis on transaction costs, and on monitoring and incentive structures, which are prominent in most of the modern literature in this field. This approach, called the variability approach, allows us to: show why both the need for communication and the coordination costs increase when the division of labor increases; explain why, while the firm relies on direction, the market does not; rigorously formulate the optimum divisionalization problem; better understand the relationship between technology and organization; show why the size of the firm is limited; and refine the analysis of whether the existence of a sharable input, or the presence of an external effect, leads to the emergence of a firm. The book provides a wealth of insights for students and professionals in economics, business, law and organization.
This text prepares first-year graduate students and advanced undergraduates for empirical research in economics, and also equips them for specialization in econometric theory, business, and sociology. "A Course in Econometrics" is likely to be the text most thoroughly attuned to the needs of your students. Derived from the course taught by Arthur S. Goldberger at the University of Wisconsin-Madison and at Stanford University, it is specifically designed for use over two semesters, offers students the most thorough grounding in introductory statistical inference, and offers a substantial amount of interpretive material. The text brims with insights, strikes a balance between rigor and intuition, and provokes students to form their own critical opinions. "A Course in Econometrics" thoroughly covers the fundamentals--classical regression and simultaneous equations--and offers clear and logical explorations of asymptotic theory and nonlinear regression. To accommodate students with various levels of preparation, the text opens with a thorough review of statistical concepts and methods, then proceeds to the regression model and its variants. Bold subheadings introduce and highlight key concepts throughout each chapter. Each chapter concludes with a set of exercises specifically designed to reinforce and extend the material covered. Many of the exercises include real micro-data analyses, and all are ideally suited to use as homework and test questions.
This is the second of three volumes containing edited versions of papers and a commentary presented at invited symposium sessions of the Ninth World Congress of the Econometric Society, held in London in August 2005. The papers summarize and interpret key developments, and they discuss future directions for a wide variety of topics in economics and econometrics. The papers cover both theory and applications. Written by leading specialists in their fields, these volumes provide a unique survey of progress in the discipline.
The purpose of models is not to fit the data but to sharpen the questions. S. Karlin, 11th R. A. Fisher Memorial Lecture, Royal Society, 20 April 1983. We are proud to offer this volume in honour of the remarkable career of the Father of Spatial Econometrics, Professor Jean Paelinck, presently of the Tinbergen Institute, Rotterdam. Not one to model solely for the sake of modelling, the above quotation nicely captures Professor Paelinck's unceasing quest for the best question for which an answer is needed. His FLEUR model has sharpened many spatial economics and spatial econometrics questions! Jean Paelinck, arguably, is the founder of modern spatial econometrics, penning the seminal introductory monograph on this topic, Spatial Econometrics, with Klaassen in 1979. In the General Address to the Dutch Statistical Association, on May 2, 1974, in Tilburg, "he coined the term [spatial econometrics] to designate a growing body of the regional science literature that dealt primarily with estimation and testing problems encountered in the implementation of multiregional econometric models" (Anselin, 1988, p. 7); he had already introduced this idea in his introductory report to the 1966 Annual Meeting of the Association de Science Régionale de Langue Française.
Globalization affects regional economies in a broad spectrum of aspects, from labor market conditions and development policies to climate change. This volume, written by an international cast of eminent regional scientists, provides new tools for analyzing the enormous changes in regional economies due to globalization. It offers timely conceptual refinements for regional analysis.
Nonlinear Econometric Modeling in Time Series presents the more recent literature on nonlinear time series. Specific topics covered with respect to nonlinearity include cointegration tests, risk-related asymmetries, structural breaks and outliers, Bayesian analysis with a threshold, consistency and asymptotic normality, asymptotic inference and error-correction models. With a world-class panel of contributors, this volume addresses topics with major applications for fields such as foreign-exchange markets and interest rate analysis. Eleventh in this series of international symposia, this volume is also part of the European Conference Series in Quantitative Economics and Econometrics (EC)2.
The aim of the book is to provide an overview of risk management in life insurance companies. The focus is twofold: (1) to provide a broad view of the different topics needed for risk management and (2) to provide the necessary tools and techniques to concretely apply them in practice. Much emphasis has been put on the presentation of the book so that it presents the theory in a simple but sound manner. The first chapters deal with valuation concepts, which are defined and analysed; the emphasis is on understanding the risks in corresponding assets and liabilities such as bonds, shares and also insurance liabilities. In the following chapters risk appetite and key insurance processes and their risks are presented and analysed. This more general treatment is followed by chapters describing asset risks, insurance risks and operational risks; the application of models and the reporting of the corresponding risks is central. Next, the risks of insurance companies and of special insurance products are looked at. The aim is to show the intrinsic risks in some particular products and the way they can be analysed. The book finishes with emerging risks and risk management from a regulatory point of view; the standard model of Solvency II and the Swiss Solvency Test are analysed and explained. The book has several mathematical appendices which deal with the basic mathematical tools, e.g. probability theory, stochastic processes, Markov chains and a stochastic life insurance model based on Markov chains. Moreover, the appendices look at the mathematical formulation of abstract valuation concepts such as replicating portfolios, state space deflators, arbitrage-free pricing and the valuation of unit-linked products with guarantees. The various concepts in the book are supported by tables and figures.
E. Dijkgraaf and R. H. J. M. Gradus. 1.1 Introduction. In 2004 Elbert Dijkgraaf finished a PhD thesis, 'Regulating the Dutch waste market', at the Erasmus University Rotterdam. It was striking that not much had been published about the waste market, although it is a very important sector from an economic and environmental viewpoint. In 2006 we were participants at a very interesting conference on Local Government Reform: privatization and public-private collaboration in Barcelona, organized by Germà Bel. It was interesting to notice that researchers from Spain, the Scandinavian countries, the UK and the USA were studying this issue as well. From this we brought forward the idea to publish a book about the waste market. Because of its legal framework we want to focus on Europe. In this chapter we give an introduction to this book. In the next paragraph we present a short overview of the waste collection market. Since 1960 the importance of the waste sector has increased substantially, both in the waste streams and in the costs of waste collection and treatment. Furthermore, we discuss policy measures to deal with these increases and give an overview of the different measures in European countries. In the last paragraph we present the different chapters of our book. 1.2 Empirical Update of the Waste Collection Market. The Dutch case provides a nice example of why studying the waste market is interesting from an economic point of view.
Spatial Econometrics is a rapidly evolving field born from the joint efforts of economists, statisticians, econometricians and regional scientists. The book provides the reader with a broad view of the topic by including both methodological and application papers. Indeed the application papers relate to a number of diverse scientific fields ranging from hedonic models of house pricing to demography, from health care to regional economics, from the analysis of R&D spillovers to the study of retail market spatial characteristics. Particular emphasis is given to regional economic applications of spatial econometrics methods with a number of contributions specifically focused on the spatial concentration of economic activities and agglomeration, regional paths of economic growth, regional convergence of income and productivity and the evolution of regional employment. Most of the papers appearing in this book were solicited from the International Workshop on Spatial Econometrics and Statistics held in Rome (Italy) in 2006.
In recent years there has been a growing interest in and concern for the development of a sound spatial statistical body of theory. This work has been undertaken by geographers, statisticians, regional scientists, econometricians, and others (e.g., sociologists). It has led to the publication of a number of books, including Cliff and Ord's Spatial Processes (1981), Bartlett's The Statistical Analysis of Spatial Pattern (1975), Ripley's Spatial Statistics (1981), Paelinck and Klaassen's Spatial Econometrics (1979), Ahuja and Schachter's Pattern Models (1983), and Upton and Fingleton's Spatial Data Analysis by Example (1985). The first of these books presents a useful introduction to the topic of spatial autocorrelation, focusing on autocorrelation indices and their sampling distributions. The second of these books is quite brief, but nevertheless furnishes an eloquent introduction to the relationship between spatial autoregressive and two-dimensional spectral models. Ripley's book virtually ignores autoregressive and trend surface modelling, and focuses almost solely on point pattern analysis. Paelinck and Klaassen's book closely follows an econometric textbook format, and as a result overlooks much of the important material necessary for successful spatial data analysis. It almost exclusively addresses distance and gravity models, with some treatment of autoregressive modelling. Pattern Models supplements Cliff and Ord's book, and in combination they provide a good introduction to spatial data analysis. Its basic limitation is a preoccupation with the geometry of planar patterns, which makes it very narrow in scope.
This Festschrift is dedicated to Goetz Trenkler on the occasion of his 65th birthday. As can be seen from the long list of contributions, Goetz has had and still has an enormous range of interests, and colleagues to share these interests with. He is a leading expert in linear models with a particular focus on matrix algebra in its relation to statistics. He has published in almost all major statistics and matrix theory journals. His research activities also include other areas (like nonparametrics, statistics and sports, combination of forecasts and magic squares, just to mention a few). Goetz Trenkler was born in Dresden in 1943. After his school years in East Germany and West Berlin, he obtained a Diploma in Mathematics from the Free University of Berlin (1970), where he also discovered his interest in Mathematical Statistics. In 1973, he completed his Ph.D. with a thesis titled: On a distance-generating function of probability measures. He then moved on to the University of Hannover to become Lecturer and to write a habilitation thesis (submitted 1979) on alternatives to the Ordinary Least Squares estimator in the Linear Regression Model, a topic that would become his predominant field of research in the years to come.
Complex dynamics constitute a growing and increasingly important area as they offer a strong potential to explain and formalize natural, physical, financial and economic phenomena. This book pursues the ambitious goal to bring together an extensive body of knowledge regarding complex dynamics from various academic disciplines. Beyond its focus on economics and finance, including for instance the evolution of macroeconomic growth models towards nonlinear structures as well as signal processing applications to stock markets, fundamental parts of the book are devoted to the use of nonlinear dynamics in mathematics, statistics, signal theory and processing. Numerous examples and applications, almost 700 illustrations and numerical simulations based on the use of Matlab make the book an essential reference for researchers and students from many different disciplines who are interested in the nonlinear field. An appendix recapitulates the basic mathematical concepts required to use the book.
In this book, Professor Thomson and Professor Lensberg build on the Nash (1950) treatment of the bargaining problem to consider the situation where the number of bargainers may vary. The authors formulate axioms to specify how solutions should respond to such changes, and provide new characterizations of all the major solutions as well as generalizations of these solutions. The book also contains several other comparative studies of solutions in the context of a variable number of agents. Much of the theory of bargaining can be rewritten within this context. The pre-eminence of the three solutions at the core of the classical theory is confirmed. These are the solutions introduced by Nash (1950) and two solutions axiomatized in the 1970s (the Kalai-Smorodinsky and egalitarian solutions).
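For readers who want the baseline fixed-population case made concrete, the classical two-person Nash (1950) solution picks the feasible utility pair that maximizes the product of gains over the disagreement point; the sketch below (a hypothetical linear utility frontier, not an example from the book) computes it numerically:

```python
from scipy.optimize import minimize_scalar

# Hypothetical two-person problem: the utility frontier is u2 = 1 - u1 for
# u1 in [0, 1], and the disagreement point is d = (0.1, 0.2).
d1, d2 = 0.1, 0.2

def neg_nash_product(u1):
    """Negative of the Nash product (u1 - d1)(u2 - d2) along the frontier."""
    u2 = 1.0 - u1
    return -(u1 - d1) * (u2 - d2)

res = minimize_scalar(neg_nash_product, bounds=(d1, 1.0 - d2), method="bounded")
u1_star = res.x
print("Nash solution:", round(u1_star, 3), round(1.0 - u1_star, 3))
# Analytically u1* = (1 + d1 - d2) / 2 = 0.45, so the solution is (0.45, 0.55).
```

The book's question is how axioms should constrain such solutions when agents may enter or leave the bargaining problem.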
Many optimization questions arise in economics and finance; an important example is society's choice of the optimum state of the economy (the social choice problem). Optimization in Economics and Finance extends and improves the usual optimization techniques, in a form that may be adopted for modeling social choice problems. Problems discussed include: when is an optimum reached; when is it unique; relaxation of the conventional convex (or concave) assumptions on an economic model; associated mathematical concepts such as invex and quasimax; multiobjective optimal control models; and related computational methods and programs. These techniques are applied to economic growth models (including small stochastic perturbations), finance and financial investment models (and the interaction between financial and production variables), modeling sustainability over long time horizons, boundary (transversality) conditions, and models with several conflicting objectives. Although the applications are general and illustrative, the models in this book provide examples of possible models for a society's social choice of an allocation that maximizes welfare and utilization of resources. As well as using existing computer programs for the optimization of models, the book presents a new computer program, named SCOM, for computing social choice models by optimal control.
WINNER OF THE 2007 DEGROOT PRIZE. The prominence of finite mixture modelling is greater than ever. Many important statistical topics like clustering data, outlier treatment, or dealing with unobserved heterogeneity involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models. In the more than one hundred years since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools have emerged for dealing with these models which combine a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods. It is the first time that the Bayesian perspective of finite mixture modelling is systematically presented in book form. It is argued that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, the author reviews several frequentist techniques, especially for selecting the number of components of a finite mixture model, and discusses some of their shortcomings compared to the Bayesian approach. The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community. This includes not only statisticians, but also biologists, economists, engineers, financial agents, market researchers, medical researchers and any other frequent users of statistical models. This book should help newcomers to the field to understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they could be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading this book. The presentation is rather informal without abandoning mathematical correctness. Prior familiarity with Bayesian inference and Monte Carlo simulation is useful but not needed.
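As a small taste of what Bayesian estimation of a finite mixture involves (a deliberately minimal sketch with simulated data and the component variances fixed at one, not the book's full MCMC machinery), the Gibbs sampler below alternates between allocating observations to components and drawing the means and the weight from their conjugate full conditionals:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data from a two-component normal mixture (means 0 and 3,
# unit variances, weight 0.4 on the first component).
n = 500
z_true = rng.random(n) < 0.4
y = np.where(z_true, rng.normal(0.0, 1.0, n), rng.normal(3.0, 1.0, n))

# Priors: mu_k ~ N(0, 10^2), weight eta ~ Uniform(0, 1); variances fixed at 1.
mu = np.array([-1.0, 4.0])  # starting values
eta = 0.5
draws = []
for it in range(2000):
    # 1) allocate each observation given the current parameters
    dens0 = eta * np.exp(-0.5 * (y - mu[0]) ** 2)
    dens1 = (1.0 - eta) * np.exp(-0.5 * (y - mu[1]) ** 2)
    z = rng.random(n) < dens1 / (dens0 + dens1)  # True -> second component
    # 2) draw each mean from its conjugate normal full conditional
    for k, members in enumerate([y[~z], y[z]]):
        post_var = 1.0 / (len(members) + 1.0 / 100.0)
        mu[k] = rng.normal(post_var * members.sum(), np.sqrt(post_var))
    # 3) draw the weight of the first component from its Beta full conditional
    eta = rng.beta(1 + (~z).sum(), 1 + z.sum())
    if it >= 500:  # discard burn-in draws
        draws.append((eta, mu[0], mu[1]))

print(np.mean(draws, axis=0))  # posterior means of (eta, mu1, mu2)
```

Issues such as label switching, choosing the number of components and marginal likelihood computation, which this toy sampler sidesteps, are among the topics the book treats in depth.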
"Introductory Econometrics: Intuition, Proof, and Practice"
attempts to distill econometrics into a form that preserves its
essence, but that is acceptable--and even appealing--to the
student's intellectual palate. This book insists on rigor when it
is essential, but it emphasizes intuition and seizes upon
entertainment wherever possible.
This book provides an introduction to the technical background of unit root testing, one of the most heavily researched areas in econometrics over the last twenty years. Starting from an elementary understanding of probability and time series, it develops the key concepts necessary to understand the structure of random walks and Brownian motion, and their role in tests for a unit root. The techniques are illustrated with worked examples, data and programs available on the book's website, which includes more numerical and theoretical examples.
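The kind of test the book builds towards can be run in a few lines; this minimal sketch (simulated series and statsmodels' implementation, not the book's own programs) applies the augmented Dickey-Fuller test, whose null hypothesis is that the series contains a unit root:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)
T = 250

# Two hypothetical series: a pure random walk (which has a unit root) and a
# stationary AR(1) with coefficient 0.5 (which does not).
random_walk = np.cumsum(rng.standard_normal(T))
ar1 = np.zeros(T)
for t in range(1, T):
    ar1[t] = 0.5 * ar1[t - 1] + rng.standard_normal()

for name, series in [("random walk", random_walk), ("stationary AR(1)", ar1)]:
    stat, pvalue, *_ = adfuller(series, autolag="AIC")
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
```

The asymptotic distribution of the test statistic under the null is a functional of Brownian motion, which is why the book spends its early chapters on random walks and Brownian motion.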
The approach to many problems in economic analysis has changed drastically with the development and dissemination of new and more efficient computational techniques. Computational Economic Systems: Models, Methods & Econometrics presents a selection of papers illustrating the use of new computational methods and computing techniques to solve economic problems. Part I of the volume consists of papers which focus on modelling economic systems, presenting computational methods to investigate the evolution of behavior of economic agents, techniques to solve complex inventory models on a parallel computer, and an original approach for the construction and solution of multicriteria models involving logical conditions. Contributions to Part II concern new computational approaches to economic problems. We find an application of wavelets to outlier detection. New estimation algorithms are presented: one concerning seemingly unrelated regression models, a second one on nonlinear rational expectations models and a third one dealing with switching GARCH estimation. Three contributions contain original approaches for the solution of nonlinear rational expectations models.
Survival analysis is a highly active area of research with applications spanning the physical, engineering, biological, and social sciences. In addition to statisticians and biostatisticians, researchers in this area include epidemiologists, reliability engineers, demographers and economists. Economists know survival analysis by the names of duration analysis and the analysis of transition data. We attempted to bring together leading researchers, with a common interest in developing methodology in survival analysis, at the NATO Advanced Research Workshop. The research works collected in this volume are based on the presentations at the Workshop. Analysis of survival experiments is complicated by issues of censoring, where only partial observation of an individual's life length is available, and left truncation, where individuals enter the study group only if their life lengths exceed a given threshold time. Application of the theory of counting processes to survival analysis, as developed by the Scandinavian School, has allowed for substantial advances in the procedures for analyzing such experiments. The increased use of computer-intensive solutions to inference problems in survival analysis, in both the classical and Bayesian settings, is also evident throughout the volume. Several areas of research have received special attention in the volume.
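To illustrate the censoring issue described above, here is a minimal sketch (simulated durations, not data from the Workshop) of the Kaplan-Meier estimator, which updates the survival curve only at observed event times while censored spells simply leave the risk set:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200

# Hypothetical duration data: true event times are exponential, but some spells
# are right-censored, so we only observe min(event, censoring) plus an indicator.
event_time = rng.exponential(scale=10.0, size=n)
censor_time = rng.exponential(scale=15.0, size=n)
time = np.minimum(event_time, censor_time)
observed = event_time <= censor_time

# Kaplan-Meier: at each distinct observed event time t, multiply the running
# survival probability by (1 - deaths at t / number still at risk at t).
order = np.argsort(time)
time, observed = time[order], observed[order]
surv = 1.0
km = []
for t in np.unique(time[observed]):
    at_risk = np.sum(time >= t)
    deaths = np.sum((time == t) & observed)
    surv *= 1.0 - deaths / at_risk
    km.append((t, surv))

for t, s in km[:5]:
    print(f"S({t:.2f}) = {s:.3f}")  # first few points of the survival curve
```

Left truncation, counting-process methods and Bayesian computation, which this sketch ignores, are among the refinements the volume takes up.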