Welcome to Loot.co.za!
This book covers a highly relevant and timely topic that is of wide interest, especially in finance, engineering and computational biology. The introductory material on simulation and stochastic differential equations is very accessible and will prove popular with many readers. While there are several recent texts available that cover stochastic differential equations, the concentration here on inference makes this book stand out. No other direct competitors are known to date. With an emphasis on the practical implementation of the simulation and estimation methods presented, the text will be useful to practitioners and students with minimal mathematical background. What's more, because of the many R programs, the information here is appropriate for many mathematically well-educated practitioners, too.
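The simulation side of such a text typically starts from a discretization scheme. As an illustrative sketch (not taken from the book, whose programs are in R), here is an Euler-Maruyama simulation in Python of geometric Brownian motion, dX = mu*X dt + sigma*X dW, with hypothetical parameter values:

```python
import math
import random

def euler_maruyama(x0, mu, sigma, T, n, seed=0):
    """Simulate dX_t = mu*X_t dt + sigma*X_t dW_t (geometric Brownian
    motion) on [0, T] with n Euler-Maruyama steps."""
    rng = random.Random(seed)
    dt = T / n
    x = x0
    path = [x0]
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
        x = x + mu * x * dt + sigma * x * dw
        path.append(x)
    return path

# One simulated year of daily-ish steps
path = euler_maruyama(x0=1.0, mu=0.05, sigma=0.2, T=1.0, n=250)
```

Halving the step size reduces the discretization bias; inference methods of the kind the book covers would then fit mu and sigma to an observed path.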
International Applications of Productivity and Efficiency Analysis features a complete range of techniques utilized in frontier analysis, including extensions of existing techniques and the development of new techniques. Another feature is that most of the contributions use panel data in a variety of approaches. Finally, the range of empirical applications is at least as great as the range of techniques, and many of the applications are of considerable policy relevance.
Economists and psychologists have, on the whole, exhibited sharply different perspectives on the elicitation of preferences. Economists, who have made preference the central primitive in their thinking about human behavior, have for the most part rejected elicitation and have instead sought to infer preferences from observations of choice behavior. Psychologists, who have tended to think of preference as a context-determined subjective construct, have embraced elicitation as their dominant approach to measurement. This volume, based on a symposium organized by Daniel McFadden at the University of California at Berkeley, provides a provocative and constructive engagement between economists and psychologists on the elicitation of preferences.
This book is the result of my doctoral dissertation research at the Department of Econometrics of the University of Geneva, Switzerland. This research was also partially financed by the Swiss National Science Foundation (grants 12-31072.91 and 12-40300.94). First and foremost, I wish to express my deepest gratitude to Professor Manfred Gilli, my thesis supervisor, for his constant support and help. I would also like to thank the president of my jury, Professor Fabrizio Carlevaro, as well as the other members of the jury, Professor Andrew Hughes Hallett, Professor Jean-Philippe Vial and Professor Gerhard Wanner. I am grateful to my colleagues and friends of the Department of Econometrics, especially David Miceli, who provided constant help and kind understanding during all the stages of my research. I would also like to thank Pascale Mignon for proofreading my text and improving my English. Finally, I am greatly indebted to my parents for their kindness and encouragements, without which I could never have achieved my goals. Giorgio Pauletto, Department of Econometrics, University of Geneva, Geneva, Switzerland. Chapter 1, Introduction: The purpose of this book is to present the available methodologies for the solution of large-scale macroeconometric models. This work reviews classical solution methods and introduces more recent techniques, such as parallel computing and nonstationary iterative algorithms.
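Classical solution methods for large macroeconometric models include stationary iterative schemes such as Gauss-Seidel, in which each equation is solved for its own variable using the most recent values of the others. A minimal Python sketch (the system and its values are hypothetical, chosen only to illustrate the sweep):

```python
def gauss_seidel(A, b, x0, iters=50):
    """Solve A x = b by Gauss-Seidel sweeps: each pass updates x[i]
    from row i using the latest values of the other components."""
    n = len(b)
    x = list(x0)
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

# Diagonally dominant toy system; exact solution is x = (1, 2)
A = [[4.0, 1.0], [2.0, 5.0]]
b = [6.0, 12.0]
x = gauss_seidel(A, b, [0.0, 0.0])
```

Convergence is guaranteed here because the matrix is diagonally dominant; the nonstationary methods the book introduces relax exactly this kind of restriction.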
Covers applications to risky assets traded on the markets for funds, fixed-income products and electricity derivatives.
Given the magnitude of currency speculation and sports gambling, it is surprising that the literature contains mostly negative forecasting results. Majority opinion still holds that short term fluctuations in financial markets follow a random walk. In this non-random walk through financial and sports gambling markets, parallels are drawn between modeling short term currency movements and modeling outcomes of athletic encounters. The forecasting concepts and methodologies are identical; only the variables change names. If, in fact, these markets are driven by mechanisms of non-random walk, there must be some explanation for the negative forecasting results. The Analysis of Sports Forecasting: Modeling Parallels Between Sports Gambling and Financial Markets examines this issue.
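A first empirical check of the random-walk hypothesis is whether increments are serially correlated: random-walk increments should show no lag-1 autocorrelation, while a predictable series does. A minimal Python sketch (illustrative simulated data, not from the book):

```python
import random

def lag1_autocorr(xs):
    """Sample lag-1 autocorrelation of a series."""
    n = len(xs)
    m = sum(xs) / n
    num = sum((xs[t] - m) * (xs[t - 1] - m) for t in range(1, n))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

rng = random.Random(42)
# Random-walk increments: independent draws, autocorrelation near zero
iid = [rng.gauss(0.0, 1.0) for _ in range(2000)]
# Predictable increments: AR(1) with coefficient 0.5, autocorrelation near 0.5
ar = [0.0]
for _ in range(1999):
    ar.append(0.5 * ar[-1] + rng.gauss(0.0, 1.0))
```

On real currency or sports data, the forecasting question is whether the observed increments look more like the first series or the second.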
Scientific visualization may be defined as the transformation of numerical scientific data into informative graphical displays. The text introduces a nonverbal model to subdisciplines that until now have mostly employed mathematical or verbal-conceptual models. The focus is on how scientific visualization can help revolutionize the manner in which the tendencies for (dis)similar numerical values to cluster together in location on a map are explored and analyzed. In doing so, the concept known as spatial autocorrelation - which characterizes these tendencies - is further demystified.
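Spatial autocorrelation is commonly summarized by Moran's I, which compares value similarity between neighbouring map locations with overall variability: positive values indicate clustering of similar values. A minimal Python sketch on a hypothetical four-cell map (illustrative, not the book's data):

```python
def morans_i(values, weights):
    """Moran's I for a list of values and a symmetric binary
    contiguity matrix `weights` (1 = neighbours, 0 = not)."""
    n = len(values)
    m = sum(values) / n
    dev = [v - m for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Four cells along a line, adjacent cells are neighbours;
# high values cluster at one end, low values at the other
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
vals = [10.0, 9.0, 1.0, 2.0]
I = morans_i(vals, W)
```

Here I is positive, reflecting the clustering; shuffling the values toward alternation would drive it negative.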
The generous social welfare system in Europe is one of the most important differences between Europe and the US. Defenders of the European welfare state argue that it improves social cohesion and prevents crime. Others argue that the "invisible hand" in the US economy is equally powerful in reducing unemployment and preventing crime. This book takes this trade-off as a starting point and contributes to a better interdisciplinary understanding of the interactions between crime, economic performance and social exclusion. In doing so, it evaluates the existing economic and criminological research and provides innovative empirical investigations on the basis of international panel data sets from different levels of regional aggregation.
JEAN-FRANÇOIS MERTENS. This book presents a systematic exposition of the use of game theoretic methods in general equilibrium analysis. Clearly the first such use was by Arrow and Debreu, with the "birth" of general equilibrium theory itself, in using Nash's existence theorem (or a generalization) to prove the existence of a competitive equilibrium. But this use appeared possibly to be merely technical, borrowing some tools for proving a theorem. This book stresses the later contributions, where game theoretic concepts were used as such, to explain various aspects of the general equilibrium model. But clearly, each of those later approaches also provides per se a game theoretic proof of the existence of competitive equilibrium. Part A deals with the first such approach: the equality between the set of competitive equilibria of a perfectly competitive (i.e., every trader has negligible market power) economy and the core of the corresponding cooperative game.
This monograph provides a unified and comprehensive treatment of an order-theoretic fixed point theory in partially ordered sets and its various useful interactions with topological structures. The material progresses systematically, presenting the preliminaries before moving to more advanced topics. In the treatment of the applications a wide range of mathematical theories and methods from nonlinear analysis and integration theory are applied, an outline of which is given in an appendix chapter to make the book self-contained. Graduate students and researchers in nonlinear analysis, pure and applied mathematics, game theory and mathematical economics will find this book useful.
Due to the ability to handle specific characteristics of economics and finance forecasting problems, such as non-linear relationships, behavioral changes, or knowledge-based domain segmentation, we have recently witnessed a phenomenal growth of the application of computational intelligence methodologies in this field. In this volume, Chen and Wang collected not just works on traditional computational intelligence approaches like fuzzy logic, neural networks, and genetic algorithms, but also examples of more recent technologies such as rough sets, support vector machines, wavelets, or ant algorithms. After an introductory chapter with a structural description of all the methodologies, the subsequent parts describe novel applications of these to typical economics and finance problems like business forecasting, currency crisis discrimination, foreign exchange markets, or stock markets behavior.
Industrial Price, Quantity, and Productivity Indices: The Micro-Economic Theory and an Application gives a comprehensive account of the micro-economic foundations of industrial price, quantity, and productivity indices. The various results available from the literature have been brought together into a consistent framework, based upon modern duality theory. This integration also made it possible to generalize several of these results. Thus, this book will be an important resource for theoretically as well as empirically-oriented researchers who seek to analyse economic problems with the help of index numbers. Although this book's emphasis is on micro-economic theory, it is also intended as a practical guide. A full chapter is therefore devoted to an empirical application. Three different approaches are pursued: a straightforward empirical approach, a non-parametric estimation approach, and a parametric estimation approach. As well as illustrating some of the more important concepts explored in this book, and showing to what extent different computational approaches lead to different outcomes for the same measures, this chapter also makes a powerful case for the use of enterprise micro-data in economic research.
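The classical bilateral price indices that such a micro-economic framework generalizes can be computed directly from price and quantity data. A minimal Python sketch of the Laspeyres, Paasche and Fisher price indices on hypothetical two-good data (illustrative, not the book's application):

```python
import math

def laspeyres(p0, p1, q0):
    """Price index using base-period quantities as weights."""
    return sum(p * q for p, q in zip(p1, q0)) / sum(p * q for p, q in zip(p0, q0))

def paasche(p0, p1, q1):
    """Price index using current-period quantities as weights."""
    return sum(p * q for p, q in zip(p1, q1)) / sum(p * q for p, q in zip(p0, q1))

def fisher(p0, p1, q0, q1):
    """Geometric mean of Laspeyres and Paasche (the 'ideal' index)."""
    return math.sqrt(laspeyres(p0, p1, q0) * paasche(p0, p1, q1))

# Hypothetical two goods, base and current periods
p0, p1 = [1.0, 2.0], [1.1, 2.4]
q0, q1 = [10.0, 5.0], [12.0, 4.0]
L, P, F = laspeyres(p0, p1, q0), paasche(p0, p1, q1), fisher(p0, p1, q0, q1)
```

The Fisher index always lies between the other two, which is one reason duality-based treatments single it out.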
Data envelopment analysis develops a set of nonparametric and semiparametric techniques for measuring economic efficiency among firms and nonprofit organizations. Over the past decade this technique has found most widespread applications in public sector organizations. However these applications have been mostly static. This monograph extends this static framework of efficiency analysis in several new directions. These include but are not limited to the following: (1) a dynamic view of the production and cost frontier, where capital inputs are treated differently from the current inputs, (2) a direct role of the technological progress and regress, which is so often stressed in total factor productivity discussion in modern growth theory in economics, (3) stochastic efficiency in a dynamic setting, where reliability improvement competes with technical efficiency, (4) flexible manufacturing systems, where flexibility of the production process and the economies of scope play an important role in efficiency analysis and (5) the role of economic factors such as externalities and input interdependences. Efficiency is viewed here in the framework of a general systems theory model. Such a view is intended to broaden the scope of applications of this promising new technique of data envelopment analysis. The monograph stresses the various applied aspects of the dynamic theory, so that it can be empirically implemented in different situations. As far as possible, abstract mathematical treatments are avoided and emphasis is placed on the statistical examples and empirical illustrations.
Coalition Formation and Social Choice provides a unified and comprehensive study of coalition formation and collective decision-making in committees. It discusses the main existing theories including the size principle, conflict of interest theory, dominant player theory, policy distance theory and power excess theory. In addition, the book offers new theories of coalition formation in which the endogenous formation of preferences for coalitions is basic. Both simple game theory and social choice theory are extensively applied in the treatment of the theories. This combined application not only leads to new theories but also offers a new and fresh perspective on coalition formation and collective decision-making in committees. The book covers the fundamental concepts and results of social choice theory including Arrow's Impossibility Theorem. Furthermore, it gives a coherent treatment of the theory of simple games. Besides more traditional topics in simple game theory like power indices, it also introduces new aspects of simple games such as the Chow parameter, the Chow vector and the notion of similar games.
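Among the power indices treated in simple game theory, the normalized Banzhaf index counts how often each voter is critical to a winning coalition (the coalition wins with the voter and loses without). A minimal Python sketch for a hypothetical three-player weighted voting game (illustrative, not from the book):

```python
from itertools import combinations

def banzhaf(weights, quota):
    """Normalized Banzhaf power index for a weighted voting game:
    for each player, count the winning coalitions in which that
    player is critical, then normalize the counts to sum to one."""
    n = len(weights)
    swings = [0] * n
    for r in range(n + 1):
        for coal in combinations(range(n), r):
            total = sum(weights[i] for i in coal)
            if total >= quota:                      # coalition wins
                for i in coal:
                    if total - weights[i] < quota:  # i is critical
                        swings[i] += 1
    s = sum(swings)
    return [c / s for c in swings]

# Weights (4, 2, 1) with quota 5: player 1 is critical far more often
idx = banzhaf([4, 2, 1], 5)
```

Note how power is not proportional to weight: the two smaller players end up with equal indices despite unequal weights.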
This book provides practical, research-based advice on how to conduct high-quality stated choice studies. It covers every aspect of the topic, from planning and writing the survey, to analyzing results, to evaluating quality. There is no other book on the market today that so thoroughly addresses the methodology of stated choice. Chapters are written by top-notch academics and practitioners in an accessible style, offering practical, tough advice.
Gerald P. Dwyer, Jr. and R. W. Hafer. The articles and commentaries included in this volume were presented at the Federal Reserve Bank of St. Louis' thirteenth annual economic policy conference, held on October 21-22, 1988. The conference focused on the behavior of asset market prices, a topic of increasing interest to both the popular press and to academic journals as the bull market of the 1980s continued. The events that transpired during October 1987, both in the United States and abroad, provide an informative setting to test alternative theories. In assembling the papers presented during this conference, we asked the authors to explore the issue of asset pricing and financial market behavior from several vantages. Was the crash evidence of the bursting of a speculative bubble? Do we know enough about the workings of asset markets to hazard an intelligent guess why they dropped so dramatically in such a brief time? Do we know enough to propose regulatory changes that will prevent any such occurrence in the future, or do we want to even if we can? We think that the articles and commentaries contained in this volume provide significant insight to inform and to answer such questions. The article by Behzad Diba surveys existing theoretical and empirical research on rational bubbles in asset prices.
Back in the good old days on the fourth floor of the Altbau of Bonn's Juridicum, Werner Hildenbrand put an end to a debate about a festschrift in honor of an economist on the occasion of his turning 60 with a laconic: "Much too early." Remembering his position five years ago, we did not dare to think about one for him. But now he has turned 65. If consulted, he would most likely still answer: "Much too early." However, he has to take his official retirement, and we believe that this is the right moment for such an endeavor. No doubt Werner Hildenbrand will not really retire. As professor emeritus, free from the constraints of a rigid teaching schedule and the burden of committee meetings, he will be able to indulge his passions. We expect him to pursue, with undiminished enthusiasm, his research, travel, golfing, the arts, and culinary pleasures, escaping real retirement.
A non-technical introduction to the question of modeling with time-varying parameters, using the beta coefficient from Financial Economics as the main example. After a brief introduction to this coefficient for those not versed in finance, the book presents a number of rather well known tests for constant coefficients and then performs these tests on data from the Stockholm Exchange. The Kalman filter is then introduced and a simple example is used to demonstrate the power of the filter. The filter is then used to estimate the market model with time-varying betas. The book concludes with further examples of how the Kalman filter may be used in estimation models used in analyzing other aspects of finance. Since both the programs and the data used in the book are available for downloading, the book is especially valuable for students and other researchers interested in learning the art of modeling with time-varying coefficients.
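The filter's recursion for a random-walk beta in the market model y_t = beta_t * x_t + e_t fits in a few lines. This Python version is an illustrative sketch (the book's own downloadable programs are separate), and it assumes the state and observation noise variances q and r are known:

```python
import random

def kalman_beta(xs, ys, q=1e-4, r=0.01, beta0=0.0, p0=1.0):
    """Scalar Kalman filter for y_t = beta_t*x_t + e_t with a
    random-walk state beta_t = beta_{t-1} + w_t; returns the
    filtered beta estimates."""
    beta, p = beta0, p0
    out = []
    for x, y in zip(xs, ys):
        p = p + q                        # predict: state variance grows
        k = p * x / (x * x * p + r)      # Kalman gain
        beta = beta + k * (y - beta * x) # update with prediction error
        p = (1.0 - k * x) * p
        out.append(beta)
    return out

# Demo on simulated data with a constant true beta of 1.5
rng = random.Random(7)
xs = [rng.gauss(0.0, 1.0) for _ in range(200)]
ys = [1.5 * x + rng.gauss(0.0, 0.1) for x in xs]
betas = kalman_beta(xs, ys)
```

With a constant true beta the filtered path settles near 1.5; a genuinely time-varying beta would instead be tracked with a lag governed by the ratio q/r.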
After Karl Jöreskog's first presentation in 1970, Structural Equation Modelling or SEM has become a main statistical tool in many fields of science. It is the standard approach of factor analytic and causal modelling in such diverse fields as sociology, education, psychology, economics, management and medical sciences. In addition to an extension of its application area, Structural Equation Modelling also features a continual renewal and extension of its theoretical background. The sixteen contributions to this book, written by experts from many countries, present important new developments and interesting applications in Structural Equation Modelling. The book addresses methodologists and statisticians professionally dealing with Structural Equation Modelling to enhance their knowledge of the type of models covered and the technical problems involved in their formulation. In addition, the book offers applied researchers new ideas about the use of Structural Equation Modelling in solving their problems. Finally, methodologists, mathematicians and applied researchers alike are addressed, who simply want to update their knowledge of recent approaches in data analysis and mathematical modelling.
This book covers recent advances in efficiency evaluations, most notably Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) methods. It introduces the underlying theories, shows how to make the relevant calculations and discusses applications. The aim is to make the reader aware of the pros and cons of the different methods and to show how to use these methods in both standard and non-standard cases. Several software packages have been developed to solve some of the most common DEA and SFA models. This book relies on R, a free, open source software environment for statistical computing and graphics. This enables the reader to solve not only standard problems, but also many other problem variants. Using R, one can focus on understanding the context and developing a good model. One is not restricted to predefined model variants and to a one-size-fits-all approach. To facilitate the use of R, the authors have developed an R package called Benchmarking, which implements the main methods within both DEA and SFA. The book uses mathematical formulations of models and assumptions, but it de-emphasizes the formal proofs, in part by placing them in appendices or by referring to the original sources. Moreover, the book emphasizes the usage of the theories and the interpretations of the mathematical formulations. It includes a series of small examples, graphical illustrations, simple extensions and questions to think about. Also, it combines the formal models with less formal economic and organizational thinking. Last but not least, it discusses some larger applications with significant practical impacts, including the design of benchmarking-based regulations of energy companies in different European countries, and the development of merger control programs for competition authorities.
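The book's own calculations use R and the authors' Benchmarking package; as a language-neutral illustration of the simplest possible case, this Python sketch scores constant-returns-to-scale efficiency with one input and one output, where each unit is compared against the best observed output/input ratio (a hypothetical example, not the book's code; general DEA models require linear programming):

```python
def crs_efficiency(inputs, outputs):
    """Efficiency scores under constant returns to scale for the
    one-input, one-output case: each unit's output/input ratio
    divided by the best ratio, so frontier units score 1.0."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical units: e.g. staff hours in, cases processed out
x = [2.0, 4.0, 5.0]
y = [4.0, 6.0, 10.0]
eff = crs_efficiency(x, y)
```

Units 1 and 3 define the frontier here; unit 2 would need to raise output by a third (or cut input proportionally) to join them.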
This book focuses on tools and techniques for building regression models using real-world data and assessing their validity. A key theme throughout the book is that it makes sense to base inferences or conclusions only on valid models. Plots are shown to be an important tool for both building regression models and assessing their validity. We shall see that deciding what to plot and how each plot should be interpreted will be a major challenge. In order to overcome this challenge we shall need to understand the mathematical properties of the fitted regression models and associated diagnostic procedures. As such this will be an area of focus throughout the book. In particular, we shall carefully study the properties of residuals in order to understand when patterns in residual plots provide direct information about model misspecification and when they do not. The regression output and plots that appear throughout the book have been generated using R. The output from R that appears in this book has been edited in minor ways. On the book web site you will find the R code used in each example in the text.
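Residual analysis starts from a fitted model. A minimal Python sketch (the book itself works in R) of simple linear regression and its residuals; with an intercept in the model, the residuals sum to zero by construction, so it is their pattern against the fitted values, not their level, that signals misspecification:

```python
def ols_fit(xs, ys):
    """Fit y = a + b*x by least squares; return intercept, slope
    and the residuals y_i - (a + b*x_i)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    return a, b, resid

# Hypothetical data, roughly linear
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
a, b, resid = ols_fit(xs, ys)
```

A residual plot would graph `resid` against `xs` or the fitted values; curvature or fanning in that plot is the kind of diagnostic pattern the book teaches the reader to interpret.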
Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation-based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process in itself to the modelling of the volatility of the process. In less than 15 years, it has become one of the most successful fields of applied econometric research with hundreds of published papers. As an alternative to the ARCH modelling of the volatility, Taylor (1986) introduced the stochastic volatility model, whose features are quite similar to the ARCH specification but which involves an unobserved or latent component for the volatility. While being more difficult to estimate than usual GARCH models, stochastic volatility models have found numerous applications in the modelling of volatility and more particularly in the econometric part of option pricing formulas. Although modelling volatility is one of the best known examples of applied financial econometrics, other topics (factor models, present value relationships, term structure models) were also successfully tackled.
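Engle's ARCH specification makes today's conditional variance depend on yesterday's squared return, which produces the volatility clustering seen in financial data. A minimal Python simulation of an ARCH(1) process with illustrative parameter values:

```python
import math
import random

def simulate_arch1(n, omega=0.2, alpha=0.5, seed=1):
    """Simulate r_t = sigma_t * z_t with conditional variance
    sigma_t^2 = omega + alpha * r_{t-1}^2 (ARCH(1), Engle 1982).
    The unconditional variance is omega / (1 - alpha)."""
    rng = random.Random(seed)
    r, out = 0.0, []
    for _ in range(n):
        sigma2 = omega + alpha * r * r      # conditional variance
        r = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)
        out.append(r)
    return out

rets = simulate_arch1(5000)
```

With omega = 0.2 and alpha = 0.5 the long-run variance is 0.4; the stochastic volatility alternative mentioned above would instead drive sigma_t by its own latent shock, which is what makes it harder to estimate.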
In this book, different quantitative approaches to the study of electoral systems have been developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization ones. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool to detect inconsistencies or poor performance in actual systems. Applications to concrete settings such as EU, American Congress, regional, and committee voting are discussed.
The purpose of models is not to fit the data but to sharpen the questions. S. Karlin, 11th R. A. Fisher Memorial Lecture, Royal Society, 20 April 1983. We are proud to offer this volume in honour of the remarkable career of the Father of Spatial Econometrics, Professor Jean Paelinck, presently of the Tinbergen Institute, Rotterdam. Not one to model solely for the sake of modelling, the above quotation nicely captures Professor Paelinck's unceasing quest for the best question for which an answer is needed. His FLEUR model has sharpened many spatial economics and spatial econometrics questions! Jean Paelinck, arguably, is the founder of modern spatial econometrics, penning the seminal introductory monograph on this topic, Spatial Econometrics, with Klaassen in 1979. In the General Address to the Dutch Statistical Association, on May 2, 1974, in Tilburg, "he coined the term [spatial econometrics] to designate a growing body of the regional science literature that dealt primarily with estimation and testing problems encountered in the implementation of multiregional econometric models" (Anselin, 1988, p. 7); he already had introduced this idea in his introductory report to the 1966 Annual Meeting of the Association de Science Regionale de Langue Française.
This book presents models and statistical methods for the analysis of recurrent event data. The authors provide broad, detailed coverage of the major approaches to analysis, while emphasizing the modeling assumptions that they are based on. More general intensity-based models are also considered, as well as simpler models that focus on rate or mean functions. Parametric, nonparametric and semiparametric methodologies are all covered, with procedures for estimation, testing and model checking.