Income Elasticity and Economic Development: Methods and Applications is mainly concerned with methods of estimating income elasticity, a field connected with economic development through the reduction of income inequality. This is highly relevant in today's world, where the gap between rich and poor widens even as economies grow. The book shows how to calculate income elasticity using a number of methods applied to widely available grouped data. Some of the techniques presented here can be used in a wide range of policy areas in developed, developing and under-developed countries alike. Policy analysts, economists, business analysts and market researchers will find this book very useful.
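As an illustration of the kind of calculation the blurb describes, an income elasticity can be read off as the slope of a double-log Engel curve fitted to grouped data. The figures below are hypothetical, not taken from the book:

```python
import numpy as np

# Hypothetical grouped data: mean income and mean food expenditure per group
income = np.array([500., 1000., 2000., 4000., 8000.])
spend = np.array([220., 380., 640., 1050., 1700.])

# Income elasticity from a double-log Engel curve: ln(spend) = a + e * ln(income)
e, a = np.polyfit(np.log(income), np.log(spend), 1)
print(round(e, 3))  # elasticity below 1: food behaves as a necessity here
```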
Charles de Gaulle begins his Mémoires d'Espoir thus: 'France comes from the depths of the ages. She lives. The centuries call to her. But she remains herself through time. Her borders may shift without changing the relief, the climate, the rivers and the seas that mark her indefinitely. There dwell peoples who, over the course of history, have been gripped by the most diverse trials, but whom the nature of things, worked upon by politics, ceaselessly moulds into a single nation. This nation has embraced many generations. It currently comprises several. It will give birth to many more. But by the geography of the country that is hers, by the genius of the peoples who compose her, by the neighbours who surround her, she takes on a constant character that makes the French of each era dependent on their fathers and binds them to their descendants. Unless it breaks apart, this human whole, on this territory, within this universe, thus carries a past, a present and a future that are indissoluble. And so the State, which answers for France, is charged at once with her heritage of yesterday, her interests of today and her hopes of tomorrow.' In the light of this idea of the nation, it is clear that dialogue between nations is eminently important, and that the Semaine Universitaire Franco-Néerlandaise is an institution meant to stimulate that dialogue.
The econometric consequences of nonstationary data have wide-ranging implications for empirical research in economics. Specifically, these issues have implications for the study of empirical relations such as a money demand function that links macroeconomic aggregates: real money balances, real income and a nominal interest rate. Traditional monetary theory predicts that these nonstationary series form a cointegrating relation and, accordingly, that the dynamics of a vector process comprised of these variables generate distinct patterns. Recent econometric developments designed to cope with nonstationarities have changed the course of empirical research in the area, but many fundamental challenges, for example the issue of identification, remain. This book represents the work undertaken by the authors in recent years to determine the consequences that nonstationarity has for the study of aggregate money demand relations. We have brought together an empirical methodology that we find useful in conducting empirical research. Some of the work was undertaken during the authors' sabbatical periods and we wish to acknowledge the generous support of Arizona State University and Michigan State University respectively. Professor Hoffman wishes to acknowledge the support of the Fulbright-Hays Foundation, which supported sabbatical research in Europe, and separate support from the Council of 100 Summer Research Program at Arizona State University.
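The cointegration idea behind such a money demand relation can be sketched on simulated data: two individually nonstationary series share a common stochastic trend, so a suitable linear combination of them is stationary. The series and parameters below are illustrative, not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two I(1) series sharing a common stochastic trend (a cointegrated pair):
# x_t is a random walk; y_t = 2 * x_t + stationary noise
T = 2000
x = np.cumsum(rng.normal(0, 1, T))   # random walk (nonstationary)
y = 2.0 * x + rng.normal(0, 1, T)    # cointegrated with x

# Engle-Granger first step: OLS of y on x estimates the cointegrating vector
b = np.polyfit(x, y, 1)[0]
resid = y - b * x                    # the equilibrium error, which stays bounded

print(b)  # slope near the true value 2
```

The residual's standard deviation stays of the order of the noise, while each raw series wanders without bound; a residual-based unit-root test would be the second Engle-Granger step.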
Lawrence Klein, University of Pennsylvania; Jaime Marquez, Federal Reserve Board. An examination of the economics literature over the last twenty years reveals a marked tendency towards polarisation. On the one hand, there has been a propensity to develop theoretical models which have little connection with either empirical verification or problems requiring immediate attention. On the other hand, empirical analyses are generally typified by testing for its own sake, with limited examination of the implications of the results. As a result, the number of papers confronting theory with facts towards the solution of economic problems has been on the decline for years. To fill this growing gap in the literature, we have invited a number of authors to write papers using both theoretical and empirical techniques to address current issues of interest to the profession at large: the US trade deficit and the global implications of policies that attempt to reduce it, the international ramifications of the debt crisis, the international oil market and its implications for the US oil industry, and the development of new econometric techniques. In addressing these issues, each author has approached the subject matter from an eclectic standpoint, that is, avoiding strict adherence to a given doctrine.
This book presents the state of the art in multilevel analysis, with an emphasis on more advanced topics. These topics are discussed conceptually, analyzed mathematically, and illustrated by empirical examples. Multilevel analysis is the statistical analysis of hierarchically and non-hierarchically nested data. The simplest example is clustered data, such as a sample of students clustered within schools. Multilevel data are especially prevalent in the social and behavioral sciences and in the biomedical sciences. The chapter authors are all leading experts in the field. Given the omnipresence of multilevel data in the social, behavioral, and biomedical sciences, this book is essential for empirical researchers in these fields.
In a relatively short period of time, data envelopment analysis (DEA) has grown into a powerful analytical tool for measuring and evaluating performance. DEA is computational at its core, and this book is one of several that Springer aims to publish on the subject. This work deals with the micro aspects of handling and modeling data issues in DEA problems. It is a handbook treatment dealing with specific data problems, including imprecise data and undesirable outputs.
This volume contains a selection of papers presented at the first conference of the Society for Computational Economics held at the IC² Institute, Austin, Texas, May 21-24, 1995. Twenty-two papers are included in this volume, devoted to applications of computational methods for the empirical analysis of economic and financial systems; the development of computing methodology, including software, related to economics and finance; and the overall impact of developments in computing. The various contributions represented in the volume indicate the growing interest in the topic due to the increased availability of computational concepts and tools and the necessity of analyzing complex decision problems. The papers in this volume are divided into four sections: Computational methods in econometrics, Computational methods in finance, Computational methods for a social environment and New computational methods.
This book proposes a uniform logic and probabilistic (LP) approach to risk estimation and analysis in engineering and economics. It covers the methodological and theoretical basis of risk management at the design, test, and operation stages of economic, banking, and engineering systems with groups of incompatible events (GIE). This edition includes new chapters providing a detailed treatment of scenario logic and probabilistic models for revealing bribes. It also contains clear definitions and notations, revised sections and chapters, an extended list of references, and a new subject index, as well as more than a hundred illustrations and tables which motivate the presentation.
It is unlikely that any frontier of economics/econometrics is being pushed faster, or further, than that of computational techniques. The computer has become a tool for performing, as well as an environment in which to perform, economics and econometrics, taking over where theory bogs down and allowing at least approximate answers to questions that defy closed mathematical or analytical solutions. Tasks may now be attempted that were hitherto beyond human potential, and all the forces available can now be marshalled efficiently, leading to the achievement of desired goals. Computational Techniques for Econometrics and Economic Analysis is a collection of recent studies which exemplify all these elements, demonstrating the power that the computer brings to the economic analyst. The book is divided into four parts: 1 -- the computer and econometric methods; 2 -- the computer and economic analysis; 3 -- computational techniques for econometrics; and 4 -- the computer and econometric studies.
Scientific visualization may be defined as the transformation of numerical scientific data into informative graphical displays. The text introduces a nonverbal model to subdisciplines that until now have mostly employed mathematical or verbal-conceptual models. The focus is on how scientific visualization can help revolutionize the manner in which the tendencies for (dis)similar numerical values to cluster together in location on a map are explored and analyzed. In doing so, the concept known as spatial autocorrelation - which characterizes these tendencies - is further demystified.
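The clustering tendency the blurb refers to is commonly quantified with Moran's I, the standard spatial autocorrelation statistic (positive when similar values are neighbours, negative when dissimilar values alternate). A minimal numerical sketch, with a toy contiguity matrix and hypothetical data not drawn from the text:

```python
import numpy as np

def morans_i(values, W):
    """Moran's I: spatial autocorrelation of `values` under weight matrix W."""
    z = values - values.mean()
    n = len(values)
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

# Toy example: 4 locations on a line, rook-contiguity weights (hypothetical)
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

clustered = np.array([1.0, 1.2, 5.0, 5.3])  # similar values sit next to each other
print(morans_i(clustered, W))               # positive: spatial clustering
```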
Owing to their ability to handle specific characteristics of economics and finance forecasting problems, such as non-linear relationships, behavioral changes, or knowledge-based domain segmentation, computational intelligence methodologies have recently seen phenomenal growth of application in this field. In this volume, Chen and Wang collect not just works on traditional computational intelligence approaches such as fuzzy logic, neural networks, and genetic algorithms, but also examples of more recent technologies such as rough sets, support vector machines, wavelets, and ant algorithms. After an introductory chapter with a structural description of all the methodologies, the subsequent parts describe novel applications of these to typical economics and finance problems such as business forecasting, currency crisis discrimination, foreign exchange markets, and stock market behavior.
Simulation methods are revolutionizing the practice of applied economic analysis. This volume collects eighteen chapters written by leading researchers from prestigious research institutions the world over. The common denominator of the papers is their relevance for applied research in environmental and resource economics. The topics range from discrete choice modeling with heterogeneity of preferences, to Bayesian estimation, to Monte Carlo experiments, to structural estimation of Kuhn-Tucker demand systems, to evaluation of simulation noise in maximum simulated likelihood estimates, to dynamic natural resource modeling. Empirical cases are used to show the practical use and the results brought forth by the different methods.
This book offers a non-technical introduction to modeling with time-varying parameters, using the beta coefficient from financial economics as the main example. After a brief introduction to this coefficient for those not versed in finance, the book presents a number of well-known tests for constant coefficients and then performs these tests on data from the Stockholm Exchange. The Kalman filter is then introduced, and a simple example demonstrates the power of the filter, which is then used to estimate the market model with time-varying betas. The book concludes with further examples of how the Kalman filter may be used to estimate models arising in other areas of finance. Since both the programs and the data used in the book are available for downloading, the book is especially valuable for students and other researchers interested in learning the art of modeling with time-varying coefficients.
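The filtering step described above can be sketched in a few lines: model the beta as a random walk and run a scalar Kalman filter over the market-model observation equation. This is a minimal illustration on simulated data, with the noise variances assumed known, not the book's own program:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a market model r_t = beta_t * m_t + e_t with a drifting beta
T = 500
m = rng.normal(0, 1, T)                              # market returns
beta_true = 1.0 + np.cumsum(rng.normal(0, 0.02, T))  # random-walk beta
r = beta_true * m + rng.normal(0, 0.5, T)            # asset returns

# Kalman filter: state beta_t = beta_{t-1} + w_t, obs r_t = m_t * beta_t + e_t
q, h = 0.02**2, 0.5**2   # state and observation noise variances (assumed known)
beta, P = 0.0, 1.0       # initial state mean and variance
est = np.empty(T)
for t in range(T):
    P = P + q                                # predict: variance grows by q
    K = P * m[t] / (m[t] ** 2 * P + h)       # Kalman gain
    beta = beta + K * (r[t] - m[t] * beta)   # update with the forecast error
    P = (1 - K * m[t]) * P
    est[t] = beta
```

The filtered path `est` tracks the drifting true beta once the filter has burned in, which is exactly what a constant-coefficient regression cannot do.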
Spatial Econometrics is a rapidly evolving field born from the joint efforts of economists, statisticians, econometricians and regional scientists. The book provides the reader with a broad view of the topic by including both methodological and application papers. Indeed the application papers relate to a number of diverse scientific fields ranging from hedonic models of house pricing to demography, from health care to regional economics, from the analysis of R&D spillovers to the study of retail market spatial characteristics. Particular emphasis is given to regional economic applications of spatial econometrics methods with a number of contributions specifically focused on the spatial concentration of economic activities and agglomeration, regional paths of economic growth, regional convergence of income and productivity and the evolution of regional employment. Most of the papers appearing in this book were solicited from the International Workshop on Spatial Econometrics and Statistics held in Rome (Italy) in 2006.
E. Dijkgraaf and R. H. J. M. Gradus 1.1 Introduction In 2004 Elbert Dijkgraaf finished a PhD thesis, 'Regulating the Dutch waste market', at the Erasmus University Rotterdam. It was striking that little had been published about the waste market, although it is a very important sector from an economic and environmental viewpoint. In 2006 we participated in a very interesting conference on Local Government Reform: privatization and public-private collaboration in Barcelona, organized by Germà Bel. It was interesting to notice that researchers from Spain, the Scandinavian countries, the UK and the USA were studying this issue as well. From this came the idea to publish a book about the waste market. Because of its legal framework we focus on Europe. In this chapter we give an introduction to this book. In the next section we present a short overview of the waste collection market. Since 1960 the importance of the waste sector has increased substantially, both in the waste streams and in the costs of waste collection and treatment. Furthermore, we discuss policy measures to deal with these increases and give an overview of the different measures across countries. In the last section we present the different chapters of our book. 1.2 Empirical Update of the Waste Collection Market The Dutch case provides a nice example of why studying the waste market is interesting from an economic point of view.
This second edition sees the light three years after the first one: too short a time to feel seriously concerned to redesign the entire book, but sufficient to be challenged by the prospect of sharpening our investigation on the working of econometric dynamic models and to be inclined to change the title of the new edition by dropping the "Topics in" of the former edition. After considerable soul searching we agreed to include several results related to topics already covered, as well as additional sections devoted to new and sophisticated techniques, which hinge mostly on the latest research work on linear matrix polynomials by the second author. This explains the growth of chapter one and the deeper insight into representation theorems in the last chapter of the book. The role of the second chapter is that of providing a bridge between the mathematical techniques in the backstage and the econometric profiles in the forefront of dynamic modelling. For this purpose, we decided to add a new section where the reader can find the stochastic rationale of vector autoregressive specifications in econometrics. The third (and last) chapter improves on that of the first edition by reaping the fruits of the thorough analytic equipment previously drawn up.
This book examines non-Gaussian distributions. It addresses the causes and consequences of non-normality and time dependency in both asset returns and option prices. The book is written for non-mathematicians who want to model financial market prices so the emphasis throughout is on practice. There are abundant empirical illustrations of the models and techniques described, many of which could be equally applied to other financial time series.
Productivity growth is a keyword for sustainable economic growth in a knowledge-based society. There has been significant methodological development in the literature on productivity and efficiency analysis, e.g. SFA (Stochastic Frontier Analysis) and DEA (Data Envelopment Analysis). All these methodological developments should be matched with applications in order to provide practical implications for private and public decision-makers. This volume provides a collection of up-to-date and new applications of productivity and efficiency analysis. In particular, the case studies cover various economic issues in the Asia-Pacific region. The authors analyze the performance of manufacturing firms, banks, venture capital, broadcasting firms, as well as the issues of efficiency in the education sector, regional development, and defense industry. These case studies will shed light on the potential contribution of productivity and efficiency analysis to the enhancement of economic performance.
In this book leading German econometricians in different fields present survey articles of the most important new methods in econometrics. The book gives an overview of the field and it shows progress made in recent years and remaining problems.
This text provides a new approach to the subject, including a comprehensive survey of novel theoretical approaches, methods, and models used in macroeconomics and macroeconometrics. The book gives extensive insight into economic policy, incorporates a strong international perspective, and offers a broad historical perspective.
Jean-Jacques Rousseau wrote in the Preface to his famous Discourse on Inequality that "I consider the subject of the following discourse as one of the most interesting questions philosophy can propose, and unhappily for us, one of the most thorny that philosophers can have to solve. For how shall we know the source of inequality between men, if we do not begin by knowing mankind?" (Rousseau, 1754). This citation of Rousseau appears in an article in Spanish where Dagum (2001), in the memory of whom this book is published, also cites Socrates who said that the only useful knowledge is that which makes us better and Seneca who wrote that knowing what a straight line is, is not important if we do not know what rectitude is. These references are indeed a good illustration of Dagum's vast knowledge, which was clearly not limited to the field of Economics. For Camilo the first part of Rousseau's citation certainly justified his interest in the field of inequality which was at the centre of his scientific preoccupations. It should however be stressed that for Camilo the second part of the citation represented a "solid argument in favor of giving macroeconomic foundations to microeconomic behavior" (Dagum, 2001). More precisely, "individualism and methodological holism complete each other in contributing to the explanation of individual and social behavior" (Dagum, 2001).
Are foreign exchange markets efficient? Are fundamentals important for predicting exchange rate movements? What is the signal-to-noise ratio of high frequency exchange rate changes? Is it possible to define a measure of the equilibrium exchange rate that is useful from an assessment perspective? The book is a selective survey of current thinking on key topics in exchange rate economics, supplemented throughout by new empirical evidence. The focus is on the use of advanced econometric tools to find answers to these and other questions which are important to practitioners, policy-makers and academic economists. In addition, the book addresses more technical econometric considerations such as the importance of the choice between single-equation and system-wide approaches to modelling the exchange rate, and the reduced form versus structural equation problems. Readers will gain both a comprehensive overview of the way macroeconomists approach exchange rate modelling, and an understanding of how advanced techniques can help them explain and predict the behavior of this crucial economic variable.
Modern apparatuses allow us to collect samples of functional data, mainly curves but also images. On the other hand, nonparametric statistics produces useful tools for standard data exploration. This book links these two fields of modern statistics by explaining how functional data can be studied through parameter-free statistical ideas, and offers an original presentation of new nonparametric statistical methods for functional data analysis.
This book explains in simple settings the fundamental ideas of financial market modelling and derivative pricing, using the no-arbitrage principle. Relatively elementary mathematics leads to powerful notions and techniques - such as viability, completeness, self-financing and replicating strategies, arbitrage and equivalent martingale measures - which are directly applicable in practice. The general methods are applied in detail to pricing and hedging European and American options within the Cox-Ross-Rubinstein (CRR) binomial tree model. A simple approach to discrete interest rate models is included, which, though elementary, has some novel features. All proofs are written in a user-friendly manner, with each step carefully explained and following a natural flow of thought. In this way the student learns how to tackle new problems.
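In the CRR model named above, a European call price is the discounted risk-neutral expectation of the terminal payoff over the binomial tree. A minimal sketch, with hypothetical parameters rather than any example from the book:

```python
import math

def crr_european_call(S0, K, r, sigma, T, n):
    """Price a European call with the Cox-Ross-Rubinstein binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1 / u                             # down factor (CRR convention)
    q = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    # Discounted sum of terminal payoffs weighted by binomial probabilities
    payoff = sum(
        math.comb(n, j) * q**j * (1 - q)**(n - j)
        * max(S0 * u**j * d**(n - j) - K, 0.0)
        for j in range(n + 1)
    )
    return math.exp(-r * T) * payoff

# Hypothetical parameters: spot 100, strike 100, 5% rate, 20% vol, 1 year
print(round(crr_european_call(100, 100, 0.05, 0.2, 1.0, 200), 2))
```

As the number of steps n grows, the CRR price converges to the Black-Scholes value for the same parameters; American options are handled on the same tree by backward induction with an early-exercise check at each node.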
"Introductory Econometrics: Intuition, Proof, and Practice" attempts to distill econometrics into a form that preserves its essence, but that is acceptable--and even appealing--to the student's intellectual palate. This book insists on rigor when it is essential, but it emphasizes intuition and seizes upon entertainment wherever possible.