Gini's mean difference (GMD) was first introduced by Corrado Gini in 1912 as an alternative measure of variability. GMD and the parameters derived from it (such as the Gini coefficient or the concentration ratio) have been in use in the area of income distribution for almost a century. In practice, the use of GMD as a measure of variability is justified whenever the investigator is not ready to impose, without questioning, the convenient world of normality. This makes the GMD of critical importance in the complex research of statisticians, economists, econometricians, and policy makers. This book focuses on imitating analyses that are based on variance by replacing variance with the GMD and its variants. In this way, the text showcases how almost everything that can be done with the variance as a measure of variability can be replicated by using Gini. Beyond this, there are marked benefits to utilizing Gini as opposed to other methods. One of the advantages of using Gini methodology is that it provides a unified system that enables the user to learn about various aspects of the underlying distribution. It also provides a systematic method and a unified terminology. Using Gini methodology can reduce the risk of imposing assumptions on the model that are not supported by the data. With these benefits in mind, the text uses the covariance-based approach, though applications to other approaches are mentioned as well.
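To make the statistic concrete, here is a minimal sketch (not taken from the book) of one common convention for the GMD, the mean absolute difference over all distinct pairs of observations, and the Gini coefficient obtained by scaling it by twice the mean. Function names and the sample data are illustrative only; other normalizations (e.g. dividing by n squared) also appear in the literature.

```python
import numpy as np

def gini_mean_difference(x):
    """GMD as the mean of |x_i - x_j| over all distinct ordered pairs."""
    x = np.asarray(x, dtype=float)
    n = x.size
    diffs = np.abs(x[:, None] - x[None, :])  # n x n matrix of pairwise gaps
    return diffs.sum() / (n * (n - 1))       # diagonal terms are zero

def gini_coefficient(x):
    """Gini coefficient: GMD normalized by twice the mean."""
    return gini_mean_difference(x) / (2.0 * np.mean(x))

incomes = [10, 20, 30, 40, 100]  # toy income data
print(gini_mean_difference(incomes))
print(gini_coefficient(incomes))
```

Unlike the variance, which averages squared deviations from the mean, the GMD averages absolute gaps between observations, which is why it is less sensitive to distributional assumptions such as normality.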
Delving into the connections between renewable energy and economics on an international level, this book focuses specifically on hydropower and geothermal power production for use in the power-intensive industry. It takes readily available government and international statistics to provide insight into how businesses and economists can interpret the factors that influence the growth of power-intensive industries. It also discusses the CarbFix and SulFix projects, which involve the injection of hydrogen sulphide (H2S) and carbon dioxide (CO2) back into the reservoir as an emission-reduction method. With improved engineering processes, both types of power generation are increasingly subject to economies of scale. These exciting technological developments have great potential to change the way the world works, as the economy continues to rely so heavily on energy to drive production. Green energy is without question going to be a major factor in our future, so studying it at its nascence is particularly exciting. This book is intended for academic researchers and students interested in current economic and environmental hot topics, as well as people interested in the inner workings of a possible new investment opportunity.
Written for a broad audience, this book offers a comprehensive account of early warning systems for hydro-meteorological disasters such as floods and storms, and for geological disasters such as earthquakes. One major theme is the increasingly important role in early warning systems played by the rapidly evolving fields of space and information technology. The authors, all experts in their respective fields, offer a comprehensive and in-depth insight into the current and future perspectives for early warning systems. The text is aimed at decision-makers in the political arena, scientists, engineers and those responsible for public communication and dissemination of warnings.
This book investigates the existence of stochastic and deterministic convergence of real output per worker and the sources of output (physical capital per worker, human capital per worker, total factor productivity -TFP- and average annual hours worked) in 21 OECD countries over the period 1970-2011. Towards this end, the authors apply a large battery of panel unit root and stationarity tests, all of which are robust to the presence of cross-sectional dependence. The tests fail to provide clear-cut evidence of convergence dynamics either in real GDP per worker or in the series of the sources of output. Due to some limitations associated with second-generation panel unit root and stationarity tests, the authors further use the more flexible PANIC approach, which provides evidence that real GDP per worker, real physical capital per worker, human capital and average annual hours exhibit some degree of deterministic convergence, whereas TFP series display a high degree of stochastic convergence.
This volume addresses advanced DEA methodology and techniques developed for modeling unique new performance evaluation issues. Many numerical examples, real management cases and verbal descriptions make it very valuable for researchers and practitioners.
The "Theory of Macrojustice", introduced by S.-C. Kolm, is a stimulating contribution to the debate on macroeconomic income distribution. The solution called "Equal Labour Income Equalisation" (ELIE) is the result of a three-stage construction: collective agreement on the scheme of labour income redistribution, collective agreement on the degree of equalisation to be chosen in that framework, and individual freedom to exploit one's personal productive capacities (the source of labour income and the sole basis for taxation). This book is organised as a discussion around four complementary themes: philosophical aspects of macrojustice, economic analysis of macrojustice, combination of ELIE with other targeted transfers, and econometric evaluations of ELIE.
Here is an in-depth guide to the most powerful benchmarking technique available for improving service organization performance: Data Envelopment Analysis (DEA). The book outlines DEA as a benchmarking technique, identifies high-cost service units, isolates specific changes for elevating performance to the best-practice level (providing high-quality service at low cost) and, most importantly, guides the improvement process.
This book presents modern developments in time series econometrics that are applied to macroeconomic and financial time series, bridging the gap between methods and realistic applications. It presents the most important approaches to the analysis of time series, which may be stationary or nonstationary. Modelling and forecasting univariate time series is the starting point. For multiple stationary time series, Granger causality tests and vector autoregressive models are presented. As the modelling of nonstationary uni- or multivariate time series is most important for real applied work, unit root and cointegration analysis as well as vector error correction models are a central topic. Tools for analysing nonstationary data are then transferred to the panel framework. Modelling the (multivariate) volatility of financial time series with autoregressive conditional heteroskedastic models is also treated.
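As a small illustration of the blurb's "starting point", here is a hedged sketch (not from the book) of estimating a univariate AR(1) model by ordinary least squares and producing a one-step-ahead forecast, using only NumPy. The simulated series and parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(1): y_t = c + phi * y_{t-1} + eps_t
c_true, phi_true, n = 1.0, 0.6, 500
y = np.empty(n)
y[0] = c_true / (1 - phi_true)  # start at the unconditional mean
for t in range(1, n):
    y[t] = c_true + phi_true * y[t - 1] + rng.normal(scale=0.5)

# OLS regression of y_t on a constant and its own lag
X = np.column_stack([np.ones(n - 1), y[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
c_hat, phi_hat = beta

# One-step-ahead forecast from the fitted model
forecast = c_hat + phi_hat * y[-1]
print(phi_hat, forecast)
```

The same lag-regression idea extends to the vector autoregressions mentioned above, where each series is regressed on lags of all series in the system.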
Spatial Microeconometrics introduces the reader to the basic concepts of spatial statistics, spatial econometrics and the spatial behavior of economic agents at the microeconomic level. Incorporating useful examples and presenting real data and datasets on real firms, the book takes the reader through the key topics in a systematic way. The book outlines the specificities of data that represent a set of interacting individuals with respect to traditional econometrics, which treats their locational choices as exogenous and their economic behavior as independent. In particular, the authors address the consequences of neglecting such important sources of information for statistical inference and how to improve the models' predictive performance. The book presents the theory, clarifies the concepts and instructs the readers on how to perform their own analyses, describing in detail the code required when using the statistical language R. The book is written by leading figures in the field and is completely up to date with the very latest research. It will be invaluable for graduate students and researchers in economic geography, regional science, spatial econometrics, spatial statistics and urban economics.
How do technology, public works projects, mental health, race, gender, mobility, retirement benefits, and macroeconomic policies affect worker well-being? This volume contains fourteen original chapters utilizing the latest econometric techniques to answer this question. The findings include the following: (1) Technology gains explain over half of the decline in U.S. unemployment and over two-thirds of the reduction in U.S. inflation. (2) Universal health coverage would reduce U.S. labor force participation by 3.3%. (3) Blacks respond to regional rather than national changes in schooling rates of return, perhaps implying a more local labor market for blacks than whites. (4) Employee motivation enhances labor force participation, on-the-job training, job satisfaction and earnings. (5) Male and female promotion and quit rates are comparable once one controls for individual and job characteristics. (6) Public works programs designed to increase a worker's skills do not always increase reemployment. And (7) U.S. pension wealth increased about 20%-25% over the last two decades.
One of the best known statisticians of the 20th century, Frederick Mosteller has inspired numerous statisticians and other scientists by his creative approach to statistics and its applications. This volume collects 40 of his most original and influential papers, capturing the variety and depth of his writings. It is hoped that sharing these writings with a new generation of researchers will inspire them to build upon his insights and efforts.
The book examines applications in two disparate fields linked by the importance of valuing information: public health and space. Researchers in the health field have developed some of the most innovative methodologies for valuing information, used to help determine, for example, the value of diagnostics in informing patient treatment decisions. In the field of space, recent applications of value-of-information methods are critical for informing decisions on investment in satellites that collect data about air quality, fresh water supplies, climate and other natural and environmental resources affecting global health and quality of life.
This book contains a systematic analysis of allocation rules related to cost and surplus sharing problems. Broadly speaking, it examines various types of rules for allocating a common monetary value (cost) between individual members of a group (or network) when the characteristics of the problem are somehow objectively given. Without being an advanced text, it offers a comprehensive mathematical analysis of a series of well-known allocation rules. The aim is to provide an overview and synthesis of current knowledge concerning cost and surplus sharing methods. The text is accompanied by a description of several practical cases and numerous examples designed to make the theoretical results easily comprehensible for both students and practitioners alike. The book is based on a series of lectures given at the University of Copenhagen and Copenhagen Business School for graduate students joining the math/econ program. I am indebted to numerous colleagues, conference participants and students who over the years have shaped my approach and interests through collaboration, comments and questions that were greatly inspiring. In particular, I would like to thank Hans Keiding, Maurice Koster, Tobias Markeprand, Juan D. Moreno-Ternero, Hervé Moulin, Bezalel Peleg, Lars Thorlund-Petersen, Jorgen Tind, Mich Tvede and Lars Peter Osterdal.
This book conducts a careful basic theoretical and econometric analysis of the factors determining the real exchange rates of Canada, the U.K., Japan, France and Germany with respect to the United States. The resulting conclusion is that real exchange rates are almost entirely determined by real factors relating to growth and technology, such as oil and commodity prices, international allocations of world investment across countries, and underlying terms-of-trade changes. Unanticipated money supply shocks, calculated in five alternative ways, have virtually no effects. A Blanchard-Quah VAR analysis also indicates that the effects of real shocks predominate over monetary shocks by a wide margin. The implications of these facts for the conduct of monetary policy in countries outside the U.S. are then explored, leading to the conclusion that all countries, to avoid exchange rate overshooting, have tended to automatically follow the same monetary policy as the United States. The history of world monetary policy is reviewed along with the determination of real exchange rates within the Euro Area.
In recent years, as part of the increasing "informationization" of industry and the economy, enterprises have been accumulating vast amounts of detailed data, such as high-frequency transaction data in financial markets and point-of-sale information on individual items in the retail sector. Similarly, vast amounts of data are now available on business networks based on inter-firm transactions and shareholdings. In the past, these types of information were studied only by economists and management scholars. More recently, however, researchers from other fields, such as physics, mathematics, and information sciences, have become interested in this kind of data and, based on novel empirical approaches to searching for regularities and "laws" akin to those in the natural sciences, have produced intriguing results. This book is the proceedings of the international conference THICCAPFA7, titled "New Approaches to the Analysis of Large-Scale Business and Economic Data," held in Tokyo, March 1-5, 2009. The letters THIC denote the Tokyo Tech (Tokyo Institute of Technology)-Hitotsubashi Interdisciplinary Conference. The conference series, titled APFA (Applications of Physics in Financial Analysis), focuses on the analysis of large-scale economic data. It has traditionally brought physicists and economists together to exchange viewpoints and experience (APFA1 in Dublin 1999, APFA2 in Liège 2000, APFA3 in London 2001, APFA4 in Warsaw 2003, APFA5 in Torino 2006, and APFA6 in Lisbon 2007). The aim of the conference is to establish fundamental analytical techniques and data collection methods, taking into account the results from a variety of academic disciplines.
The book investigates the EU preferential trade policy and, in particular, its impact on trade flows from developing countries. It shows that the capability of the "trade as aid" model to deliver its expected benefits to these countries differs crucially between preferential schemes and sectors. The book takes an eclectic but rigorous approach to the econometric analysis by combining different specifications of the gravity model. An in-depth presentation of the gravity model is also included, providing significant insights into the distinctive features of this technique and its state-of-the-art implementation. The evidence produced in the book is extensively applied to the analysis of the EU preferential policies, with substantial suggestions for future improvement. Additional electronic material to replicate the book's analysis (datasets and Gams and Stata 9.0 routines) can be found in the Extra Materials menu on the website of the book.
Connections among different assets, asset classes, portfolios, and the stocks of individual institutions are critical in examining financial markets. Interest in financial markets implies interest in underlying macroeconomic fundamentals. In Financial and Macroeconomic Connectedness, Frank Diebold and Kamil Yilmaz propose a simple framework for defining, measuring, and monitoring connectedness, which is central to finance and macroeconomics. These measures of connectedness are theoretically rigorous yet empirically relevant. The approach to connectedness proposed by the authors is intimately related to the familiar econometric notion of variance decomposition. The full set of variance decompositions from vector autoregressions produces the core of the 'connectedness table.' The connectedness table makes clear how one can begin with the most disaggregated pairwise directional connectedness measures and aggregate them in various ways to obtain total connectedness measures. The authors also show that variance decompositions define weighted, directed networks, so that these proposed connectedness measures are intimately related to key measures of connectedness used in the network literature. After describing their methods in the first part of the book, the authors proceed to characterize daily return and volatility connectedness across major asset (stock, bond, foreign exchange and commodity) markets as well as the financial institutions within the U.S. and across countries since the late 1990s. These specific measures of volatility connectedness show that stock markets played a critical role in spreading the volatility shocks from the U.S. to other countries. Furthermore, while the return connectedness across stock markets increased gradually over time, the volatility connectedness measures were subject to significant jumps during major crisis events. This book examines not only financial connectedness, but also real fundamental connectedness.
In particular, the authors show that global business cycle connectedness is economically significant and time-varying, that the U.S. has disproportionately high connectedness to others, and that pairwise country connectedness is inversely related to bilateral trade surpluses.
This book is a companion to Baltagi's (2008) leading graduate econometrics textbook on panel data, Econometric Analysis of Panel Data, 4th edition. The book guides the student of panel data econometrics by solving exercises in a logical and pedagogical manner, helping the reader understand, learn and apply panel data methods. It is also a helpful tool for those who like to learn by solving exercises and running software to replicate empirical studies. It works as a complementary study guide to Baltagi (2008) and also as a stand-alone book that builds up the reader's confidence in working out difficult exercises in panel data econometrics and applying these methods to empirical work. The exercises start by providing some background information on partitioned regressions and the Frisch-Waugh-Lovell theorem. Then it goes through the basic material on fixed and random effects models in one-way and two-way error components models: basic estimation, tests of hypotheses and prediction. These include maximum likelihood estimation, testing for poolability of the data, testing for the significance of individual and time effects, as well as Hausman's test for correlated effects. It also provides extensions of panel data techniques to serial correlation, spatial correlation, heteroskedasticity, seemingly unrelated regressions, simultaneous equations, dynamic panel models, incomplete panels, measurement error, count panels, rotating panels, limited dependent variables, and non-stationary panels. The book provides several empirical examples that are useful to applied researchers, illustrating them using Stata and EViews and showing the reader how to replicate these studies. The data sets are provided on the Wiley web site: www.wileyeurope.com/college/baltagi.
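To illustrate the fixed effects estimation the blurb refers to, here is a minimal sketch (invented for this page, not from the book) of the within (demeaning) estimator for a one-way error components model, compared against naive pooled OLS on simulated data where the individual effect is correlated with the regressor.

```python
import numpy as np

rng = np.random.default_rng(1)

# Panel of N individuals over T periods with individual effects alpha_i
N, T, beta_true = 50, 10, 2.0
alpha = rng.normal(scale=3.0, size=N)         # unobserved heterogeneity
x = rng.normal(size=(N, T)) + alpha[:, None]  # regressor correlated with alpha
y = alpha[:, None] + beta_true * x + rng.normal(scale=0.5, size=(N, T))

# Within transformation: demean each individual's series, removing alpha_i
x_dm = x - x.mean(axis=1, keepdims=True)
y_dm = y - y.mean(axis=1, keepdims=True)

# Pooled OLS on the demeaned data = fixed-effects (within) estimator
beta_fe = (x_dm * y_dm).sum() / (x_dm ** 2).sum()

# Naive pooled OLS ignoring the individual effects is biased here,
# because alpha_i enters both x and y
beta_pooled = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
print(beta_fe, beta_pooled)
```

The demeaning step is exactly what makes the estimator robust to correlation between the individual effect and the regressors, which is also the situation Hausman's test (mentioned above) is designed to detect.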
In macro-econometrics more attention needs to be paid to the relationships among deterministic trends of different variables, or co-trending, especially when economic growth is of concern. The number of relationships, i.e., the co-trending rank, plays an important role in evaluating the veracity of propositions, particularly relating to the Japanese economic growth in view of the structural changes involved within it. This book demonstrates how to determine the co-trending rank from a given set of time series data for different variables. At the same time, the method determines how many of the co-trending relations also represent cointegrations. This enables us to perform statistical inference on the parameters of relations among the deterministic trends. Co-trending is an important contribution to the fields of econometric methods, macroeconomics, and time series analyses.
Up-to-date coverage of most micro-econometric topics: the first half is parametric, the second half semi- and non-parametric. Many empirical examples and tips on applying econometric theories to data are provided. Essential ideas and steps are shown for most estimators and tests, making the book well suited to both applied and theoretical readers.
Risk and Return in Asian Emerging Markets offers readers a firm insight into the risk and return characteristics of leading Asian emerging market participants by comparing and contrasting behavioral model variables with predictive forecasting methods.
Born of a belief that economic insights should not require much mathematical sophistication, this book proposes novel and parsimonious methods to incorporate ignorance and uncertainty into economic modeling, without complex mathematics. Economics has made great strides over the past several decades in modeling agents' decisions when they are incompletely informed, but many economists believe that there are aspects of these models that are less than satisfactory. Among the concerns are that ignorance is not captured well in most models, that agents' presumed cognitive ability is implausible, and that derived optimal behavior is sometimes driven by the fine details of the model rather than the underlying economics. Compte and Postlewaite lay out a tractable way to address these concerns, and to incorporate plausible limitations on agents' sophistication. A central aspect of the proposed methodology is to restrict the strategies assumed available to agents.
Stochastic Averaging and Extremum Seeking treats methods inspired by attempts to understand the seemingly non-mathematical question of bacterial chemotaxis and their application in other environments. The text presents significant generalizations on existing stochastic averaging theory developed from scratch and necessitated by the need to avoid violation of previous theoretical assumptions by algorithms which are otherwise effective in treating these systems. Coverage is given to four main topics. Stochastic averaging theorems are developed for the analysis of continuous-time nonlinear systems with random forcing, removing prior restrictions on nonlinearity growth and on the finiteness of the time interval. The new stochastic averaging theorems are usable not only as approximation tools but also for providing stability guarantees. Stochastic extremum-seeking algorithms are introduced for optimization of systems without available models. Both gradient- and Newton-based algorithms are presented, offering the user the choice between the simplicity of implementation (gradient) and the ability to achieve a known, arbitrary convergence rate (Newton). The design of algorithms for non-cooperative/adversarial games is described. The analysis of their convergence to Nash equilibria is provided. The algorithms are illustrated on models of economic competition and on problems of the deployment of teams of robotic vehicles. Bacterial locomotion, such as chemotaxis in E. coli, is explored with the aim of identifying two simple feedback laws for climbing nutrient gradients. Stochastic extremum seeking is shown to be a biologically-plausible interpretation for chemotaxis. For the same chemotaxis-inspired stochastic feedback laws, the book also provides a detailed analysis of convergence for models of nonholonomic robotic vehicles operating in GPS-denied environments. 
The book contains block diagrams and several simulation examples, including examples arising from bacterial locomotion, multi-agent robotic systems, and economic market models. Stochastic Averaging and Extremum Seeking will be informative for control engineers from backgrounds in electrical, mechanical, chemical and aerospace engineering and to applied mathematicians. Economics researchers, biologists, biophysicists and roboticists will find the applications examples instructive.
This volume is dedicated to two recent intensive areas of research in the econometrics of panel data, namely nonstationary panels and dynamic panels. It includes a comprehensive survey of the nonstationary panel literature, including panel unit root tests, spurious panel regressions and panel cointegration.
This book aims to meet the growing demand in the field by introducing the basic spatial econometrics methodologies to a wide variety of researchers. It provides a practical guide that illustrates the potential of spatial econometric modelling, discusses problems and solutions, and interprets empirical results.