Welcome to Loot.co.za!
Here is an in-depth guide to the most powerful available benchmarking technique for improving service organization performance - Data Envelopment Analysis (DEA). The book outlines DEA as a benchmarking technique, identifies high-cost service units, isolates the specific changes needed to raise performance to the best-practice level of high-quality service at low cost and, most important, guides the improvement process.
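To give a flavour of the technique described above, here is a minimal sketch with hypothetical service-unit data (not from the book). Under constant returns to scale with a single input and a single output, a unit's CCR efficiency reduces to its output/input ratio divided by the best observed ratio; the general multi-input, multi-output model solves one linear program per unit instead.

```python
# Minimal DEA sketch: single input, single output, constant returns to
# scale. Unit names and figures below are illustrative only.
units = {        # unit: (input, e.g. staff hours; output, e.g. clients served)
    "A": (20.0, 40.0),
    "B": (30.0, 45.0),
    "C": (40.0, 100.0),
    "D": (25.0, 50.0),
}

best = max(y / x for x, y in units.values())           # frontier productivity
efficiency = {u: (y / x) / best for u, (x, y) in units.items()}
print(efficiency)  # unit C defines the frontier (efficiency 1.0)
```

Units below 1.0 are the "high cost" units the blurb refers to; the gap to 1.0 quantifies how much input could be saved at best-practice productivity.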
A careful basic theoretical and econometric analysis of the factors determining the real exchange rates of Canada, the U.K., Japan, France and Germany with respect to the United States is conducted. The resulting conclusion is that real exchange rates are almost entirely determined by real factors relating to growth and technology, such as oil and commodity prices, international allocations of world investment across countries, and underlying terms-of-trade changes. Unanticipated money supply shocks, calculated in five alternative ways, have virtually no effects. A Blanchard-Quah VAR analysis also indicates that the effects of real shocks predominate over monetary shocks by a wide margin. The implications of these facts for the conduct of monetary policy in countries outside the U.S. are then explored, leading to the conclusion that all countries, to avoid exchange rate overshooting, have tended automatically to follow the same monetary policy as the United States. The history of world monetary policy is reviewed, along with the determination of real exchange rates within the Euro Area.
This book contains a systematic analysis of allocation rules related to cost and surplus sharing problems. Broadly speaking, it examines various types of rules for allocating a common monetary value (cost) between individual members of a group (or network) when the characteristics of the problem are somehow objectively given. Without being an advanced text it offers a comprehensive mathematical analysis of a series of well-known allocation rules. The aim is to provide an overview and synthesis of current knowledge concerning cost and surplus sharing methods. The text is accompanied by a description of several practical cases and numerous examples designed to make the theoretical results easily comprehensible for both students and practitioners alike. The book is based on a series of lectures given at the University of Copenhagen and Copenhagen Business School for graduate students joining the math/econ program. I am indebted to numerous colleagues, conference participants and students who over the years have shaped my approach and interests through collaboration, comments and questions that were greatly inspiring. In particular, I would like to thank Hans Keiding, Maurice Koster, Tobias Markeprand, Juan D. Moreno-Ternero, Hervé Moulin, Bezalel Peleg, Lars Thorlund-Petersen, Jorgen Tind, Mich Tvede and Lars Peter Osterdal.
The book examines applications in two disparate fields linked by the importance of valuing information: public health and space. Researchers in the health field have developed some of the most innovative methodologies for valuing information, used to help determine, for example, the value of diagnostics in informing patient treatment decisions. In the field of space, recent applications of value-of-information methods are critical for informing decisions on investment in satellites that collect data about air quality, fresh water supplies, climate and other natural and environmental resources affecting global health and quality of life.
The Handbook is written for academics, researchers, practitioners and advanced graduate students. It has been designed to be read both by those new to or starting out in the field of spatial analysis and by those who are already familiar with it. The chapters have been written in such a way that readers who are new to the field will gain an important overview and insight, while readers who are already practitioners will benefit from the advanced and updated tools, new materials and state-of-the-art developments included. This volume provides an account of the diversity of current and emergent approaches that is not available elsewhere, despite the many excellent journals and textbooks that exist. Most of the chapters are original; a few are reprints from the Journal of Geographical Systems, Geographical Analysis, The Review of Regional Studies and Letters in Spatial and Resource Sciences. We let our contributors develop, from their particular perspectives and insights, their own strategies for mapping the part of the terrain for which they were responsible. As the chapters were submitted, we became the first consumers of the project we had initiated. We gained from the depth, breadth and distinctiveness of our contributors' insights and, in particular, the presence of links between them.
The book investigates the EU preferential trade policy and, in particular, the impact it had on trade flows from developing countries. It shows that the capability of the "trade as aid" model to deliver its expected benefits to these countries crucially differs between preferential schemes and sectors. The book takes an eclectic but rigorous approach to the econometric analysis by combining different specifications of the gravity model. An in-depth presentation of the gravity model is also included, providing significant insights into the distinctive features of this technique and its state-of-art implementation. The evidence produced in the book is extensively applied to the analysis of the EU preferential policies with substantial suggestions for future improvement. Additional electronic material to replicate the book's analysis (datasets and Gams and Stata 9.0 routines) can be found in the Extra Materials menu on the website of the book.
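The gravity model mentioned above can be illustrated with a generic log-linear sketch (this is not the book's dataset or its Gams/Stata routines; all numbers are synthetic). Trade flows are generated with unit GDP elasticities and a distance elasticity of -1, then recovered by OLS on the log-linearized model log T = b0 + b1 log Y_i + b2 log Y_j + b3 log D:

```python
import numpy as np

# Synthetic gravity-model data: true coefficients are [0.5, 1.0, 1.0, -1.0].
rng = np.random.default_rng(0)
n = 200
gdp_i = rng.uniform(1, 100, n)     # exporter GDP (illustrative units)
gdp_j = rng.uniform(1, 100, n)     # importer GDP
dist = rng.uniform(1, 50, n)       # bilateral distance
log_trade = (0.5 + np.log(gdp_i) + np.log(gdp_j) - np.log(dist)
             + rng.normal(0, 0.1, n))

# OLS on the log-linearized gravity equation.
X = np.column_stack([np.ones(n), np.log(gdp_i), np.log(gdp_j), np.log(dist)])
beta, *_ = np.linalg.lstsq(X, log_trade, rcond=None)
print(beta)  # roughly [0.5, 1.0, 1.0, -1.0]
```

The book's contribution lies in comparing alternative specifications of this baseline (fixed effects, sectoral disaggregation, preference dummies), which the sketch above deliberately omits.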
Up-to-date coverage of most micro-econometric topics: the first half of the book is parametric, the second half semi- and non-parametric. Many empirical examples and tips for applying econometric theories to data are included, and the essential ideas and steps are shown for most estimators and tests, making the book well suited to both applied and theoretical readers.
In macro-econometrics more attention needs to be paid to the relationships among deterministic trends of different variables, or co-trending, especially when economic growth is of concern. The number of relationships, i.e., the co-trending rank, plays an important role in evaluating the veracity of propositions, particularly relating to the Japanese economic growth in view of the structural changes involved within it. This book demonstrates how to determine the co-trending rank from a given set of time series data for different variables. At the same time, the method determines how many of the co-trending relations also represent cointegrations. This enables us to perform statistical inference on the parameters of relations among the deterministic trends. Co-trending is an important contribution to the fields of econometric methods, macroeconomics, and time series analyses.
In January 2005, the German government enacted a substantial reform of the welfare system, the so-called "Hartz IV reform". This book evaluates key characteristics of the reform from a microeconometric perspective. It investigates whether a centralized or decentralized organization of welfare administration is more successful at integrating welfare recipients into employment. Moreover, it analyzes the employment effects of an intensified use of benefit sanctions and evaluates the effectiveness and efficiency of the most frequently assigned Active Labor Market Programs. The analyses focus on immigrants, who are highly over-represented in the German welfare system.
This book is intended to provide the reader with a firm conceptual and empirical understanding of basic information-theoretic econometric models and methods. Because most data are observational, practitioners work with indirect noisy observations and ill-posed econometric models in the form of stochastic inverse problems. Consequently, traditional econometric methods are in many cases not applicable for answering many of the quantitative questions that analysts wish to ask. After initial chapters deal with parametric and semiparametric linear probability models, the focus turns to solving nonparametric stochastic inverse problems. In succeeding chapters, a family of power divergence measure likelihood functions is introduced for a range of traditional and nontraditional econometric-model problems. Finally, within either an empirical maximum likelihood or loss context, Ron C. Mittelhammer and George G. Judge suggest a basis for choosing a member of the divergence family.
Risk and Return in Asian Emerging Markets offers readers a firm insight into the risk and return characteristics of leading Asian emerging market participants by comparing and contrasting behavioral model variables with predictive forecasting methods.
This volume is dedicated to two areas of intensive recent research in the econometrics of panel data, namely nonstationary panels and dynamic panels. It includes a comprehensive survey of the nonstationary panel literature, including panel unit root tests, spurious panel regressions and panel cointegration tests.
This book, which was first published in 1980, is concerned with one particular branch of growth theory, namely descriptive growth theory. It is typically assumed in growth theory that both the factor and goods markets are perfectly competitive. In particular this implies, amongst other things, that the reward to each factor is identical in each sector of the economy. In this book the assumption of identical factor rewards is relaxed and the implications of an intersectoral wage differential for economic growth are analysed. There is also some discussion of the short-run and long-run effects of minimum wage legislation on growth. This book will serve as key reading for students of economics.
Stochastic Averaging and Extremum Seeking treats methods inspired by attempts to understand the seemingly non-mathematical question of bacterial chemotaxis, and their application in other environments. The text presents significant generalizations of existing stochastic averaging theory, developed from scratch because algorithms that are otherwise effective in treating these systems violate the assumptions of the prior theory. Coverage is given to four main topics. First, stochastic averaging theorems are developed for the analysis of continuous-time nonlinear systems with random forcing, removing prior restrictions on nonlinearity growth and on the finiteness of the time interval; the new theorems are usable not only as approximation tools but also for providing stability guarantees. Second, stochastic extremum-seeking algorithms are introduced for the optimization of systems without available models; both gradient- and Newton-based algorithms are presented, offering the user a choice between simplicity of implementation (gradient) and the ability to achieve a known, arbitrary convergence rate (Newton). Third, the design of algorithms for non-cooperative/adversarial games is described, together with an analysis of their convergence to Nash equilibria; the algorithms are illustrated on models of economic competition and on problems of deploying teams of robotic vehicles. Fourth, bacterial locomotion, such as chemotaxis in E. coli, is explored with the aim of identifying two simple feedback laws for climbing nutrient gradients; stochastic extremum seeking is shown to be a biologically plausible interpretation of chemotaxis. For the same chemotaxis-inspired stochastic feedback laws, the book also provides a detailed analysis of convergence for models of nonholonomic robotic vehicles operating in GPS-denied environments.
The book contains block diagrams and several simulation examples, including examples arising from bacterial locomotion, multi-agent robotic systems, and economic market models. Stochastic Averaging and Extremum Seeking will be informative for control engineers from backgrounds in electrical, mechanical, chemical and aerospace engineering and to applied mathematicians. Economics researchers, biologists, biophysicists and roboticists will find the applications examples instructive.
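The gradient-based extremum-seeking idea described above can be sketched in a few lines. This is the classical deterministic, sinusoidal-perturbation variant (the book's algorithms replace the sinusoid with stochastic perturbations), and the map J, its peak at theta = 2, and all gains are illustrative assumptions:

```python
import math

# Minimal gradient-based extremum-seeking loop. The unknown map J is
# probed with a sinusoidal dither; demodulating the measurement by the
# same sinusoid yields a gradient estimate, driving theta to the peak.
def J(theta):
    return -(theta - 2.0) ** 2   # unknown map, maximum at theta = 2

a, w, k = 0.3, 50.0, 2.0   # dither amplitude, dither frequency, adaptation gain
dt, T = 1e-3, 15.0
theta, t = 0.0, 0.0
while t < T:
    dither = math.sin(w * t)
    y = J(theta + a * dither)     # probe the map with a perturbed input
    theta += dt * k * dither * y  # dither * y averages to (a/2) * J'(theta)
    t += dt
print(theta)  # settles near the maximizer theta = 2
```

No model of J is used anywhere in the loop, which is the point of extremum seeking: only measurements of the perturbed output are needed.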
Although geometry has always aided intuition in econometrics, more recently differential geometry has become a standard tool in the analysis of statistical models, offering a deeper appreciation of existing methodologies and highlighting the essential issues which can be hidden in an algebraic development of a problem. Originally published in 2000, this volume was an early example of the application of these techniques to econometrics. An introductory chapter provides a brief tutorial for those unfamiliar with the tools of Differential Geometry. The topics covered in the following chapters demonstrate the power of the geometric method to provide practical solutions and insight into problems of econometric inference.
This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of the size of countries, reflecting the key economic characteristics of economies, in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration by country size, using a data set of 218 countries, 45 of which are European.
This book aims at meeting the growing demand in the field by introducing the basic spatial econometrics methodologies to a wide variety of researchers. It provides a practical guide that illustrates the potential of spatial econometric modelling, discusses problems and solutions and interprets empirical results.
Creating a Eurasian Union offers a detailed analysis of the economies of the Customs Union of Russia, Belarus, and Kazakhstan and the proposed Eurasian Union. The authors employ econometric analysis of business cycles and cointegration analysis to prove the fragility of the union's potential economic success. By providing a brief description of the economic integration of the former Soviet republics, this pioneering work analyses the on-going trial and error processes of market integration led by Russia. Vymyatnina and Antonova's distinctive argument is the first consistent analysis of the emerging Eurasian Union. They incorporate both a non-technical summary of the integration process and previous research and analytical comments, as well as a thorough empirical analysis of the real data on the economic development of the participating countries, to caution that the speed of integration might undermine the feasibility of the Eurasian Union.
Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful applications of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important gap in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov-network-based EDAs are reviewed in the book. Current research trends and future perspectives on the enhancement and applicability of EDAs are also covered. The contributions included in the book address topics as relevant as the application of probabilistic fitness models, the use of belief propagation algorithms in EDAs and the application of Markov-network-based EDAs to real-world optimization problems. The book should be of interest to researchers and practitioners from areas such as optimization, evolutionary computation and machine learning.
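For readers unfamiliar with EDAs, a minimal sketch helps: UMDA is the univariate baseline against which the Markov-network EDAs in the book are motivated (it models each variable independently, capturing no interactions). The toy objective here is OneMax, maximizing the number of 1-bits in a 20-bit string, and all parameters are illustrative:

```python
import random

# UMDA: sample a population from a univariate Bernoulli model, select the
# best individuals, re-estimate the model from them, and repeat.
random.seed(1)
n_bits, pop_size, n_elite, n_gens = 20, 100, 30, 40
p = [0.5] * n_bits   # one marginal probability per bit

def sample():
    return [1 if random.random() < p[i] else 0 for i in range(n_bits)]

for _ in range(n_gens):
    elite = sorted((sample() for _ in range(pop_size)),
                   key=sum, reverse=True)[:n_elite]
    # Re-estimate the model from the selected individuals, clamping the
    # probabilities away from 0 and 1 to preserve diversity.
    p = [min(0.95, max(0.05, sum(x[i] for x in elite) / n_elite))
         for i in range(n_bits)]

print(sum(p))  # the model concentrates near 1 on every bit
```

A Markov-network EDA replaces the independent Bernoulli marginals with an undirected graphical model estimated from the selected individuals, which is exactly the extension the book develops.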
The Analytic Hierarchy Process (AHP) is a prominent and powerful tool for making decisions in situations involving multiple objectives. Models, Methods, Concepts and Applications of the Analytic Hierarchy Process, 2nd Edition applies the AHP in order to solve problems focused on the following three themes: economics, the social sciences, and the linking of measurement with human values. For economists, the AHP offers a substantially different approach to dealing with economic problems through ratio scales. Psychologists and political scientists can use the methodology to quantify and derive measurements for intangibles. Meanwhile, researchers in the physical and engineering sciences can apply the AHP methods to help resolve conflicts between hard measurement data and human values. Throughout the book, each of these topics is explored utilizing real-life models and examples relevant to problems in today's society. This new edition has been updated and includes five new chapters, with discussions of the following: the eigenvector and why it is necessary; a summary of ongoing research in the Middle East that brings together Israeli and Palestinian scholars to develop concessions from both parties; and a look at the Medicare crisis and how the AHP can be used to understand the problems and help develop ideas to solve them.
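The eigenvector mentioned above is the computational core of the AHP. A minimal sketch of Saaty's eigenvector method, with hypothetical numbers: a reciprocal pairwise comparison matrix for three criteria, whose principal eigenvector yields the priority weights, plus the consistency index CI = (lambda_max - n) / (n - 1):

```python
import numpy as np

# Hypothetical reciprocal comparison matrix: criterion 1 is judged 3x as
# important as criterion 2 and 5x as important as criterion 3, etc.
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 0.5, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)           # principal eigenvalue lambda_max
w = eigvecs[:, k].real
w = w / w.sum()                       # normalized priority weights

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)  # near 0 for a consistent matrix
print(w, CI)
```

The weights form a ratio scale: here criterion 1 receives the largest priority, and the near-zero CI confirms the judgments are mutually consistent.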
This book presents a concise introduction to Bartlett and Bartlett-type corrections of statistical tests and to bias correction of point estimators. The underlying idea behind both groups of corrections is to obtain higher accuracy in small samples. While the main focus is on corrections that can be analytically derived, the authors also present alternative strategies for improving estimators and tests based on the bootstrap, a data resampling technique, and discuss concrete applications to several important statistical models.
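The bootstrap route to bias correction can be sketched on a textbook case (synthetic data, not an example from the book): the plug-in variance estimator sum((x - xbar)^2) / n has bias -sigma^2 / n, and resampling estimates that bias without any analytic derivation:

```python
import numpy as np

# Bootstrap bias correction of the plug-in variance estimator.
rng = np.random.default_rng(42)
x = rng.normal(0.0, 2.0, size=30)   # true variance sigma^2 = 4

def plug_in_var(sample):
    return np.mean((sample - sample.mean()) ** 2)

theta_hat = plug_in_var(x)

# Bootstrap bias estimate: E*[theta*] - theta_hat, by resampling x.
B = 2000
boot = np.array([plug_in_var(rng.choice(x, size=x.size, replace=True))
                 for _ in range(B)])
bias_hat = boot.mean() - theta_hat
theta_corrected = theta_hat - bias_hat   # = 2 * theta_hat - boot.mean()
print(theta_hat, theta_corrected)        # the corrected estimate is larger
```

Here the estimated bias recovers the analytic value -theta_hat/n up to Monte Carlo error, which is the sense in which the bootstrap substitutes for an analytic correction.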
Published in 1932, this is the third edition of an original 1922 volume. The 1922 volume was, in turn, created as the replacement for the Institute of Actuaries Textbook, Part Three, which was the foremost source of knowledge on the subject of life contingencies for over 35 years. Assuming a high level of mathematical knowledge on the part of the reader, it was aimed chiefly at actuarial students and those with a professional interest in the relationship between statistics and mortality. Highly organised and containing numerous mathematical formulae, this book will remain of value to anyone with an interest in risk calculation and the development of the insurance industry.
Applied econometricians are often faced with data that are less than ideal. The data may be observed with gaps, a model may suggest variables that are observed at different frequencies, and sometimes econometric results are very fragile to the inclusion or omission of just a few observations in the sample. Papers in this volume discuss new econometric techniques for addressing these problems.
Figure 1.1: Map of Great Britain at two different scale levels: (a) counties, (b) regions. Figure 1.2: Two alternative aggregations of the Italian provincie into 32 larger areas. Figure 1.3: Percentage of votes for the Communist Party in the 1987 Italian political elections (a) and percentage of population over 75 years in the 1981 Italian Census (b), in 32 polling districts; the polling districts with values above the average are shaded. Figure 1.4: First-order neighbours (a) and second-order neighbours (b) of a reference area. While there are several other problems relating to the analysis of areal data, the problem of estimating a spatial correlogram merits special attention. The concept of the correlogram has been borrowed in the spatial literature from time series analysis. Figure 1.4a shows the first-order neighbours of a reference area, while Figure 1.4b displays the second-order neighbours of the same area. Higher-order neighbours can be defined in a similar fashion. While it is clear that the dependence is strongest between immediate neighbouring areas, a certain degree of dependence may be present among higher-order neighbours. This has been shown to be an alternative way of looking at the scale problem (Cliff and Ord, 1981, p. 123). However, unlike the case of a time series, where each observation depends only on past observations, here dependence extends in all directions.
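The neighbour orders underlying the spatial correlogram can be computed mechanically. A sketch on a hypothetical 3x3 rook-contiguity lattice (not the book's data): an order-2 neighbour is an area reachable in two contiguity steps but not in one, excluding the area itself.

```python
import numpy as np

# First- and second-order neighbours from a contiguity matrix W.
n = 3
def idx(r, c):
    return r * n + c

W = np.zeros((n * n, n * n), dtype=int)   # first-order (rook) contiguity
for r in range(n):
    for c in range(n):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n:
                W[idx(r, c), idx(rr, cc)] = 1

second = ((W @ W) > 0).astype(int) * (W == 0)   # 2-step reachable, not 1-step
np.fill_diagonal(second, 0)                     # an area is not its own neighbour

center = idx(1, 1)
print(W[center].sum(), second[center].sum())    # 4 first-order, 4 second-order
```

A spatial correlogram then plots a dependence measure (e.g. spatial autocorrelation) against neighbour order, using each order's matrix as the weighting scheme.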
Taxpayer compliance is a voluntary activity, and the degree to which the tax system works is affected by taxpayers' knowledge that it is their moral and legal responsibility to pay their taxes. Taxpayers also recognize that they face a lottery in which not all taxpayer noncompliance will ever be detected. In the United States most individuals comply with the tax law, yet the tax gap has grown significantly over time for individual taxpayers. The US Internal Revenue Service attempts to ensure that the minority of taxpayers who are noncompliant pay their fair share with a variety of enforcement tools and penalties. The Causes and Consequences of Income Tax Noncompliance provides a comprehensive summary of the empirical evidence concerning taxpayer noncompliance and presents innovative research with new results on the role of IRS audit and enforcement activities in compliance with federal and state income tax collection. Other issues examined include the degree to which taxpayers respond to the threat of civil and criminal enforcement and the important role of the media in taxpayer compliance. This book offers researchers, students, and tax administrators insight into the allocation of taxpayer compliance enforcement and service resources, and suggests policies that will prevent further increases in the tax gap. The book's aggregate data analysis methods have practical applications not only to taxpayer compliance but also to other forms of economic behavior, such as welfare fraud.