How do technology, public works projects, mental health, race, gender, mobility, retirement benefits, and macroeconomic policies affect worker well-being? This volume contains fourteen original chapters that use the latest econometric techniques to answer this question. The findings include the following: (1) Technology gains explain over half of the decline in U.S. unemployment and over two-thirds of the reduction in U.S. inflation. (2) Universal health coverage would reduce U.S. labor force participation by 3.3%. (3) Blacks respond to regional rather than national changes in the returns to schooling, perhaps implying a more local labor market for blacks than for whites. (4) Employee motivation enhances labor force participation, on-the-job training, job satisfaction, and earnings. (5) Male and female promotion and quit rates are comparable once one controls for individual and job characteristics. (6) Public works programs designed to increase a worker's skills do not always increase reemployment. And (7) U.S. pension wealth increased about 20%-25% over the last two decades.
Models of the American economy exist in government, research institutes, universities, and private corporations. Given this proliferation, it is wise to take stock, because these models come from diverse sources and describe different conditions from alternative points of view; they could be saying different things about the economy. The high-level comparative studies in this volume, gathered from several issues of the International Economic Review, with a substantive introduction and the addition of more comparative material, evaluate the performance of eleven models of the American economy: the Wharton Mark III Model; Brookings Model; Hickman-Coen Annual Model; Liu-Hwa Monthly Model; Data Resources, Inc. (DRI) Model; Federal Reserve Bank of St. Louis Model; Michigan Quarterly Econometric (MQEM) Model; Wharton Annual and Industry Model; Anticipation Version of the Wharton Mark III Model; Fair Model; and U.S. Department of Commerce (BEA) Model. Each of the proprietors or builders of these models describes his own system in his own words. These studies come closer than ever before to standardizing model operations for testing purposes. Some of the models are monthly, while others are annual, but the quarterly unit of time is the most frequent. Some are demand oriented; others are supply oriented and focus on the input-output sectors of the economy. Some use only observed, objective data; others use subjective, anticipatory data. Both large and small models are included. In spite of the diversity, the contributors have cooperated to trace the differences between their models to root causes and to report jointly the results of their research. There are also some general papers that look at model performance from outside the CEME group.
The Handbook is written for academics, researchers, practitioners, and advanced graduate students. It has been designed to be read both by those new to the field of spatial analysis and by those who are already familiar with it. The chapters have been written in such a way that readers who are new to the field will gain an important overview and insight, while readers who are already practitioners will gain from the advanced and updated tools, new materials, and state-of-the-art developments included. This volume provides an accounting of the diversity of current and emergent approaches, not available elsewhere despite the many excellent journals and textbooks that exist. Most of the chapters are original; a few are reprints from the Journal of Geographical Systems, Geographical Analysis, The Review of Regional Studies and Letters in Spatial and Resource Sciences. We let our contributors develop, from their particular perspectives and insights, their own strategies for mapping the part of the terrain for which they were responsible. As the chapters were submitted, we became the first consumers of the project we had initiated. We gained from the depth, breadth, and distinctiveness of our contributors' insights and, in particular, the links between them.
Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful applications of machine learning methods to optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation, and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network-based EDAs are reviewed in the book, and current research trends and future perspectives on the enhancement and applicability of EDAs are also covered. The contributions included in the book address topics as relevant as the application of probabilistic fitness models, the use of belief propagation algorithms in EDAs, and the application of Markov network-based EDAs to real-world optimization problems. The book should be of interest to researchers and practitioners from areas such as optimization, evolutionary computation, and machine learning.
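To make the EDA loop concrete, here is a minimal sketch of the simplest member of the family, a univariate marginal EDA (UMDA) run on the OneMax toy problem. The population sizes, smoothing factor, and objective are invented for illustration; the Markov network EDAs the book covers replace the independent-bit model used here with an undirected graphical model that captures variable interactions.

```python
# Minimal UMDA sketch on OneMax (maximise the number of ones in a bit string).
# All parameter values are illustrative, not taken from the book.
import numpy as np

rng = np.random.default_rng(0)
n_bits, pop, elite, gens = 30, 100, 30, 40
p = np.full(n_bits, 0.5)                              # initial marginal model

for g in range(gens):
    X = (rng.random((pop, n_bits)) < p).astype(int)   # sample a population
    fitness = X.sum(axis=1)                           # OneMax objective
    best = X[np.argsort(fitness)[-elite:]]            # select the elite
    p = 0.9 * best.mean(axis=0) + 0.1 * p             # re-estimate (smoothed)

print("best fitness:", fitness.max(), "of", n_bits)
```

The smoothing step is one common way to keep the model from collapsing prematurely; a Markov network EDA would instead estimate and sample an undirected model over the selected individuals at this step.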
The Analytic Hierarchy Process (AHP) is a prominent and powerful tool for making decisions in situations involving multiple objectives. Models, Methods, Concepts and Applications of the Analytic Hierarchy Process, 2nd Edition applies the AHP to problems centred on three themes: economics, the social sciences, and the linking of measurement with human values. For economists, the AHP offers a substantially different approach to economic problems through ratio scales. Psychologists and political scientists can use the methodology to quantify and derive measurements for intangibles. Meanwhile, researchers in the physical and engineering sciences can apply AHP methods to help resolve conflicts between hard measurement data and human values. Throughout the book, each of these topics is explored using real-life models and examples relevant to problems in today's society. This new edition has been updated and includes five new chapters, with discussions of the following: the eigenvector and why it is necessary; a summary of ongoing research in the Middle East that brings together Israeli and Palestinian scholars to develop concessions from both parties; and a look at the Medicare crisis and how the AHP can be used to understand the problems and help develop ideas to solve them.
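Since one of the new chapters concerns the eigenvector, a minimal sketch of how AHP priority weights are derived from a pairwise comparison matrix via its principal eigenvector may help; the 3x3 judgment matrix below is invented, and the consistency-ratio computation follows Saaty's standard recipe rather than anything specific to this edition.

```python
# AHP priority weights from a reciprocal pairwise comparison matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])              # invented judgments

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # normalised priority weights
lambda_max, n = eigvals[k].real, A.shape[0]
CI = (lambda_max - n) / (n - 1)              # consistency index
CR = CI / 0.58                               # 0.58 = Saaty's random index, n = 3
print("weights:", w.round(3), "consistency ratio:", round(CR, 3))
```

A consistency ratio below roughly 0.10 is conventionally taken to mean the judgments are coherent enough for the derived ratio-scale weights to be meaningful.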
The interaction between mathematicians and statisticians has proved to be an effective approach to the analysis of insurance and financial problems, particularly from an operative perspective. The MAF2006 conference, held at the University of Salerno in 2006, had precisely this purpose, and the collection published here gathers some of the papers presented at the conference and subsequently revised for publication. They cover a wide variety of subjects in the insurance and financial fields.
This book, first published in 1980, is concerned with one particular branch of growth theory, namely descriptive growth theory. Growth theory typically assumes that both the factor and goods markets are perfectly competitive; in particular, this implies, among other things, that the reward to each factor is identical in each sector of the economy. In this book the assumption of identical factor rewards is relaxed and the implications of an intersectoral wage differential for economic growth are analysed. There is also some discussion of the short-run and long-run effects of minimum wage legislation on growth. This book will serve as key reading for students of economics.
Figure 1.1: Map of Great Britain at two different scale levels: (a) counties, (b) regions. Figure 1.2: Two alternative aggregations of the Italian provincie into 32 larger areas. Figure 1.3: Percentage of votes for the Communist Party in the 1987 Italian political elections (a) and percentage of population over 75 years in the 1981 Italian Census (b), in 32 polling districts; polling districts with values above the average are shaded. Figure 1.4: First-order neighbours (a) and second-order neighbours (b) of a reference area. While there are several other problems relating to the analysis of areal data, the problem of estimating a spatial correlogram merits special attention. The concept of the correlogram has been borrowed in the spatial literature from time series analysis. Figure 1.4a shows the first-order neighbours of a reference area, while Figure 1.4b displays the second-order neighbours of the same area. Higher-order neighbours can be defined in a similar fashion. While it is clear that the dependence is strongest between immediate neighbouring areas, a certain degree of dependence may be present among higher-order neighbours. This has been shown to be an alternative way of looking at the scale problem (Cliff and Ord, 1981, p. 123). However, unlike the case of a time series, where each observation depends only on past observations, here dependence extends in all directions.
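The neighbour-order construction lends itself to a short computational sketch: k-th order neighbours can be read off powers of the first-order contiguity matrix, and a spatial correlogram then records an autocorrelation statistic at each order (Moran's I is used below as a representative choice). The four-area line example and its values are invented.

```python
# Neighbour orders from a contiguity matrix, and Moran's I at each order.
import numpy as np

def neighbour_order(W1, k):
    """0/1 matrix of exactly k-th order neighbours from first-order contiguity W1."""
    I = np.eye(len(W1))
    reach_k = np.linalg.matrix_power(W1 + I, k) > 0      # within k steps
    reach_km1 = np.linalg.matrix_power(W1 + I, k - 1) > 0
    return (reach_k & ~reach_km1).astype(float)          # exactly k steps away

def morans_i(x, W):
    z = x - x.mean()
    return len(x) / W.sum() * (z @ W @ z) / (z @ z)

# Four areas on a line, 0-1-2-3, with rook contiguity.
W1 = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
x = np.array([1.0, 2.0, 4.0, 8.0])
for k in (1, 2):
    print(f"order {k}: Moran's I = {morans_i(x, neighbour_order(W1, k)):.3f}")
```

Plotting Moran's I against neighbour order gives the spatial correlogram the passage describes; unlike a time-series correlogram, the "lags" here radiate in all directions.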
Taxpayer compliance is a voluntary activity, and the degree to which the tax system works is affected by taxpayers' knowledge that it is their moral and legal responsibility to pay their taxes. Taxpayers also recognize that they face a lottery in which not all taxpayer noncompliance will ever be detected. In the United States most individuals comply with the tax law, yet the tax gap has grown significantly over time for individual taxpayers. The US Internal Revenue Service attempts to ensure that the minority of taxpayers who are noncompliant pay their fair share with a variety of enforcement tools and penalties. The Causes and Consequences of Income Tax Noncompliance provides a comprehensive summary of the empirical evidence concerning taxpayer noncompliance and presents innovative research with new results on the role of IRS audit and enforcement activities in compliance with federal and state income tax collection. Other issues examined include the degree to which taxpayers respond to the threat of civil and criminal enforcement and the important role of the media in taxpayer compliance. This book offers researchers, students, and tax administrators insight into the allocation of taxpayer compliance enforcement and service resources, and suggests policies that will prevent further increases in the tax gap. The book's aggregate data analysis methods have practical applications not only to taxpayer compliance but also to other forms of economic behavior, such as welfare fraud.
In January 2005, the German government enacted a substantial reform of the welfare system, the so-called "Hartz IV reform". This book evaluates key characteristics of the reform from a microeconometric perspective. It investigates whether a centralized or a decentralized organization of welfare administration is more successful in integrating welfare recipients into employment. Moreover, it analyzes the employment effects of an intensified use of benefit sanctions and evaluates the effectiveness and efficiency of the most frequently assigned Active Labor Market Programs. The analyses focus on immigrants, who are heavily over-represented in the German welfare system.
First published in 2004, this is a rigorous but user-friendly book on the application of stochastic control theory to economics. A distinctive feature of the book is that mathematical concepts are introduced in a language and terminology familiar to graduate students of economics. The standard topics of many mathematics, economics and finance books are illustrated with real examples documented in the economic literature. Moreover, the book emphasises the dos and don'ts of stochastic calculus, cautioning the reader that certain results and intuitions cherished by many economists do not extend to stochastic models. A special chapter (Chapter 5) is devoted to exploring various methods of finding a closed-form representation of the value function of a stochastic control problem, which is essential for ascertaining the optimal policy functions. The book also includes many practice exercises for the reader. Notes and suggested readings at the end of each chapter point to further references and possible extensions.
Born of a belief that economic insights should not require much mathematical sophistication, this book proposes novel and parsimonious methods to incorporate ignorance and uncertainty into economic modeling, without complex mathematics. Economics has made great strides over the past several decades in modeling agents' decisions when they are incompletely informed, but many economists believe that there are aspects of these models that are less than satisfactory. Among the concerns are that ignorance is not captured well in most models, that agents' presumed cognitive ability is implausible, and that derived optimal behavior is sometimes driven by the fine details of the model rather than the underlying economics. Compte and Postlewaite lay out a tractable way to address these concerns, and to incorporate plausible limitations on agents' sophistication. A central aspect of the proposed methodology is to restrict the strategies assumed available to agents.
The productivity of a business exerts an important influence on its financial performance. A similar influence exists for industries and economies: those with superior productivity performance thrive at the expense of others. Productivity performance helps explain the growth and demise of businesses and the relative prosperity of nations. Productivity Accounting: The Economics of Business Performance offers an in-depth analysis of variation in business performance, providing the reader with an analytical framework within which to account for this variation and its causes and consequences. The primary focus is the individual business, and the principal consequence of business productivity performance is business financial performance. Alternative measures of financial performance are considered, including profit, profitability, cost, unit cost, and return on assets. Combining analytical rigor with empirical illustrations, the analysis draws on wide-ranging literatures, both historical and current, from business and economics, and explains how businesses create value and distribute it.
A careful basic theoretical and econometric analysis of the factors determining the real exchange rates of Canada, the U.K., Japan, France and Germany with respect to the United States is conducted. The resulting conclusion is that real exchange rates are almost entirely determined by real factors relating to growth and technology, such as oil and commodity prices, international allocations of world investment across countries, and underlying terms of trade changes. Unanticipated money supply shocks, calculated in five alternative ways, have virtually no effects. A Blanchard-Quah VAR analysis also indicates that the effects of real shocks predominate over monetary shocks by a wide margin. The implications of these facts for the conduct of monetary policy in countries outside the U.S. are then explored, leading to the conclusion that all countries, to avoid exchange rate overshooting, have tended to automatically follow the same monetary policy as the United States. The history of world monetary policy is reviewed, along with the determination of real exchange rates within the Euro Area.
Here is an in-depth guide to the most powerful benchmarking technique available for improving service organization performance: Data Envelopment Analysis (DEA). The book outlines DEA as a benchmarking technique, identifies high-cost service units, isolates the specific changes needed to elevate performance to the best-practice level of high-quality service at low cost and, most important, guides the improvement process.
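To make concrete what DEA computes, here is a minimal sketch of the standard input-oriented CCR envelopment model solved as a linear program; the three-unit dataset is invented, and the formulation is the textbook one rather than any variant specific to this guide.

```python
# Input-oriented CCR DEA: for each unit o, find the smallest theta such that
# a convex-cone combination of all units uses at most theta * o's inputs
# while producing at least o's outputs.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Efficiency of unit o. X is (n_units, n_inputs), Y is (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)          # decision vector z = [theta, lambda_1..lambda_n]
    c[0] = 1.0                   # minimise theta
    # Inputs:  sum_j lambda_j x_ji - theta * x_oi <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Outputs: -sum_j lambda_j y_jr <= -y_or  (i.e. outputs at least y_o)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.hstack([np.zeros(m), -Y[o]]),
                  bounds=[(0, None)] * (n + 1))
    return res.fun               # theta = 1 means o lies on the frontier

X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 6.0]])   # two inputs, three units
Y = np.array([[1.0], [1.0], [1.0]])                  # one normalised output
for o in range(3):
    print(f"unit {o}: efficiency {dea_ccr_efficiency(X, Y, o):.3f}")
```

Units with theta below one are the high-cost units the blurb mentions, and the optimal lambda weights identify the best-practice peers they should be benchmarked against.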
In economics, many quantities are related to each other. Such economic relations are often much more complex than relations in science and engineering, where some quantities are independent and the relation between others can be well approximated by linear functions. To make economic models more adequate, we need more accurate techniques for describing dependence. Such techniques are currently being developed. This book contains descriptions of state-of-the-art techniques for modeling dependence and of economic applications of these techniques.
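One widely used technique of this kind is the copula, which specifies the dependence structure separately from the marginal distributions; the sketch below joins two invented non-normal marginals through a Gaussian copula and is meant only as a representative illustration, not as the book's own method.

```python
# Gaussian copula sketch: nonlinear dependence between non-normal marginals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rho = 0.7
z = rng.multivariate_normal([0, 0], [[1.0, rho], [rho, 1.0]], size=2000)
u = stats.norm.cdf(z)                        # map to uniform "copula" scale
income = stats.lognorm(s=0.6).ppf(u[:, 0])   # heavy-tailed marginal (invented)
spending = stats.gamma(a=2.0).ppf(u[:, 1])   # skewed marginal (invented)
rho_s, _ = stats.spearmanr(income, spending)
print("rank correlation:", round(rho_s, 3))
```

The rank correlation survives the nonlinear marginal transformations even though an ordinary linear correlation would understate the dependence, which is precisely why such techniques matter for economic modeling.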
Originally published in 1871, "The Theory of Political Economy" was the first text to introduce the concept of utility theory into economics and the use of calculus as a way to simplify and present economic problems. In this classic work Jevons reformulated the central problem of economics as one of how to maximise overall utility with a given amount of means of production. Utility expressed as a function became the basis of a new theory of value which differs substantially from the classical political economists' labour theory of value. Drawing on his roots in the natural sciences, Jevons revolutionised the tools and methods associated with political economy and kick-started the metamorphosis of the discipline.
The availability of financial data recorded at high frequency has inspired a research area which over the last decade has emerged as a major field in econometrics and statistics. The growing popularity of high-frequency econometrics is driven by technological progress in trading systems and the increasing importance of intraday trading, liquidity risk, optimal order placement, and high-frequency volatility. This book provides a state-of-the-art overview of the major approaches in high-frequency econometrics, including univariate and multivariate autoregressive conditional mean approaches for different types of high-frequency variables, intensity-based approaches for financial point processes, and dynamic factor models. It discusses implementation details, provides insights into properties of high-frequency data as well as institutional settings, and presents applications to volatility and liquidity estimation, order book modelling, and market microstructure analysis.
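A canonical example of the autoregressive conditional mean approach is the ACD model of Engle and Russell for trade durations; the short simulation below, with invented parameter values, shows the recursion at the core of an ACD(1,1).

```python
# Simulate an ACD(1,1) duration process:
#   psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1},   x_i = psi_i * eps_i,
# with eps_i ~ Exp(1). Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
omega, alpha, beta = 0.1, 0.1, 0.8          # stationarity needs alpha + beta < 1
n = 1000
psi, x = np.empty(n), np.empty(n)
psi[0] = omega / (1 - alpha - beta)          # unconditional mean duration
x[0] = psi[0] * rng.exponential()
for i in range(1, n):
    psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]   # conditional mean
    x[i] = psi[i] * rng.exponential()                       # observed duration
print("mean duration:", round(x.mean(), 3), "theoretical:", round(psi[0], 3))
```

The recursion mirrors a GARCH equation with durations in place of squared returns, which is why these models transfer so much intuition from volatility modelling to irregularly spaced trade data.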
The present book is the offspring of my Habilitation, which is the key to academic tenure in Austria. Legal requirements demand that a Habilitation be published, and so only seeing it in print marks the real end of this biographical landmark project. From a scientific perspective I may hope finally to reach a broader audience with this book for a critical appraisal of the research done. Aside from these objectives, the book is a reflection of many years of research in the field of efficiency measurement preceding the Habilitation proper. Regarding the subject matter, the main intention was to fill an important remaining gap in the efficiency analysis literature: hitherto no technique was available to estimate output-specific efficiencies in a statistically convincing way. This book closes this gap, although some desirable improvements and generalizations of the proposed estimation technique may yet be required before it eventually becomes established as a standard tool for efficiency analysis. The likely audience for this book includes professional researchers who want to enrich their tool set for applied efficiency analysis, as well as students of economics, management science or operations research intending to learn more about the potential of rigorously understood efficiency analysis. Managers and public officials commissioning efficiency studies should also benefit from the book by learning about the extended capabilities of efficiency analysis; just reading the introduction may change their perception of value for money when it comes to comparative performance measurement.
Covers the key material required by students wishing to understand and analyse the core empirical issues in economics. It focuses on descriptive statistics, probability concepts, and basic econometric techniques, and has an accompanying website that contains all the data used in the examples and provides exercises for undertaking original research.
India is one of the major emerging economies of the world and has witnessed tremendous economic growth over the last decades. The reforms in the financial sector were introduced to infuse energy and vibrancy into the process of economic growth. The Indian stock market now has the largest number of listed companies in the world. The phenomenal growth of the Indian equity market and its growing importance in the economy are indicated by the extent of market capitalization and the increasing integration of the Indian economy with the global economy. Various schools of thought explain the behaviour of stock returns. The Efficient Market Theory is the most important theory of the school of neoclassical finance, based on rational expectations and the no-trade argument. The book investigates the growth and efficiency of the Indian stock market in the theoretical framework of the Efficient Market Hypothesis (EMH). The main objective of the present study is to examine the returns behaviour in the Indian equity market in the changed market environment. A detailed and rigorous analysis, made with the help of sophisticated time series econometric models, is one of the key elements of this volume. The analysis empirically tests the random walk hypothesis and focuses on issues like nonlinear dynamics, structural breaks and long memory. It uses new and disaggregated data on recent reforms and changes in the market microstructure. Data on various indices, including sectoral indices, help in measuring the relative efficiency of the market and understanding how liquidity and market capitalization affect the efficiency of the market.
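A standard ingredient of such random walk tests is the augmented Dickey-Fuller unit root test; the sketch below applies it to a simulated random walk as a stand-in, since the Indian index data themselves are not reproduced here.

```python
# ADF unit root test on a simulated random walk. Under weak-form efficiency,
# log prices should behave like a random walk, so the unit root should not
# be rejected.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)
prices = np.cumsum(rng.normal(size=1000))          # a pure random walk
stat, pvalue, *_ = adfuller(prices, regression='c', autolag='AIC')
print(f"ADF statistic {stat:.2f}, p-value {pvalue:.3f}")
# A large p-value means the random walk hypothesis cannot be rejected,
# which is consistent with weak-form market efficiency.
```

Studies like this one typically complement the ADF test with variance ratio tests and long-memory diagnostics, since a unit root alone does not settle questions of nonlinear dynamics or structural breaks.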
J. S. FLEMMING: The Bank of England's role as a leading central bank involves both formal and informal aspects. At a formal level it is an adviser to HM Government, whilst at an informal level it is consulted by domestic and overseas institutions for advice on many areas of economic interest. Such advice must be grounded in an understanding of the workings of the domestic and international economy, a task which becomes ever more difficult with the pace of change both in the economy and in the techniques which are used by professional economists to analyse such changes. The Bank's economists are encouraged to publish their research whenever circumstances permit, whether in refereed journals or in other ways. In particular, we make it a rule that the research underlying the Bank's macroeconometric model, to which outside researchers have access through the ESRC (Economic and Social Research Council) macromodelling bureau, should be adequately explained and documented in published form. This volume expands the commitment to make research which is undertaken within the Economics Division of the Bank of England widely available. Included here are chapters which illustrate the breadth of interests which the Bank seeks to cover. Some of the research is, as would be expected, directly related to the specification of the Bank's model, but other aspects are also well represented.
Based on conference proceedings presented at The Chinese University of Hong Kong in November 2012, Natural Disaster and Reconstruction in Asian Economies offers leading insight into and viewpoints on disasters from scholars and journalists working in Japan, China, the United States, and Southeast Asia.
Creating a Eurasian Union offers a detailed analysis of the economies of the Customs Union of Russia, Belarus, and Kazakhstan and the proposed Eurasian Union. The authors employ econometric analysis of business cycles and cointegration analysis to demonstrate the fragility of the union's potential economic success. By providing a brief description of the economic integration of the former Soviet republics, this pioneering work analyses the ongoing trial-and-error process of market integration led by Russia. Vymyatnina and Antonova's distinctive argument is the first consistent analysis of the emerging Eurasian Union. They combine a non-technical summary of the integration process, previous research, and analytical comments with a thorough empirical analysis of real data on the economic development of the participating countries, to caution that the speed of integration might undermine the feasibility of the Eurasian Union.
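The cointegration side of such an analysis can be illustrated with a minimal Engle-Granger test on two simulated series sharing a common stochastic trend, stand-ins for the macroeconomic series the authors actually use.

```python
# Engle-Granger cointegration test on two series driven by one common trend.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(7)
trend = np.cumsum(rng.normal(size=500))              # shared stochastic trend
y1 = trend + rng.normal(scale=0.5, size=500)
y2 = 0.8 * trend + rng.normal(scale=0.5, size=500)
tstat, pvalue, _ = coint(y1, y2)
print(f"cointegration t-stat {tstat:.2f}, p-value {pvalue:.3f}")
# A small p-value indicates a stable long-run relationship between the two
# series, one ingredient in judging whether economies move together closely
# enough for deeper integration.
```

Failure of such tests across member economies, or business cycles that do not synchronise, is the kind of evidence behind the authors' caution about the union's fragility.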
Applied econometricians are often faced with data that is less than ideal. The data may be observed with gaps, a model may suggest variables that are observed at different frequencies, and econometric results are sometimes fragile to the inclusion or omission of just a few observations in the sample. The papers in this volume discuss new econometric techniques for addressing these problems.