This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of country size, reflecting the key economic characteristics of economies, in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration by country size, using a data set of 218 countries, 45 of which are European.
This book aims to meet the growing demand in the field by introducing basic spatial econometric methodologies to a wide variety of researchers. It provides a practical guide that illustrates the potential of spatial econometric modelling, discusses problems and solutions, and interprets empirical results.
The interaction between mathematicians and statisticians has proved to be an effective approach to the analysis of insurance and financial problems, particularly from an operational perspective. The MAF2006 conference, held at the University of Salerno in 2006, had precisely this purpose, and the collection published here gathers some of the papers presented at the conference and subsequently revised to this end. They cover a wide variety of subjects in the insurance and financial fields.
This book, which was first published in 1980, is concerned with one particular branch of growth theory, namely descriptive growth theory. It is typically assumed in growth theory that both the factor and goods markets are perfectly competitive. In particular, this implies amongst other things that the reward to each factor is identical in each sector of the economy. In this book the assumption of identical factor rewards is relaxed and the implications of an intersectoral wage differential for economic growth are analysed. There is also some discussion of the short-run and long-run effects of minimum wage legislation on growth. This book will serve as key reading for students of economics.
Creating a Eurasian Union offers a detailed analysis of the economies of the Customs Union of Russia, Belarus, and Kazakhstan and the proposed Eurasian Union. The authors employ econometric analysis of business cycles and cointegration analysis to demonstrate the fragility of the union's potential economic success. By providing a brief description of the economic integration of the former Soviet republics, this pioneering work analyses the ongoing trial-and-error process of market integration led by Russia. Vymyatnina and Antonova offer the first consistent analysis of the emerging Eurasian Union. They combine a non-technical summary of the integration process and previous research with analytical comments and a thorough empirical analysis of real data on the economic development of the participating countries, and caution that the speed of integration might undermine the feasibility of the Eurasian Union.
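The blurb does not spell out the authors' exact specification, but a flavour of the cointegration analysis mentioned above can be given with a short, purely illustrative sketch: two simulated output series that share a common stochastic trend are checked for a long-run relationship with an Engle-Granger test (the series and parameters below are invented).

```python
# Illustrative Engle-Granger cointegration check on simulated series
# (hypothetical data, not the authors' dataset or exact method).
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
n = 500
common_trend = np.cumsum(rng.normal(size=n))                  # shared stochastic trend
output_a = common_trend + rng.normal(scale=0.5, size=n)       # e.g. output of country A
output_b = 0.8 * common_trend + rng.normal(scale=0.5, size=n) # e.g. output of country B

t_stat, p_value, _ = coint(output_a, output_b)
print(f"Engle-Granger t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")
# A small p-value indicates evidence of a long-run equilibrium relation.
```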
Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful applications of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network-based EDAs are reviewed in the book. Current research trends and future perspectives in the enhancement and applicability of EDAs are also covered. The contributions included in the book address topics as relevant as the application of probabilistic-based fitness models, the use of belief propagation algorithms in EDAs and the application of Markov network-based EDAs to real-world optimization problems. The book should be of interest to researchers and practitioners from areas such as optimization, evolutionary computation, and machine learning.
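The book itself is concerned with Markov-network (undirected) models; as a minimal, hypothetical illustration of the generic EDA loop it builds on, the sketch below uses the simplest univariate model (UMDA-style, with independent bit probabilities rather than a Markov network) on the OneMax toy problem.

```python
# Minimal UMDA-style EDA on OneMax (maximise the number of ones in a bit string).
# Uses an independent univariate model, not a Markov network, purely to show the
# sample -> select -> estimate loop common to all EDAs.
import numpy as np

rng = np.random.default_rng(1)
n_vars, pop_size, n_select, n_gens = 30, 100, 30, 40
p = np.full(n_vars, 0.5)                                     # initial marginals

for gen in range(n_gens):
    pop = (rng.random((pop_size, n_vars)) < p).astype(int)   # sample population
    fitness = pop.sum(axis=1)                                 # OneMax fitness
    elite = pop[np.argsort(fitness)[-n_select:]]              # truncation selection
    p = elite.mean(axis=0).clip(0.05, 0.95)                   # re-estimate the model

print("best fitness found:", fitness.max(), "out of", n_vars)
```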
The Analytic Hierarchy Process (AHP) is a prominent and powerful tool for making decisions in situations involving multiple objectives. Models, Methods, Concepts and Applications of the Analytic Hierarchy Process, 2nd Edition applies the AHP to problems focused on three themes: economics, the social sciences, and the linking of measurement with human values. For economists, the AHP offers a substantially different approach to dealing with economic problems through ratio scales. Psychologists and political scientists can use the methodology to quantify and derive measurements for intangibles. Meanwhile, researchers in the physical and engineering sciences can apply the AHP methods to help resolve conflicts between hard measurement data and human values. Throughout the book, each of these topics is explored using real-life models and examples relevant to problems in today's society. This new edition has been updated and includes five new chapters, with discussions of the following: the eigenvector and why it is necessary; a summary of ongoing research in the Middle East that brings together Israeli and Palestinian scholars to develop concessions from both parties; and a look at the Medicare crisis and how the AHP can be used to understand the problems and help develop ideas to solve them.
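As a hedged illustration of the eigenvector step mentioned above, priorities can be derived from the principal eigenvector of a pairwise comparison matrix; the matrix below is invented for the example.

```python
# Sketch: AHP priority weights from a hypothetical pairwise comparison matrix,
# obtained from its principal eigenvector, plus a simple consistency index.
import numpy as np

# A[i, j] = strength of preference of criterion i over criterion j (reciprocal matrix).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                       # normalised priority weights

n = A.shape[0]
consistency_index = (eigvals.real[k] - n) / (n - 1)
print("weights:", np.round(weights, 3), " CI:", round(consistency_index, 3))
```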
This book presents a concise introduction to Bartlett and Bartlett-type corrections of statistical tests and bias correction of point estimators. The underlying idea behind both groups of corrections is to obtain higher accuracy in small samples. While the main focus is on corrections that can be analytically derived, the authors also present alternative strategies for improving estimators and tests based on the bootstrap, a data resampling technique, and discuss concrete applications to several important statistical models.
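To make the bootstrap side of this concrete, here is a minimal sketch (with an invented example estimator, not one from the book) of bootstrap bias correction: the bias is estimated from resamples and subtracted from the original estimate.

```python
# Sketch of bootstrap bias correction of a point estimator.
# Example: the maximum-likelihood variance estimator (divide by n) is biased
# downward in small samples; the bootstrap estimate of that bias is subtracted off.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=0.0, scale=2.0, size=15)         # small sample, true variance = 4

def estimator(sample):
    return np.mean((sample - sample.mean()) ** 2)    # biased variance estimate

theta_hat = estimator(x)
boot = np.array([estimator(rng.choice(x, size=x.size, replace=True))
                 for _ in range(2000)])
bias_hat = boot.mean() - theta_hat                   # bootstrap estimate of the bias
theta_corrected = theta_hat - bias_hat               # bias-corrected estimate
print(f"raw: {theta_hat:.3f}  bias-corrected: {theta_corrected:.3f}")
```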
A careful basic theoretical and econometric analysis of the factors determining the real exchange rates of Canada, the U.K., Japan, France and Germany with respect to the United States is conducted. The resulting conclusion is that real exchange rates are almost entirely determined by real factors relating to growth and technology, such as oil and commodity prices, international allocations of world investment across countries, and underlying terms-of-trade changes. Unanticipated money supply shocks, calculated in five alternative ways, have virtually no effects. A Blanchard-Quah VAR analysis also indicates that the effects of real shocks predominate over monetary shocks by a wide margin. The implications of these facts for the conduct of monetary policy in countries outside the U.S. are then explored, leading to the conclusion that all countries, to avoid exchange rate overshooting, have tended automatically to follow the same monetary policy as the United States. The history of world monetary policy is reviewed along with the determination of real exchange rates within the Euro Area.
First published in 2004, this is a rigorous but user-friendly book on the application of stochastic control theory to economics. A distinctive feature of the book is that mathematical concepts are introduced in a language and terminology familiar to graduate students of economics. The standard topics of many mathematics, economics and finance books are illustrated with real examples documented in the economic literature. Moreover, the book emphasises the dos and don'ts of stochastic calculus, cautioning the reader that certain results and intuitions cherished by many economists do not extend to stochastic models. A special chapter (Chapter 5) is devoted to exploring various methods of finding a closed-form representation of the value function of a stochastic control problem, which is essential for ascertaining the optimal policy functions. The book also includes many practice exercises for the reader. Notes and suggested readings are provided at the end of each chapter for more references and possible extensions.
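For readers unfamiliar with the object Chapter 5 is after, the value function V of a standard infinite-horizon stochastic control problem with state dynamics dx = mu(x,u) dt + sigma(x,u) dW and discount rate rho satisfies the Hamilton-Jacobi-Bellman equation below; this is generic textbook notation, not necessarily the book's own.

```latex
% Generic HJB equation for an infinite-horizon stochastic control problem
% (standard notation; the book's own formulation may differ).
\rho V(x) = \max_{u}\Big\{ f(x,u) + \mu(x,u)\,V'(x) + \tfrac{1}{2}\,\sigma(x,u)^{2}\,V''(x) \Big\}
```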
The book investigates EU preferential trade policy and, in particular, the impact it has had on trade flows from developing countries. It shows that the capability of the "trade as aid" model to deliver its expected benefits to these countries differs crucially between preferential schemes and sectors. The book takes an eclectic but rigorous approach to the econometric analysis by combining different specifications of the gravity model. An in-depth presentation of the gravity model is also included, providing significant insights into the distinctive features of this technique and its state-of-the-art implementation. The evidence produced in the book is extensively applied to the analysis of EU preferential policies, with substantial suggestions for future improvement. Additional electronic material to replicate the book's analysis (datasets and Gams and Stata 9.0 routines) can be found in the Extra Materials menu on the website of the book.
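As a minimal sketch of one common log-linear gravity specification (simulated data; the variables and coefficients are illustrative, not the book's dataset), bilateral trade is regressed on exporter GDP, importer GDP and distance.

```python
# Minimal log-linear gravity model on simulated bilateral trade data.
# Coefficients and variables are illustrative only.
import numpy as np

rng = np.random.default_rng(3)
n_pairs = 400
log_gdp_exp = rng.normal(10, 1, n_pairs)       # exporter GDP (logs)
log_gdp_imp = rng.normal(10, 1, n_pairs)       # importer GDP (logs)
log_dist = rng.normal(8, 0.5, n_pairs)         # bilateral distance (logs)
log_trade = (-5 + 1.0 * log_gdp_exp + 0.9 * log_gdp_imp
             - 1.1 * log_dist + rng.normal(0, 0.5, n_pairs))

X = np.column_stack([np.ones(n_pairs), log_gdp_exp, log_gdp_imp, log_dist])
beta, *_ = np.linalg.lstsq(X, log_trade, rcond=None)    # OLS estimates
print("constant, exporter GDP, importer GDP, distance:", np.round(beta, 2))
```

In applied work the gravity equation is usually augmented with fixed effects and preference-scheme dummies; the sketch above shows only the bare log-linear core.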
Figure 1.1 Map of Great Britain at two different scale levels: (a) counties, (b) regions. Figure 1.2 Two alternative aggregations of the Italian provincie into 32 larger areas. Figure 1.3 Percentage of votes for the Communist Party in the 1987 Italian political elections (a) and percentage of population over 75 years (b) in the 1981 Italian Census, in 32 polling districts; the polling districts with values above the average are shaded. Figure 1.4 First-order neighbours (a) and second-order neighbours (b) of a reference area. While there are several other problems relating to the analysis of areal data, the problem of estimating a spatial correlogram merits special attention. The concept of the correlogram has been borrowed in the spatial literature from time series analysis. Figure 1.4a shows the first-order neighbours of a reference area, while Figure 1.4b displays the second-order neighbours of the same area. Higher-order neighbours can be defined in a similar fashion. While it is clear that the dependence is strongest between immediately neighbouring areas, a certain degree of dependence may be present among higher-order neighbours. This has been shown to be an alternative way of looking at the scale problem (Cliff and Ord, 1981, p. 123). However, unlike the case of a time series, where each observation depends only on past observations, here dependence extends in all directions.
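To make the neighbour-order idea concrete, the hypothetical sketch below computes Moran's I for a simulated variable on a regular grid using first-order and then second-order contiguity weights, i.e. a crude two-lag spatial correlogram.

```python
# Sketch: Moran's I at first- and second-order contiguity on a toy grid,
# i.e. a crude spatial correlogram (simulated data, rook contiguity).
import numpy as np

g = 10
n = g * g
rng = np.random.default_rng(4)

# Rook-contiguity adjacency matrix for a g x g grid of areas.
A = np.zeros((n, n))
for r in range(g):
    for c in range(g):
        i = r * g + c
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < g and 0 <= cc < g:
                A[i, rr * g + cc] = 1

# Second-order neighbours: reachable in two steps, not adjacent, not the area itself.
A2 = ((A @ A) > 0).astype(float)
A2[A > 0] = 0
np.fill_diagonal(A2, 0)

# A smooth spatial pattern plus noise.
coords = np.array([(r, c) for r in range(g) for c in range(g)], dtype=float)
x = np.sin(coords[:, 0] / 3) + np.cos(coords[:, 1] / 3) + rng.normal(0, 0.3, n)

def morans_i(values, W):
    z = values - values.mean()
    return (len(values) / W.sum()) * (z @ W @ z) / (z @ z)

print("Moran's I, 1st-order neighbours:", round(morans_i(x, A), 3))
print("Moran's I, 2nd-order neighbours:", round(morans_i(x, A2), 3))
```

As the text suggests, the first-order statistic is typically the larger of the two, with dependence decaying but not vanishing at higher orders.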
Applied econometricians are often faced with working with data that are less than ideal. The data may be observed with gaps, a model may suggest variables that are observed at different frequencies, and sometimes econometric results are very fragile to the inclusion or omission of just a few observations in the sample. The papers in this volume discuss new econometric techniques for addressing these problems.
Taxpayer compliance is a voluntary activity, and the degree to which the tax system works is affected by taxpayers' knowledge that it is their moral and legal responsibility to pay their taxes. Taxpayers also recognize that they face a lottery in which not all taxpayer noncompliance will ever be detected. In the United States most individuals comply with the tax law, yet the tax gap has grown significantly over time for individual taxpayers. The US Internal Revenue Service attempts to ensure that the minority of taxpayers who are noncompliant pay their fair share with a variety of enforcement tools and penalties. The Causes and Consequences of Income Tax Noncompliance provides a comprehensive summary of the empirical evidence concerning taxpayer noncompliance and presents innovative research, with new results, on the role of IRS audit and enforcement activities in compliance with federal and state income tax collection. Other issues examined include the degree to which taxpayers respond to the threat of civil and criminal enforcement and the important role of the media in taxpayer compliance. This book offers researchers, students, and tax administrators insight into the allocation of taxpayer compliance enforcement and service resources, and suggests policies that will prevent further increases in the tax gap. The book's aggregate data analysis methods have practical applications not only to taxpayer compliance but also to other forms of economic behavior, such as welfare fraud.
The authors, leading researchers in the fields of mathematical economics and methodology, present the first comprehensive synthesis of literature on qualitative and other nonparametric techniques, which are important elements of comparative statics and stability analysis in economic theory. The topics covered show how to assess the comparative statics and stability of economic models without a precise quantitative knowledge of all model components. Applications of the analysis range from determining refutable hypotheses from theory to auditing the solutions of large, computer-based systems. This book discusses in depth the methodology involved in a nonparametric analysis of many neoclassical economic models. Constituting a virtually self-contained manual on such analysis, it provides detailed derivation of necessary and sufficient conditions for the existence of restrictive comparative statics and stability results for a range of specified models. Further, algorithms for applying certain of these conditions are given, with examples, as well as the underlying mathematical approach taken. A large body of research is unified covering issues that have been dealt with piecemeal in scattered but important journal articles by the authors and others. The book will prove invaluable to mathematical economists, mathematicians specializing in matrix or graph theory, applied economists working with large-scale economic models, and advanced students of economics. Originally published in 1999. The Princeton Legacy Library uses the latest print-on-demand technology to again make available previously out-of-print books from the distinguished backlist of Princeton University Press. These paperback editions preserve the original texts of these important books while presenting them in durable paperback editions. The goal of the Princeton Legacy Library is to vastly increase access to the rich scholarly heritage found in the thousands of books published by Princeton University Press since its founding in 1905.
In economics, many quantities are related to each other. Such economic relations are often much more complex than relations in science and engineering, where some quantities are independent and the relation between others can be well approximated by linear functions. To make economic models more adequate, we need more accurate techniques for describing dependence. Such techniques are currently being developed. This book contains descriptions of state-of-the-art techniques for modeling dependence and economic applications of these techniques.
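The blurb does not name a specific technique; as a purely illustrative example of why a linear measure can understate dependence, the sketch below compares Pearson and Spearman correlations on a strongly nonlinear but monotone relation.

```python
# Toy illustration (not from the book): nonlinear monotone dependence that a linear
# (Pearson) correlation understates but a rank-based (Spearman) measure captures.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(5)
x = rng.normal(size=2000)
y = np.exp(3 * x) + rng.normal(scale=0.1, size=2000)   # y depends strongly on x

print("Pearson :", round(pearsonr(x, y)[0], 3))   # well below 1
print("Spearman:", round(spearmanr(x, y)[0], 3))  # close to 1
```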
Originally published in 1871, "The Theory of Political Economy" was the first text to introduce the concept of utility theory into economics and the use of calculus as a way to simplify and present economic problems. In this classic work Jevons reformulated the central problem of economics as one of how to maximise overall utility with a set amount of means of production. Utility expressed as a function became the basis of a new theory of value which substantially differs from the classical political economists' labour theory of value. Drawing on his roots in the natural sciences, Jevons revolutionised the tools and methods associated with political economy and kick-started the metamorphosis of the discipline.
The availability of financial data recorded at high frequency has inspired a research area which over the last decade has emerged as a major area in econometrics and statistics. The growing popularity of high-frequency econometrics is driven by technological progress in trading systems and the increasing importance of intraday trading, liquidity risk, optimal order placement and high-frequency volatility. This book provides a state-of-the-art overview of the major approaches in high-frequency econometrics, including univariate and multivariate autoregressive conditional mean approaches for different types of high-frequency variables, intensity-based approaches for financial point processes and dynamic factor models. It discusses implementation details, provides insights into properties of high-frequency data as well as institutional settings, and presents applications to volatility and liquidity estimation, order book modelling and market microstructure analysis.
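As a minimal hypothetical example of one of the quantities mentioned above, realized volatility for a trading day can be estimated by summing squared intraday returns; the returns below are simulated.

```python
# Sketch: realized variance / volatility from simulated intraday returns.
import numpy as np

rng = np.random.default_rng(6)
n_intraday = 390                          # e.g. one-minute returns over a trading day
true_daily_vol = 0.02
r = rng.normal(0, true_daily_vol / np.sqrt(n_intraday), n_intraday)

realized_variance = np.sum(r ** 2)        # sum of squared intraday returns
realized_vol = np.sqrt(realized_variance)
print(f"realized daily volatility: {realized_vol:.4f} (simulated true value {true_daily_vol})")
```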
The present book is the offspring of my Habilitation, which is the key to academic tenure in Austria. Legal requirements demand that a Habilitation be published, and so only seeing it in print marks the real end of this biographical landmark project. From a scientific perspective I may hope to finally reach a broader audience with this book for a critical appraisal of the research done. Aside from such objectives, the book is a reflection of many years of research preceding the Habilitation proper in the field of efficiency measurement. Regarding the subject matter, the main intention was to fill an important remaining gap in the efficiency analysis literature: hitherto no technique was available to estimate output-specific efficiencies in a statistically convincing way. This book closes this gap, although some desirable improvements and generalizations of the proposed estimation technique may yet be required before it eventually becomes established as a standard tool for efficiency analysis. The likely audience for this book includes professional researchers who want to enrich their tool set for applied efficiency analysis, as well as students of economics, management science or operations research intending to learn more about the potential of rigorously understood efficiency analysis. Managers or public officials ordering efficiency studies should also benefit from the book by learning about the extended capabilities of efficiency analysis. Just reading the introduction may change their perception of value for money when it comes to comparative performance measurement.
Covers the key issues required for students wishing to understand and analyse the core empirical issues in economics. It focuses on descriptive statistics, probability concepts and basic econometric techniques and has an accompanying website that contains all the data used in the examples and provides exercises for undertaking original research.
India is one of the major emerging economies of the world and has witnessed tremendous economic growth over the last decades. The reforms in the financial sector were introduced to infuse energy and vibrancy into the process of economic growth. The Indian stock market now has the largest number of listed companies in the world. The phenomenal growth of the Indian equity market and its growing importance in the economy is indicated by the extent of market capitalization and the increasing integration of the Indian economy with the global economy. Various schools of thought explain the behaviour of stock returns. The Efficient Market Theory is the most important theory of the School of Neoclassical Finance, based on rational expectations and the no-trade argument. The book investigates the growth and efficiency of the Indian stock market in the theoretical framework of the Efficient Market Hypothesis (EMH). The main objective of the present study is to examine the returns behaviour in the Indian equity market in the changed market environment. A detailed and rigorous analysis, made with the help of sophisticated time series econometric models, is one of the key elements of this volume. The analysis empirically tests the random walk hypothesis and focuses on issues like nonlinear dynamics, structural breaks and long memory. It uses new and disaggregated data on recent reforms and changes in the market microstructure. The data on various indices, including sectoral indices, help in measuring the relative efficiency of the market and understanding how liquidity and market capitalization affect the efficiency of the market.
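One standard way to examine the random walk hypothesis mentioned above is a variance-ratio test; the sketch below implements the simple homoskedasticity-based Lo-MacKinlay statistic on simulated prices (the book's own tests and data are, of course, more elaborate).

```python
# Sketch: Lo-MacKinlay variance-ratio statistic on a simulated random walk.
# Homoskedasticity-based standard error; applied work often uses robust variants.
import numpy as np

rng = np.random.default_rng(7)
T = 2000
log_prices = np.cumsum(rng.normal(0, 0.01, T))       # i.i.d. returns -> random walk

def variance_ratio(p, q):
    r1 = np.diff(p)                                  # 1-period returns
    rq = p[q:] - p[:-q]                              # overlapping q-period returns
    vr = rq.var(ddof=1) / (q * r1.var(ddof=1))
    se = np.sqrt(2 * (2 * q - 1) * (q - 1) / (3 * q * len(r1)))
    return vr, (vr - 1) / se                         # ratio and z-statistic

for q in (2, 5, 10):
    vr, z = variance_ratio(log_prices, q)
    print(f"q={q}: VR={vr:.3f}, z={z:.2f}")           # VR near 1 under a random walk
```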
J. S. FLEMMING The Bank of England's role as a leading central bank involves both formal and informal aspects. At a formal level it is an adviser to HM Government, whilst at an informal level it is consulted by domestic and overseas institutions for advice on many areas of economic interest. Such advice must be grounded in an understanding of the workings of the domestic and international economy-a task which becomes ever more difficult with the pace of change both in the economy and in the techniques which are used by professional economists to analyse such changes. The Bank's economists are encouraged to publish their research whenever circumstances permit, whether in refereed journals or in other ways. In particular, we make it a rule that the research underlying the Bank's macroeconometric model, to which outside researchers have access through the ESRC (Economic and Social Research Council) macromodelling bureau, should be adequately explained and documented in published form. This volume expands the commitment to make research which is undertaken within the Economics Division of the Bank of England widely available. Included here are chapters which illustrate the breadth of interests which the Bank seeks to cover. Some of the research is, as would be expected, directly related to the specification of the Bank's model, but other aspects are also well represented.
Based on conference proceedings presented at The Chinese University of Hong Kong in November 2012, Natural Disaster and Reconstruction in Asian Economies offers leading insight into and viewpoints on disasters from scholars and journalists working in Japan, China, the United States, and Southeast Asia.
This handbook covers DEA topics that are extensively used and solidly based. The purpose of the handbook is to (1) describe and elucidate the state of the field and (2), where appropriate, extend the frontier of DEA research. It defines the state of the art of DEA methodology and its uses. This handbook is intended to represent a milestone in the progression of DEA. Written by experts who are generally major contributors to the topics covered, it includes a comprehensive review and discussion of basic DEA models, extensions to the basic DEA methods, and a collection of DEA applications in the areas of banking, engineering, health care, and services. The handbook's chapters are organized into two categories: (i) basic DEA models, concepts, and their extensions, and (ii) DEA applications. First edition contributors have returned to update their work, and the second edition includes updated versions of selected first edition chapters. New chapters have been added on: different approaches with no need for a priori choices of weights (called multipliers) that reflect meaningful trade-offs; construction of static and dynamic DEA technologies; the slacks-based model and its extensions; DEA models for DMUs that have internal structures; network DEA that can be used for measuring supply chain operations; and a selection of DEA applications in the service sector, with a focus on building a conceptual framework, research design and interpreting results.
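As a hypothetical minimal example of the basic input-oriented CCR envelopment model discussed above, each DMU's efficiency score is the optimum of a small linear program; the input and output data below are invented.

```python
# Sketch: input-oriented CCR (constant returns to scale) DEA scores via linear
# programming; the inputs/outputs below are invented toy data.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [5.0, 4.0]])  # inputs  (DMU x input)
Y = np.array([[1.0],      [2.0],      [2.0],      [3.0]])        # outputs (DMU x output)
n_dmu = X.shape[0]

def ccr_efficiency(o):
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n_dmu)]
    A_ub, b_ub = [], []
    for i in range(X.shape[1]):        # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
    for r in range(Y.shape[1]):        # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n_dmu + 1), method="highs")
    return res.fun

for o in range(n_dmu):
    print(f"DMU {o}: CCR efficiency = {ccr_efficiency(o):.3f}")
```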
This textbook on the basics of option pricing is accessible to readers with limited mathematical training. It is for both professional traders and undergraduates studying the basics of finance. Assuming no prior knowledge of probability, Sheldon M. Ross offers clear, simple explanations of arbitrage, the Black-Scholes option pricing formula, and other topics such as utility functions, optimal portfolio selections, and the capital assets pricing model. Among the many new features of this third edition are new chapters on Brownian motion and geometric Brownian motion, stochastic order relations, and stochastic dynamic programming, along with expanded sets of exercises and references for all the chapters.
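As a minimal sketch of the Black-Scholes formula covered in the book, the snippet below prices a European call; the inputs are arbitrary illustrative values.

```python
# Sketch: Black-Scholes price of a European call option (illustrative inputs).
import numpy as np
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Example: spot 100, strike 105, one year to expiry, 5% rate, 20% volatility.
print(round(black_scholes_call(100.0, 105.0, 1.0, 0.05, 0.20), 2))
```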