This book discusses the problem of model choice when the statistical models are separate, also called nonnested. Chapter 1 provides an introduction, motivating examples and a general overview of the problem. Chapter 2 presents the classical or frequentist approach to the problem, as well as several alternative procedures and their properties. Chapter 3 explores the Bayesian approach, the limitations of the classical Bayes factors, and the alternative Bayes factors proposed to overcome these limitations. It also discusses a Bayesian significance procedure. Lastly, Chapter 4 examines the pure likelihood approach. Various real-data examples and computer simulations are provided throughout the text.
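For orientation, the Bayes factor at the heart of Chapter 3 compares two separate families through the ratio of their marginal likelihoods. A standard statement (our notation, not the book's):

\[
B_{12} \;=\; \frac{m_1(y)}{m_2(y)}
      \;=\; \frac{\int f_1(y \mid \theta_1)\,\pi_1(\theta_1)\,d\theta_1}
                 {\int f_2(y \mid \theta_2)\,\pi_2(\theta_2)\,d\theta_2}.
\]

A well-known limitation is that $B_{12}$ is not well defined under improper priors; alternatives such as intrinsic and fractional Bayes factors were developed in the literature to address this.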
This book provides advanced theoretical and applied tools for the implementation of modern micro-econometric techniques in evidence-based program evaluation for the social sciences. The author presents a comprehensive toolbox for designing rigorous and effective ex-post program evaluation using the statistical software package Stata. For each method, a statistical presentation is developed, followed by a practical estimation of the treatment effects. By using both real and simulated data, readers will become familiar with evaluation techniques such as regression adjustment, matching, difference-in-differences, instrumental variables and regression discontinuity design, and are given practical guidelines for selecting and applying suitable methods for specific policy contexts.
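The book itself works in Stata; purely for orientation, here is a minimal two-period difference-in-differences sketch in Python with simulated data (hypothetical variable names, not the author's code). The treatment effect is the OLS coefficient on the treated-by-post interaction.

```python
# Minimal two-period difference-in-differences sketch (simulated data).
# Hypothetical illustration; the book implements these estimators in Stata.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
treated = rng.integers(0, 2, n)           # treatment-group indicator
post = rng.integers(0, 2, n)              # post-period indicator
effect = 2.0                              # true treatment effect
y = 1.0 + 0.5 * treated + 0.3 * post + effect * treated * post \
    + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([treated, post, treated * post]))
fit = sm.OLS(y, X).fit(cov_type="HC1")    # heteroskedasticity-robust SEs
print(fit.params[3], fit.bse[3])          # DiD estimate and standard error
```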
This book presents the latest advances in the theory and practice of Marshall-Olkin distributions. These distributions have been increasingly applied in statistical practice in recent years, as they make it possible to describe interesting features of stochastic models like non-exchangeability, tail dependencies and the presence of a singular component. The book presents cutting-edge contributions in this research area, with a particular emphasis on financial and economic applications. It is recommended for researchers working in applied probability and statistics, as well as for practitioners interested in the use of stochastic models in economics. This volume collects selected contributions from the conference “Marshall-Olkin Distributions: Advances in Theory and Applications,” held in Bologna on October 2-3, 2013.
This book assesses how efficient primary and upper primary education is across different states of India, considering both output-oriented and input-oriented measures of technical efficiency. It identifies the most important factors that could produce differential efficiency among the states, including the effects of central grants, school-specific infrastructures, social indicators and policy variables, as well as state-specific factors like per capita net state domestic product from the service sector, inequality in the distribution of income (Gini coefficient), the percentage of people living below the poverty line and the density of population. The study covers the period 2005-06 to 2010-11 and all the states and union territories of India, which are categorized into two separate groups, namely: (i) General Category States (GCS); and (ii) Special Category States (SCS) and Union Territories (UT). It uses non-parametric Data Envelopment Analysis (DEA) and obtains the Technology Closeness Ratio (TCR), measuring whether the maximum output producible from an input bundle by a school within a given group is as high as what could be produced if the school could choose to join the other group. The major departure of this book is its approach to estimating technical efficiency (TE): it does not use a single frontier encompassing all the states and UT, as is done in the available literature. Rather, it assumes that GCS, SCS and UT are not homogeneous and operate under different fiscal and economic conditions.
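As a rough formalization of the TCR (our notation, not the book's): if $f_g(x)$ is the maximum output obtainable from input bundle $x$ under the frontier of the school's own group, and $f^{*}(x)$ is the maximum obtainable when the technologies of both groups are pooled, then

\[
\mathrm{TCR}(x) \;=\; \frac{f_g(x)}{f^{*}(x)} \;\le\; 1,
\]

so a TCR close to one means that belonging to a given group imposes little technological disadvantage.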
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy, in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and the household tax compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and about the effects that formalization of the shadow economy can have on economic growth.
The research presented here focuses on spatial sampling of agricultural resources. The authors introduce sampling designs and methods for producing accurate estimates of crop production across different regions and countries. With the help of real and simulated examples performed with the open-source software R, readers will learn about the different phases of spatial data collection. The agricultural data analyzed in this book help policymakers and market stakeholders to monitor the production of agricultural goods and its effects on the environment and food safety.
The core methods in today's econometric toolkit are linear regression for statistical control, instrumental variables methods for the analysis of natural experiments, and difference-in-differences methods that exploit policy changes. In the modern experimentalist paradigm, these techniques address clear causal questions such as: Do smaller classes increase learning? Should wife batterers be arrested? How much does education raise wages? "Mostly Harmless Econometrics" shows how the basic tools of applied econometrics allow the data to speak. In addition to econometric essentials, "Mostly Harmless Econometrics" covers important new extensions--regression-discontinuity designs and quantile regression--as well as how to get standard errors right. Joshua Angrist and Jorn-Steffen Pischke explain why fancier econometric techniques are typically unnecessary and even dangerous. The applied econometric methods emphasized in this book are easy to use and relevant for many areas of contemporary social science.
- An irreverent review of econometric essentials
- A focus on tools that applied researchers use most
- Chapters on regression-discontinuity designs, quantile regression, and standard errors
- Many empirical examples
- A clear and concise resource with wide applications
The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research. Many economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have supplied economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories.
In recent years nonlinearities have gained increasing importance in economic and econometric research, particularly since the financial crisis and the economic downturn after 2007. This book contains theoretical, computational and empirical papers that incorporate nonlinearities in econometric models and apply them to real economic problems. It is intended to inspire researchers to take potential nonlinearities into account. Researchers should be wary of spuriously applying linear model types to problems that exhibit nonlinear features: using the correct model type is indispensable for avoiding biased recommendations for economic policy.
This book deals with the application of wavelet and spectral methods for the analysis of nonlinear and dynamic processes in economics and finance. It reflects some of the latest developments in the area of wavelet methods applied to economics and finance. The topics include business cycle analysis, asset prices, financial econometrics, and forecasting. An introductory paper by James Ramsey, providing a personal retrospective of a decade's research on wavelet analysis, offers an excellent overview of the field.
This is a book on deterministic and stochastic Growth Theory and the computational methods needed to produce numerical solutions. Exogenous and endogenous growth models are thoroughly reviewed. Special attention is paid to the use of these models for fiscal and monetary policy analysis. Modern Business Cycle Theory, New Keynesian Macroeconomics and the class of Dynamic Stochastic General Equilibrium models can all be considered as special cases of models of economic growth, and they can be analyzed by the theoretical and numerical procedures provided in the textbook. Analytical discussions are presented in full detail. The book is self-contained and designed so that the student advances through the theoretical and the computational issues in parallel. EXCEL and Matlab files are provided on an accompanying website (see Preface to the Second Edition) to illustrate theoretical results as well as to simulate the effects of economic policy interventions. The structure of these program files is described in "Numerical exercise" sections, where the output of the programs is also interpreted. The second edition corrects a few typographical errors and improves some notation.
This book reflects the state of the art in nonlinear economic dynamics, financial market modelling and quantitative finance. It contains eighteen papers with topics ranging from disequilibrium macroeconomics, monetary dynamics, monopoly, and financial market and limit order market models with boundedly rational heterogeneous agents to estimation, time series modelling and empirical analysis, and from risk management of interest-rate products, futures price volatility and American option pricing with stochastic volatility to the evaluation of risk and of derivatives in electricity markets. The book illustrates some of the most recent research tools in these areas and will be of interest to economists working in economic dynamics and financial market modelling, to mathematicians interested in applying complexity theory to economics and finance, and to market practitioners and researchers in quantitative finance interested in limit order, futures and electricity market modelling, derivative pricing and risk management.
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on the appropriate model to apply in different contexts and how to implement it. Of particular appeal are the instructions on (i) how to write the code for different SFA models in Stata, (ii) how to write a VBA macro for the repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the code for Nonparametric Convex Frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
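To make the DEA step concrete, the following is a minimal input-oriented, constant-returns-to-scale (CCR) DEA program in Python with made-up data; the volume's own instructions cover Stata code and an Excel Solver VBA macro instead, so treat this purely as an illustrative sketch.

```python
# Input-oriented CRS (CCR) DEA efficiency for one decision-making unit,
# solved as a linear program: min theta s.t. X@lam <= theta*x0, Y@lam >= y0.
# Illustrative sketch with made-up data, not the volume's own code.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],    # inputs: rows = input types, cols = units
              [1.0, 2.0, 1.5, 3.0]])
Y = np.array([[1.0, 2.0, 1.8, 2.5]])   # outputs: rows = output types

def ccr_efficiency(j):
    m, n = X.shape
    s = Y.shape[0]
    x0, y0 = X[:, j], Y[:, j]
    # decision vector: [theta, lambda_1 .. lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    # X @ lam - theta * x0 <= 0   and   -Y @ lam <= -y0
    A_ub = np.vstack([np.hstack([-x0[:, None], X]),
                      np.hstack([np.zeros((s, 1)), -Y])])
    b_ub = np.r_[np.zeros(m), -y0]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for j in range(X.shape[1]):
    print(f"unit {j}: efficiency = {ccr_efficiency(j):.3f}")
```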
Handbook of Field Experiments provides tactics on how to conduct experimental research, along with a comprehensive catalog of new results from research and of areas that remain to be explored. This updated addition to the series includes entire chapters on field experiments, the politics and practice of social experiments, the methodology and practice of RCTs, and the econometrics of randomized experiments. These topics apply to a wide variety of fields, from politics to education to firm productivity, providing readers with a resource that sheds light on timely issues such as robustness and external validity. Separating itself from the circumscribed debates of specialists, this volume surpasses in usefulness the many journal articles and narrowly defined books written by practitioners.
Contents: Introduction to RATS; Stationary Time-Series; Modeling Volatility; Tests for Trends and Unit Roots; Vector Autoregression Analysis; Cointegration and Error Correction; Statistical Tables; References and Additional Readings.
This book is a comprehensive introduction to simulation and modelling techniques and their application in the management of organisations. It is rooted in a thorough understanding of systems theory as applied to organisations and focuses on how this theory can be applied to the econometric models used in their management. The econometric models in this book employ linear and dynamic programming, graph theory, queuing theory, game theory and related methods, and are presented and analysed in various fields of application, such as investment management, stock management, strategic decision making, management of production costs and of the lifecycle costs of quality and non-quality products, and production quality management.
The main purpose of this book is to resolve deficiencies and limitations that currently exist in the use of Technical Analysis (TA). In particular, TA is used either by academics as an "economic test" of the weak-form Efficient Market Hypothesis (EMH) or by practitioners as a main or supplementary tool for deriving trading signals. This book approaches TA in a systematic way, utilizing all the available estimation theory and tests. This is achieved through the development of novel rule-based pattern recognizers and the implementation of statistical tests for assessing the significance of realized returns. Particular emphasis is given to technical patterns where subjectivity in the identification process is apparent. Our proposed methodology is based on algorithmic, and thus unbiased, pattern recognition. The unified methodological framework presented in this book can serve as a benchmark both for future academic studies that test the null hypothesis of the weak-form EMH and for practitioners who want to embed TA within their trading/investment decision-making processes.
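For a flavor of what a rule-based recognizer and a returns test look like in code, here is a deliberately simple moving-average crossover rule in Python on simulated prices; it is a generic stand-in, not one of the book's pattern recognizers.

```python
# Toy rule-based technical-analysis signal: moving-average crossover,
# with a crude significance check on the realized strategy returns.
# A simple stand-in for the book's pattern recognizers, on simulated data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
price = pd.Series(100 + rng.normal(0, 1, 500).cumsum())  # simulated prices

fast = price.rolling(10).mean()
slow = price.rolling(50).mean()
signal = np.sign(fast - slow).fillna(0)    # +1 long, -1 short, 0 undefined

# Realized strategy returns: yesterday's signal applied to today's return.
ret = price.pct_change().fillna(0)
strat = signal.shift(1).fillna(0) * ret
print(f"mean daily return: {strat.mean():.5f}, t-stat: "
      f"{strat.mean() / (strat.std() / np.sqrt(len(strat))):.2f}")
```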
In the era of Big Data, our society has a unique opportunity to understand the inner dynamics and behavior of complex socio-economic systems. Advances in the availability of very large databases, in capabilities for massive data mining, as well as progress in complex systems theory, multi-agent simulation and computational social science, open the possibility of modeling phenomena never successfully modeled before. This contributed volume from the Perm Winter School addresses the mechanisms and statistics of socio-economic system evolution, with a focus on financial markets and high-frequency data analysis.
The purpose of this book is to establish a connection between the traditional field of empirical economic research and the emerging area of empirical financial research, and to build a bridge between theoretical developments in these areas and their application in practice. Accordingly, it covers broad topics in the theory and application of both empirical economic and financial research, including analysis of time series and the business cycle; different forecasting methods; new models for volatility, correlation and high-frequency financial data; and new approaches to panel regression, as well as a number of case studies. Most of the contributions reflect the state of the art on their respective subjects. The book offers a valuable reference work for researchers, university instructors, practitioners, government officials and graduate and post-graduate students, as well as an important resource for advanced seminars in empirical economic and financial research.
Contents:
- Econophysics of Games and Social Choices
- Kolkata Paise Restaurant Problem in Some Uniform Learning Strategy Limits
- Cycle Monotonicity in Scheduling Models
- Reinforced Learning in Market Games
- Mechanisms Supporting Cooperation for the Evolutionary Prisoner's Dilemma Games
- Economic Applications of Quantum Information Processing
- Using Many-Body Entanglement for Coordinated Action in Game Theory Problems
- Condensation Phenomena and Pareto Distribution in Disordered Urn Models
- Economic Interactions and the Distribution of Wealth
- Wealth Redistribution in Boltzmann-like Models of Conservative Economies
- Multi-species Models in Econo- and Sociophysics
- The Morphology of Urban Agglomerations for Developing Countries: A Case Study with China
- A Mean-Field Model of Financial Markets: Reproducing Long Tailed Distributions and Volatility Correlations
- Statistical Properties of Fluctuations: A Method to Check Market Behavior
- Modeling Saturation in Industrial Growth
- The Kuznets Curve and the Inequality Process
- Monitoring the Teaching-Learning Process via an Entropy Based Index
- Technology Level in the Industrial Supply Chain: Thermodynamic Concept
- Discussions and Comments in Econophys Kolkata IV
- Contributions to Quantitative Economics
- On Multi-Utility Representation of Equitable Intergenerational Preferences
- Variable Populations and Inequality-Sensitive Ethical Judgments
- A Model of Income Distribution
- Statistical Database of the Indian Economy: Need for New Directions
- Does Parental Education Protect Child Health? Some Evidence from Rural Udaipur
- Food Security and Crop Diversification: Can West Bengal Achieve Both?
- Estimating Equivalence Scales Through Engel Curve Analysis
- Testing for Absolute Convergence: A Panel Data Approach
- Goodwin's Growth Cycles: A Reconsideration
- Human Capital Accumulation, Economic Growth and Educational Subsidy Policy in a Dual Economy
- Arms Trade and Conflict Resolution: A Trade-Theoretic Analysis
- Trade and Wage Inequality with Endogenous Skill Formation
- Dominant Strategy Implementation in Multi-unit Allocation Problems
- Allocation through Reduction on Minimum Cost Spanning Tree Games
- Unmediated and Mediated Communication Equilibria of Battle of the Sexes with Incomplete Information
- A Characterization Result on the Coincidence of the Prenucleolus and the Shapley Value
- The Ordinal Equivalence of the Johnston Index and the Established Notions of Power
- Reflecting on Market Size and Entry under Oligopoly
Though globalisation of the world economy is currently a powerful force, people’s international mobility still appears to be very limited. The goal of this book is to improve our knowledge of the true effects of migration flows. It includes contributions by prominent academic researchers analysing the socio-economic impact of migration in a variety of contexts: the interconnection of people and trade flows, the causes and consequences of capital remittances, the macroeconomic impact of migration, and the labour market effects of people’s flows. The latest analytical methodologies are employed in all chapters, and interesting policy guidelines emerge from the investigations. The style of the volume makes it accessible to both non-experts and advanced readers interested in this pressing topic.
The goal of this book is to assess the efficacy of India’s financial deregulation programme by analyzing the developments in cost efficiency and total factor productivity growth across different ownership types and size classes in the banking sector over the post-deregulation years. The work also gauges the impact of the inclusion or exclusion of a proxy for non-traditional activities on the cost efficiency estimates for Indian banks, and on the ranking of distinct ownership groups. It also investigates the hitherto neglected question of the nature of returns-to-scale in the Indian banking industry. In addition, the work explores the key bank-specific factors that explain the inter-bank variations in efficiency and productivity growth. Overall, the empirical results of this work allow us to ascertain whether the gradualist approach to reforming the banking system in a developing economy like India has yielded the most significant policy goal of achieving efficiency and productivity gains. The authors believe that the findings of this book could give useful policy directions and suggestions to other developing economies that have embarked on a deregulation path or are contemplating doing so.
From the Introduction: This volume is dedicated to the remarkable career of Professor Peter Schmidt and the role he has played in mentoring us, his PhD students. Peter's accomplishments are legendary among his students and the profession. Each of the papers in this Festschrift is a research work executed by a former PhD student of Peter's, from his days at the University of North Carolina at Chapel Hill to his time at Michigan State University. Most of the papers were presented at The Conference in Honor of Peter Schmidt, June 30 - July 2, 2011. The conference was largely attended by his former students and one current student, who traveled from as far as Europe and Asia to honor Peter. This was a conference to celebrate Peter's contribution to our contributions. By "our contributions" we mean the research papers that make up this Festschrift and the countless other publications by his students represented and not represented in this volume. Peter's students may have their families to thank for much that is positive in their lives. However, if we think about it, our professional lives would not be the same without the lessons and the approaches to decision making that we learned from Peter. We spent our days together at Peter's conference, and the months since, reminded of these aspects of our personalities and life goals that were enhanced, fostered, and nurtured by the very singular experiences we have had as Peter's students. We recognized in 2011 that it was unlikely we would all be together again to celebrate such a wonderful moment in our and Peter's lives, and pledged then to take full advantage of it. We did then, and we do now in the form of this volume.
In 1945, very early in the history of the development of a rigorous analytical theory of probability, Feller (1945) wrote a paper called "The fundamental limit theorems in probability" in which he set out what he considered to be "the two most important limit theorems in the modern theory of probability: the central limit theorem and the recently discovered ... 'Kolmogoroff's celebrated law of the iterated logarithm'". A little later in the article he added to these, via a charming description, the "little brother (of the central limit theorem), the weak law of large numbers", and also the strong law of large numbers, which he considers as a close relative of the law of the iterated logarithm. Feller might well have added to these also the beautiful and highly applicable results of renewal theory, which at the time he himself, together with eminent colleagues, was vigorously producing. Feller's introductory remarks include the visionary: "The history of probability shows that our problems must be treated in their greatest generality: only in this way can we hope to discover the most natural tools and to open channels for new progress. This remark leads naturally to that characteristic of our theory which makes it attractive beyond its importance for various applications: a combination of an amazing generality with algebraic precision."
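For reference, the two theorems Feller singles out, in their classical i.i.d. form (standard statements, not quoted from the book): for $X_1, X_2, \dots$ i.i.d. with mean $\mu$, variance $\sigma^2 \in (0, \infty)$ and partial sums $S_n = X_1 + \dots + X_n$,

\[
\frac{S_n - n\mu}{\sigma \sqrt{n}} \;\xrightarrow{d}\; N(0, 1)
\qquad \text{(central limit theorem)},
\]
\[
\limsup_{n \to \infty} \frac{S_n - n\mu}{\sigma \sqrt{2 n \log \log n}} \;=\; 1
\quad \text{a.s.}
\qquad \text{(law of the iterated logarithm)}.
\]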
Economists can use computer algebra systems to manipulate symbolic models, derive numerical computations, and analyze empirical relationships among variables. Maxima is an open-source, multi-platform computer algebra system that rivals proprietary software. Maxima's symbolic and computational capabilities enable economists and financial analysts to develop a deeper understanding of models by allowing them to explore the implications of differences in parameter values, by providing numerical solutions to problems that would otherwise be intractable, and by providing graphical representations that can guide analysis. This book provides a step-by-step tutorial for using this program to examine the economic relationships that form the core of microeconomics in a way that complements traditional modeling techniques. Readers learn how to phrase the relevant analysis and how symbolic expressions, numerical computations, and graphical representations can be used to learn from microeconomic models. In particular, comparative statics analysis is facilitated. Little has been published on Maxima and its applications in economics and finance, and this volume will appeal to advanced undergraduates, graduate-level students studying microeconomics, academic researchers in economics and finance, economists, and financial analysts.
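The book's tutorial is built around Maxima; as an analogous sketch of the same symbolic workflow in Python's SymPy (not the book's code), here is a comparative-statics calculation for a monopolist facing linear inverse demand p = a - b*q with marginal cost c.

```python
# Comparative statics for a monopolist facing linear inverse demand
# p = a - b*q with constant marginal cost c. An analogous SymPy sketch;
# the book itself works these examples in Maxima.
import sympy as sp

a, b, c, q = sp.symbols("a b c q", positive=True)
profit = (a - b * q) * q - c * q

foc = sp.diff(profit, q)                  # first-order condition
q_star = sp.solve(sp.Eq(foc, 0), q)[0]    # optimal quantity: (a - c)/(2b)
p_star = sp.simplify(a - b * q_star)      # optimal price: (a + c)/2

print(q_star, p_star)
print(sp.diff(p_star, c))                 # cost pass-through to price: 1/2
```

The printed derivative recovers the classic result that this monopolist passes exactly half of a marginal-cost increase through to price.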
You may like...
- Introduction to Econometrics, Global… by James Stock, Mark Watson (Paperback, R2,447)
- Advances in Longitudinal Data Methods in… by Nicholas Tsounis, Aspasia Vlachvei (Hardcover, R7,157)
- Linear and Non-Linear Financial… by Mehmet Kenan Terzioglu, Gordana Djurovic (Hardcover)
- Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,612)
- Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
- Handbook of Research Methods and… by Nigar Hashimzade, Michael A. Thornton (Hardcover, R7,916)