Presents recent developments in the probabilistic assessment of system dependability based on stochastic models, including graph theory, finite-state automata and language theory, for both dynamic and hybrid contexts.
This book provides advanced theoretical and applied tools for the implementation of modern micro-econometric techniques in evidence-based program evaluation for the social sciences. The author presents a comprehensive toolbox for designing rigorous and effective ex-post program evaluation using the statistical software package Stata. For each method, a statistical presentation is developed, followed by a practical estimation of the treatment effects. By using both real and simulated data, readers will become familiar with evaluation techniques such as regression adjustment, matching, difference-in-differences, instrumental variables and regression discontinuity design, and are given practical guidelines for selecting and applying suitable methods for specific policy contexts.
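The difference-in-differences estimator named above reduces, in the simplest two-group, two-period case, to a double subtraction of group means. A minimal sketch in Python (the book itself works in Stata; the data here are invented):

```python
# Difference-in-differences in the 2x2 case: the change in the treated
# group's mean outcome minus the change in the control group's mean,
# netting out the common time trend.
def mean(xs):
    return sum(xs) / len(xs)

def did(treated_pre, treated_post, control_pre, control_post):
    return (mean(treated_post) - mean(treated_pre)) - \
           (mean(control_post) - mean(control_pre))

# Toy outcomes: the treated group gains 3 on average, the control group 1,
# so the estimated treatment effect is 2.
effect = did([1, 2], [4, 5], [1, 2], [2, 3])
```

In practice the same estimate is obtained from a regression of the outcome on group, period and their interaction, which also yields standard errors.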
This book is an introduction to regression analysis, focusing on the practicalities of doing regression analysis on real-life data. Unlike other textbooks on regression, this book is based on the idea that you do not necessarily need to know much about statistics and mathematics to get a firm grip on regression and perform it well. This non-technical point of departure is complemented by practical examples of real-life data analysis using statistical software such as Stata, R and SPSS. Parts 1 and 2 of the book cover the basics, such as simple linear regression, multiple linear regression, how to interpret the output from statistics programs, significance testing and the key regression assumptions. Part 3 deals with how to practically handle violations of the classical linear regression assumptions, regression modeling for categorical y-variables and instrumental variable (IV) regression. Part 4 puts the various purposes of, or motivations for, regression into the wider context of writing a scholarly report and points to some extensions to related statistical techniques. This book is written primarily for those who need to do regression analysis in practice, and not only to understand how this method works in theory. The book's accessible approach is recommended for students from across the social sciences.
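The simple linear regression covered in Parts 1 and 2 comes down to two closed-form formulas: the slope is the covariance of x and y divided by the variance of x, and the intercept follows from the means. A minimal Python sketch (the book itself uses Stata, R and SPSS; the data are made up):

```python
# Ordinary least squares for y = a + b*x with a single regressor.
def ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Points lying exactly on y = 1 + 2x recover intercept 1 and slope 2.
a, b = ols([1, 2, 3, 4], [3, 5, 7, 9])
```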
This book discusses the integrated concepts of statistical quality engineering and management tools. It will help readers to understand and apply the concepts of quality through project management and technical analysis, using statistical methods. Prepared in a ready-to-use form, the text will equip practitioners to implement the Six Sigma principles in projects. The concepts discussed are all critically assessed and explained, allowing them to be practically applied in managerial decision-making, and in each chapter the objectives and connections to the rest of the work are clearly illustrated. To aid understanding, the book includes a wealth of tables, graphs, descriptions and checklists, as well as charts and plots, worked-out examples and exercises. Perhaps the book's most distinctive feature is its use of statistical tools to explain the science behind Six Sigma project management and its integration with engineering concepts. The material on quality engineering and statistical management tools offers valuable support for undergraduate, postgraduate and research students. The book can also serve as a concise guide for Six Sigma professionals, Green Belt, Black Belt and Master Black Belt trainers.
This book presents the latest advances in the theory and practice of Marshall-Olkin distributions. These distributions have been increasingly applied in statistical practice in recent years, as they make it possible to describe interesting features of stochastic models like non-exchangeability, tail dependencies and the presence of a singular component. The book presents cutting-edge contributions in this research area, with a particular emphasis on financial and economic applications. It is recommended for researchers working in applied probability and statistics, as well as for practitioners interested in the use of stochastic models in economics. This volume collects selected contributions from the conference “Marshall-Olkin Distributions: Advances in Theory and Applications,” held in Bologna on October 2-3, 2013.
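The "singular component" mentioned in the blurb — a positive probability that both components take exactly the same value — is easy to see by simulating the classical Marshall-Olkin bivariate exponential construction, in which each component fails at the earlier of its own shock and a common shock. A sketch (parameter values are illustrative):

```python
import random

def marshall_olkin_pair(l1, l2, l12, rng):
    # Individual shocks e1, e2 and a common shock e12, all exponential;
    # each component's lifetime is the minimum of its own and the common shock.
    e1 = rng.expovariate(l1)
    e2 = rng.expovariate(l2)
    e12 = rng.expovariate(l12)
    return min(e1, e12), min(e2, e12)

rng = random.Random(0)
pairs = [marshall_olkin_pair(1.0, 1.0, 1.0, rng) for _ in range(20000)]

# The common shock puts positive mass on the diagonal {x == y}:
# in theory P(X = Y) = l12 / (l1 + l2 + l12), which is 1/3 here.
tie_share = sum(x == y for x, y in pairs) / len(pairs)
```

No absolutely continuous bivariate distribution can produce a positive share of exact ties, which is exactly the feature that makes this family useful for modeling joint defaults.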
This book assesses how efficient primary and upper primary education is across different states of India considering both output oriented and input oriented measures of technical efficiency. It identifies the most important factors that could produce differential efficiency among the states, including the effects of central grants, school-specific infrastructures, social indicators and policy variables, as well as state-specific factors like per capita net state domestic product from the service sector, inequality in the distribution of income (Gini coefficient), the percentage of people living below the poverty line and the density of population. The study covers the period 2005-06 to 2010-11 and all the states and union territories of India, which are categorized into two separate groups, namely: (i) General Category States (GCS); and (ii) Special Category States (SCS) and Union Territories (UT). It uses non-parametric Data Envelopment Analysis (DEA) and obtains the Technology Closeness Ratio (TCR), measuring whether the maximum output producible from an input bundle by a school within a given group is as high as what could be produced if the school could choose to join the other group. The major departure of this book is its approach to estimating technical efficiency (TE), which does not use a single frontier encompassing all the states and UT, as is done in the available literature. Rather, this method assumes that GCS, SCS and UT are not homogeneous and operate under different fiscal and economic conditions.
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
The research presented here focuses on the spatial sampling of agricultural resources. The authors introduce sampling designs and methods for producing accurate estimates of crop production for harvests across different regions and countries. With the help of real and simulated examples performed with the open-source software R, readers will learn about the different phases of spatial data collection. The agricultural data analyzed in this book help policymakers and market stakeholders to monitor the production of agricultural goods and its effects on the environment and food safety.
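The basic design-based idea behind such crop estimates — weight each stratum's sample mean by the stratum's share of the population — can be sketched in a few lines. An illustrative Python version (the book itself works in R; the region sizes and yields below are invented):

```python
# Stratified estimator of a population mean: each stratum's sample mean
# is weighted by the stratum's share of the total population size.
def stratified_mean(strata_sizes, strata_samples):
    total = sum(strata_sizes)
    return sum(size / total * (sum(s) / len(s))
               for size, s in zip(strata_sizes, strata_samples))

# Two regions of 200 and 800 fields; sampled yields (tonnes/ha) per region.
estimate = stratified_mean([200, 800], [[2.0, 2.4], [3.0, 3.2, 3.4]])
```

Stratifying by region typically lowers the variance of the estimate relative to simple random sampling whenever yields are more homogeneous within regions than between them.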
The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research. Many economic problems can be formulated as constrained optimization or equilibrium problems. Various mathematical theories have supplied economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories.
In recent years nonlinearities have gained increasing importance in economic and econometric research, particularly since the financial crisis and the economic downturn after 2007. This book contains theoretical, computational and empirical papers that incorporate nonlinearities in econometric models and apply them to real economic problems. It is intended to inspire researchers to take potential nonlinearities into account. Researchers should be wary of spuriously applying linear model types to problems with nonlinear features: using the correct model type is indispensable for avoiding biased recommendations for economic policy.
This book deals with the application of wavelet and spectral methods for the analysis of nonlinear and dynamic processes in economics and finance. It reflects some of the latest developments in the area of wavelet methods applied to economics and finance. The topics include business cycle analysis, asset prices, financial econometrics, and forecasting. An introductory paper by James Ramsey, providing a personal retrospective of a decade's research on wavelet analysis, offers an excellent overview of the field.
This is a book on deterministic and stochastic growth theory and the computational methods needed to produce numerical solutions. Exogenous and endogenous growth models are thoroughly reviewed. Special attention is paid to the use of these models for fiscal and monetary policy analysis. Modern Business Cycle Theory, New Keynesian Macroeconomics and the class of Dynamic Stochastic General Equilibrium models can all be considered special cases of models of economic growth, and they can be analyzed by the theoretical and numerical procedures provided in the textbook. Analytical discussions are presented in full detail. The book is self-contained and designed so that the student advances in the theoretical and the computational issues in parallel. EXCEL and Matlab files are provided on an accompanying website (see Preface to the Second Edition) to illustrate theoretical results as well as to simulate the effects of economic policy interventions. The structure of these program files is described in "Numerical exercise"-type sections, where the output of these programs is also interpreted. The second edition corrects a few typographical errors and improves some notation.
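The deterministic core of an exogenous growth model of the kind reviewed here can be simulated in a few lines. Below is a sketch of the textbook Solow capital-accumulation recursion in Python (the book's own programs are in EXCEL and Matlab; the parameter values are illustrative):

```python
# Solow growth: next period's capital is saved output plus undepreciated
# capital, k' = s * k**alpha + (1 - delta) * k.  The path converges to the
# steady state k* = (s / delta) ** (1 / (1 - alpha)).
def solow_path(k0, s=0.3, alpha=0.3, delta=0.1, periods=500):
    path = [k0]
    for _ in range(periods):
        k = path[-1]
        path.append(s * k ** alpha + (1 - delta) * k)
    return path

path = solow_path(1.0)
k_star = (0.3 / 0.1) ** (1 / (1 - 0.3))  # analytical steady state
```

Starting below the steady state, capital grows monotonically toward k*, which is the convergence property the comparative policy experiments in such models build on.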
This book reflects the state of the art in nonlinear economic dynamics, financial market modelling and quantitative finance. It contains eighteen papers on topics ranging from disequilibrium macroeconomics, monetary dynamics, monopoly, and financial and limit order market models with boundedly rational heterogeneous agents to estimation, time series modelling and empirical analysis, and from risk management of interest-rate products, futures price volatility and American option pricing with stochastic volatility to the evaluation of risk and derivatives in the electricity market. The book illustrates some of the most recent research tools in these areas and will be of interest to economists working in economic dynamics and financial market modelling, to mathematicians interested in applying complexity theory to economics and finance, and to market practitioners and researchers in quantitative finance interested in limit order, futures and electricity market modelling, derivative pricing and risk management.
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on the appropriate model to apply in different contexts and how to implement them. Of particular appeal are the instructions on (i) how to write the code for different SFA models in Stata, (ii) how to write a VBA macro for repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the code for Nonparametric Convex Frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
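In the simplest special case — one input, one output, constant returns to scale — the DEA frontier described above reduces to the best output-per-input ratio, and each unit's efficiency score is its own ratio relative to that best. A deliberately minimal Python sketch (general DEA solves a linear program per unit, as the volume's Excel Solver instructions do; the data here are invented):

```python
# Single-input, single-output DEA under constant returns to scale:
# the frontier is the unit with the highest output/input ratio, and
# every unit is scored relative to it (1.0 = efficient).
def dea_crs_efficiency(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three units: units 1 and 3 sit on the frontier, unit 2 is half-efficient.
scores = dea_crs_efficiency([2, 4, 5], [4, 4, 10])
```

With multiple inputs and outputs the ratio comparison no longer suffices, which is exactly why the full method needs the linear-programming formulation.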
This volume systematically details both the basic principles and new developments in Data Envelopment Analysis (DEA), offering a solid understanding of the methodology, its uses, and its potential. New material in this edition includes coverage of recent developments that have greatly extended the power and scope of DEA and have led to new directions for research and DEA uses. Each chapter accompanies its developments with simple numerical examples and discussions of actual applications. The first nine chapters cover the basic principles of DEA, while the final seven chapters provide a more advanced treatment.
Who decides how official statistics are produced? Do politicians have control or are key decisions left to statisticians in independent statistical agencies? Interviews with statisticians in Australia, Canada, Sweden, the UK and the USA were conducted to get insider perspectives on the nature of decision making in government statistical administration. While the popular adage suggests there are 'lies, damned lies and statistics', this research shows that official statistics in liberal democracies are far from mistruths; they are consistently insulated from direct political interference. Yet, a range of subtle pressures and tensions exist that governments and statisticians must manage. The power over statistics is distributed differently in different countries, and this book explains why. Differences in decision-making powers across countries are the result of shifting pressures politicians and statisticians face to be credible, and the different national contexts that provide distinctive institutional settings for the production of government numbers.
This book is a comprehensive introduction to simulation and modelling techniques and their application in the management of organisations. The book is rooted in a thorough understanding of systems theory applied to organisations and focuses on how this theory can apply to econometric models used in the management of organisations. The econometric models in this book employ linear and dynamic programming, graph theory, queuing theory, game theory and related methods, and are presented and analysed in various fields of application, such as investment management, stock management, strategic decision making, management of production costs and the lifecycle costs of quality and non-quality products, and production quality management.
The main purpose of this book is to resolve deficiencies and limitations that currently exist in the use of Technical Analysis (TA). In particular, TA is used either by academics as an "economic test" of the weak-form Efficient Market Hypothesis (EMH) or by practitioners as a main or supplementary tool for deriving trading signals. This book approaches TA in a systematic way, utilizing all the available estimation theory and tests. This is achieved through the development of novel rule-based pattern recognizers and the implementation of statistical tests for assessing the significance of realized returns. More emphasis is given to technical patterns where subjectivity in the identification process is apparent. The proposed methodology is based on algorithmic, and thus unbiased, pattern recognition. The unified methodological framework presented in this book can serve as a benchmark both for future academic studies that test the null hypothesis of the weak-form EMH and for practitioners who want to embed TA within their trading/investment decision-making processes.
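A deliberately simple example of an algorithmic, and therefore reproducible, trading rule of the kind whose returns can then be tested statistically: signal 1 when the price closes above its trailing moving average, 0 otherwise. This is not one of the book's own pattern recognizers; the rule, window and prices below are illustrative:

```python
# Emit signal 1 when today's price exceeds the moving average of the
# previous w prices, else 0.  Because the rule is fully mechanical,
# two researchers running it on the same data get identical signals.
def ma_signals(prices, w):
    signals = []
    for i in range(w, len(prices)):
        ma = sum(prices[i - w:i]) / w
        signals.append(1 if prices[i] > ma else 0)
    return signals

sig = ma_signals([10, 10, 10, 12, 9, 13], 3)
```

Removing human judgment from signal generation is precisely what makes the subsequent significance tests on realized returns well defined.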
In the era of Big Data, our society has a unique opportunity to understand the inner dynamics and behavior of complex socio-economic systems. Advances in the availability of very large databases, in capabilities for massive data mining, as well as progress in complex systems theory, multi-agent simulation and computational social science, open the possibility of modeling phenomena never before successfully modeled. This contributed volume from the Perm Winter School addresses the mechanisms and statistics of socio-economic system evolution, with a focus on financial markets, powered by high-frequency data analysis.
The purpose of this book is to establish a connection between the traditional field of empirical economic research and the emerging area of empirical financial research and to build a bridge between theoretical developments in these areas and their application in practice. Accordingly, it covers broad topics in the theory and application of both empirical economic and financial research, including analysis of time series and the business cycle; different forecasting methods; new models of volatility, correlation and high-frequency financial data; and new approaches to panel regression, as well as a number of case studies. Most of the contributions reflect the state of the art on their respective subjects. The book offers a valuable reference work for researchers, university instructors, practitioners, government officials and graduate and post-graduate students, as well as an important resource for advanced seminars in empirical economic and financial research.
Contents:
- Econophysics of Games and Social Choices
- Kolkata Paise Restaurant Problem in Some Uniform Learning Strategy Limits
- Cycle Monotonicity in Scheduling Models
- Reinforced Learning in Market Games
- Mechanisms Supporting Cooperation for the Evolutionary Prisoner's Dilemma Games
- Economic Applications of Quantum Information Processing
- Using Many-Body Entanglement for Coordinated Action in Game Theory Problems
- Condensation Phenomena and Pareto Distribution in Disordered Urn Models
- Economic Interactions and the Distribution of Wealth
- Wealth Redistribution in Boltzmann-like Models of Conservative Economies
- Multi-species Models in Econo- and Sociophysics
- The Morphology of Urban Agglomerations for Developing Countries: A Case Study with China
- A Mean-Field Model of Financial Markets: Reproducing Long Tailed Distributions and Volatility Correlations
- Statistical Properties of Fluctuations: A Method to Check Market Behavior
- Modeling Saturation in Industrial Growth
- The Kuznets Curve and the Inequality Process
- Monitoring the Teaching-Learning Process via an Entropy Based Index
- Technology Level in the Industrial Supply Chain: Thermodynamic Concept
- Discussions and Comments in Econophys Kolkata IV
- Contributions to Quantitative Economics
- On Multi-Utility Representation of Equitable Intergenerational Preferences
- Variable Populations and Inequality-Sensitive Ethical Judgments
- A Model of Income Distribution
- Statistical Database of the Indian Economy: Need for New Directions
- Does Parental Education Protect Child Health? Some Evidence from Rural Udaipur
- Food Security and Crop Diversification: Can West Bengal Achieve Both?
- Estimating Equivalence Scales Through Engel Curve Analysis
- Testing for Absolute Convergence: A Panel Data Approach
- Goodwin's Growth Cycles: A Reconsideration
- Human Capital Accumulation, Economic Growth and Educational Subsidy Policy in a Dual Economy
- Arms Trade and Conflict Resolution: A Trade-Theoretic Analysis
- Trade and Wage Inequality with Endogenous Skill Formation
- Dominant Strategy Implementation in Multi-unit Allocation Problems
- Allocation through Reduction on Minimum Cost Spanning Tree Games
- Unmediated and Mediated Communication Equilibria of Battle of the Sexes with Incomplete Information
- A Characterization Result on the Coincidence of the Prenucleolus and the Shapley Value
- The Ordinal Equivalence of the Johnston Index and the Established Notions of Power
- Reflecting on Market Size and Entry under Oligopoly
Though globalisation of the world economy is currently a powerful force, people’s international mobility appears to still be very limited. The goal of this book is to improve our knowledge of the true effects of migration flows. It includes contributions by prominent academic researchers analysing the socio-economic impact of migration in a variety of contexts: interconnection of people and trade flows, causes and consequences of capital remittances, understanding the macroeconomic impact of migration and the labour market effects of people’s flows. The latest analytical methodologies are employed in all chapters, while interesting policy guidelines emerge from the investigations. The style of the volume makes it accessible for both non-experts and advanced readers interested in this hot topic of today’s world.
The goal of this book is to assess the efficacy of India’s financial deregulation programme by analyzing the developments in cost efficiency and total factor productivity growth across different ownership types and size classes in the banking sector over the post-deregulation years. The work also gauges the impact of the inclusion or exclusion of a proxy for non-traditional activities on the cost efficiency estimates for Indian banks and on the ranking of distinct ownership groups. It also investigates the hitherto neglected aspect of the nature of returns-to-scale in the Indian banking industry. In addition, the work explores the key bank-specific factors that explain the inter-bank variations in efficiency and productivity growth. Overall, the empirical results of this work allow us to ascertain whether the gradualist approach to reforming the banking system in a developing economy like India has yielded the most significant policy goal of achieving efficiency and productivity gains. The authors believe that the findings of this book could give useful policy directions and suggestions to other developing economies that have embarked on a deregulation path or are contemplating doing so.
From the Introduction: This volume is dedicated to the remarkable career of Professor Peter Schmidt and the role he has played in mentoring us, his PhD students. Peter's accomplishments are legendary among his students and the profession. Each of the papers in this Festschrift is a research work executed by a former PhD student of Peter's, from his days at the University of North Carolina at Chapel Hill to his time at Michigan State University. Most of the papers were presented at The Conference in Honor of Peter Schmidt, June 30 - July 2, 2011. The conference was largely attended by his former students and one current student, who traveled from as far as Europe and Asia to honor Peter. This was a conference to celebrate Peter's contribution to our contributions. By "our contributions" we mean the research papers that make up this Festschrift and the countless other publications by his students represented and not represented in this volume. Peter's students may have their families to thank for much that is positive in their lives. However, if we think about it, our professional lives would not be the same without the lessons and the approaches to decision making that we learned from Peter. We spent our days together at Peter's conference, and the months since, reminded of those aspects of our personalities and life goals that were enhanced, fostered, and nurtured by the very singular experiences we have had as Peter's students. We recognized in 2011 that it was unlikely we would all be together again to celebrate such a wonderful moment in our lives and Peter's, and pledged then to take full advantage of it. We did then, and we are now in the form of this volume.
In 1945, very early in the history of the development of a rigorous analytical theory of probability, Feller (1945) wrote a paper called "The fundamental limit theorems in probability" in which he set out what he considered to be "the two most important limit theorems in the modern theory of probability: the central limit theorem and the recently discovered ... 'Kolmogoroff's celebrated law of the iterated logarithm'". A little later in the article he added to these, via a charming description, the "little brother (of the central limit theorem), the weak law of large numbers", and also the strong law of large numbers, which he considers a close relative of the law of the iterated logarithm. Feller might well have added to these the beautiful and highly applicable results of renewal theory, which at the time he himself, together with eminent colleagues, was vigorously producing. Feller's introductory remarks include the visionary: "The history of probability shows that our problems must be treated in their greatest generality: only in this way can we hope to discover the most natural tools and to open channels for new progress. This remark leads naturally to that characteristic of our theory which makes it attractive beyond its importance for various applications: a combination of an amazing generality with algebraic precision."
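The weak law of large numbers that Feller calls the central limit theorem's "little brother" is easy to demonstrate numerically: sample means of uniform draws concentrate around the true mean 1/2 as the sample size grows. A quick sketch:

```python
import random

def sample_mean(n, rng):
    # Mean of n uniform(0, 1) draws; the true expectation is 0.5.
    return sum(rng.random() for _ in range(n)) / n

rng = random.Random(42)
# Absolute deviation of the sample mean from 0.5 at increasing sample sizes;
# by the weak law it shrinks toward zero (at rate 1/sqrt(n), per the CLT).
deviations = {n: abs(sample_mean(n, rng) - 0.5) for n in (10, 1000, 100000)}
```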