Microsimulation models provide an exciting new tool for analysing the distributional impact and cost of government policy changes. They can also be used to analyse the current or future structure of society. This volume contains papers describing new developments at the frontiers of microsimulation modelling, and draws upon experience in a wide range of countries. Some papers share with other modellers the experience gained in designing and running microsimulation models and in using them in government policy formulation; they also examine issues at the frontiers of the discipline, such as how to include usage of health, education and welfare services in models. Other chapters focus on the innovative new approaches being taken in dynamic microsimulation modelling, and describe some of the policy applications for which dynamic models are being used in Europe, Australia and New Zealand. Topics covered include retirement income modelling, pension reform, the behavioural impact of tax changes, child care demand, and the inclusion of government services within models. Attention is also given to validating the results of models and estimating their statistical reliability.
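To make concrete what a static microsimulation does, here is a minimal sketch (illustrative only; the synthetic incomes, the single-bracket tax rule, and all rates are assumptions, not taken from any model in the volume) that applies a policy change to simulated individuals and tabulates its distributional impact and cost:

```python
import numpy as np

rng = np.random.default_rng(0)
income = rng.lognormal(mean=10, sigma=0.6, size=10_000)  # synthetic gross incomes

def tax(income, threshold=30_000.0, rate=0.30):
    """Simple single-bracket tax rule applied to each simulated individual."""
    return np.maximum(income - threshold, 0.0) * rate

# Compare the current rule with a hypothetical reform (higher marginal rate).
baseline = income - tax(income)
reform = income - tax(income, rate=0.35)

# Distributional impact: mean loss of disposable income by income decile.
deciles = np.digitize(income, np.quantile(income, np.arange(0.1, 1.0, 0.1)))
for d in range(10):
    loss = (baseline - reform)[deciles == d].mean()
    print(f"decile {d + 1}: mean loss {loss:8.2f}")

# Aggregate cost of the reform to households (= revenue gained by government).
print("total revenue change:", (baseline - reform).sum())
```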
First published in 1992, The Efficiency of New Issue Markets provides a comprehensive overview of under-pricing and, through this, assesses the efficiency of new issue markets. The book further develops the adverse selection model of the new issue market and addresses the hypothesis that the method of distribution of new issues has an important bearing on the efficiency of these markets. To this end, it tests the efficiency of the Offer for Sale new issue market, demonstrating the validity of the adverse selection model and contradicting the monopsony power hypothesis. It then examines the relative efficiency of the new issue markets, demonstrating the importance of distribution in determining relative efficiency.
This book explores the possibility of using social media data to detect socio-economic recovery activities. In the last decade, there have been intensive research activities focusing on social media during and after disasters. This approach, which views people's communication on social media as a sensor for real-time situations, has been widely adopted as the "people as sensors" approach. Furthermore, to improve recovery efforts after large-scale disasters, detecting communities' real-time recovery situations is essential, since conventional socio-economic recovery indicators, such as governmental statistics, are not published in real time. Thanks to its timeliness, social media data can fill this gap. Motivated by this possibility, the book focuses on the relationships between people's communication on Twitter and Facebook pages and socio-economic recovery activities as reflected in used-car market data and housing market data, in the case of two major disasters: the Great East Japan Earthquake and Tsunami of 2011 and Hurricane Sandy in 2012. The book pursues an interdisciplinary approach, combining disaster recovery studies, crisis informatics, and economics, among other fields. In terms of its contributions, firstly, the book sheds light on the "people as sensors" approach for detecting socio-economic recovery activities, which has not been thoroughly studied to date but has the potential to improve situation awareness during the recovery phase. Secondly, the book proposes new socio-economic recovery indicators: used-car market data and housing market data. Thirdly, in the context of using social media during the recovery phase, the results demonstrate the importance of distinguishing between social media data posted by people who are at or near disaster-stricken areas and data posted by those who are farther away.
This book provides the tools and concepts necessary to study the behavior of econometric estimators and test statistics in large samples. An econometric estimator is a solution to an optimization problem; that is, a problem that requires a body of techniques to determine, within a defined set of possible alternatives, the specific solution that best satisfies a selected objective function or set of constraints. Thus, this highly mathematical book investigates situations involving large numbers of observations in which the assumptions of the classical linear model fail. Economists, of course, face these situations often.
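To illustrate the "estimator as solution to an optimization problem" idea, here is a minimal sketch (not from the book) showing the OLS estimator obtained both by numerically minimizing its sum-of-squares objective function and from the closed-form normal equations:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept plus one regressor
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(size=n)

# OLS as an optimization problem: minimize the sum-of-squares objective over b.
objective = lambda b: np.sum((y - X @ b) ** 2)
beta_numeric = minimize(objective, x0=np.zeros(2)).x

# The same estimator in closed form, via the normal equations X'X b = X'y.
beta_closed = np.linalg.solve(X.T @ X, X.T @ y)

print(beta_numeric, beta_closed)   # both approach beta_true as n grows (consistency)
```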
This book presents an extensive survey of the theory and empirics of international parity conditions, which are critical to our understanding of the linkages between world markets and the movement of interest and exchange rates across countries. The book falls into three parts, dealing with the theory, methods of econometric testing, and existing empirical evidence. Although it is intended to provide a consensus view on the subject, the authors also make some controversial propositions, particularly on the purchasing power parity conditions.
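For orientation, two of the parity conditions at the centre of this literature can be written in standard textbook form (conventional statements, not quotations from the book):

```latex
% Absolute purchasing power parity: the nominal exchange rate S_t
% (domestic currency per unit of foreign currency) equates price levels.
S_t = \frac{P_t}{P_t^{*}}

% Uncovered interest parity: the interest differential equals the expected
% depreciation of the domestic currency, with s_t = \log S_t.
i_t - i_t^{*} = \mathbb{E}_t\left[ s_{t+1} - s_t \right]
```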
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics and stochastics, international macroeconomics, and financial econometrics. Financial Mathematics, Volatility and Covariance Modelling: Volume 2 provides a key repository on the current state of knowledge, the latest debates and recent literature on financial mathematics, volatility and covariance modelling. The first section is devoted to mathematical finance, stochastic modelling and control optimization. Chapters explore the recent financial crisis and the increase in uncertainty and volatility, and propose alternative approaches to dealing with these issues. The second section covers financial volatility and covariance modelling, and explores proposals for dealing with recent developments in financial econometrics. The book will be useful to students and researchers in applied econometrics, to academics and students seeking convenient access to an unfamiliar area, and to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
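As a taste of the volatility modelling covered in the second section, here is a self-contained numpy sketch (illustrative parameters, not drawn from the book) of the GARCH(1,1) recursion and the volatility clustering it produces:

```python
import numpy as np

# GARCH(1,1): sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
# Parameters are illustrative, chosen so that alpha + beta < 1 (stationarity).
omega, alpha, beta = 0.05, 0.08, 0.90

rng = np.random.default_rng(2)
T = 2000
r = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = omega / (1 - alpha - beta)   # unconditional variance as start value

for t in range(1, T):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

def acf1(x):
    """First-order autocorrelation of a series."""
    x = x - x.mean()
    return (x[1:] * x[:-1]).sum() / (x * x).sum()

# Volatility clustering: squared returns are autocorrelated, returns are not.
print("ACF(1) of returns:        ", round(acf1(r), 3))
print("ACF(1) of squared returns:", round(acf1(r ** 2), 3))
```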
In many applications of econometrics and economics, a large proportion of the questions of interest are questions of identification. An economist may be interested in uncovering the true signal when the data are very noisy, as in the time-series spurious regression and weak instruments problems, to name a few. In this book, High-Dimensional Econometrics and Identification, we illustrate that the true signal, and hence identification, can be recovered even from noisy high-dimensional data, e.g., large panels. High-dimensional data in econometrics is the rule rather than the exception; one of the tools for analyzing large, high-dimensional data is the panel data model. High-Dimensional Econometrics and Identification grew out of our collaborative research on identification and high-dimensional econometrics over the years, and it aims to provide an up-to-date presentation of the issues of identification and high-dimensional econometrics, as well as insights into the use of these results in empirical studies. The book is designed for high-level graduate courses in econometrics and statistics, and can also be used as a reference for researchers.
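As a minimal illustration of the panel tools the book builds on, the following sketch (synthetic data; not the authors' code) applies the within transformation to sweep out fixed effects in a large-N panel:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 200, 10                      # many cross-sectional units, short time span
alpha = rng.normal(size=N)          # unobserved unit fixed effects
x = rng.normal(size=(N, T))
y = 1.5 * x + alpha[:, None] + rng.normal(size=(N, T))

# Within transformation: demean each unit's series to remove the fixed effects.
x_dm = x - x.mean(axis=1, keepdims=True)
y_dm = y - y.mean(axis=1, keepdims=True)

beta_fe = (x_dm * y_dm).sum() / (x_dm ** 2).sum()
print("within estimate of beta:", beta_fe)   # close to the true value 1.5
```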
This handbook covers DEA topics that are extensively used and solidly based. The purpose of the handbook is to (1) describe and elucidate the state of the field and (2), where appropriate, extend the frontier of DEA research. It defines the state of the art of DEA methodology and its uses, and is intended to represent a milestone in the progression of DEA. Written by experts who are generally major contributors to the topics covered, it includes a comprehensive review and discussion of basic DEA models, extensions to the basic DEA methods, and a collection of DEA applications in the areas of banking, engineering, health care, and services. The handbook's chapters are organized into two categories: (i) basic DEA models, concepts, and their extensions, and (ii) DEA applications. First edition contributors have returned to update their work, and the second edition includes updated versions of selected first edition chapters. New chapters have been added on: approaches that require no a priori choices of weights (called multipliers) reflecting meaningful trade-offs; construction of static and dynamic DEA technologies; the slacks-based model and its extensions; DEA models for DMUs that have internal structures; network DEA that can be used for measuring supply chain operations; and a selection of DEA applications in the service sector with a focus on building a conceptual framework, research design and interpreting results.
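To give a flavour of the basic DEA machinery, here is a sketch of the input-oriented CCR envelopment model solved as a linear program (toy data; scipy's linprog is used for the LP, one of several reasonable choices):

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 DMUs, 2 inputs (rows of X), 1 output (rows of Y). Values illustrative.
X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],
              [3.0, 3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0]])
n = X.shape[1]

def ccr_efficiency(j0):
    """Input-oriented CCR score of DMU j0: min theta s.t. X@lam <= theta*x0, Y@lam >= y0."""
    c = np.r_[1.0, np.zeros(n)]                 # decision vars: [theta, lambda_1..lambda_n]
    A_ub = np.vstack([
        np.c_[-X[:, [j0]], X],                  # X @ lam - theta * x0 <= 0
        np.c_[np.zeros((Y.shape[0], 1)), -Y],   # -Y @ lam <= -y0
    ])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                              # optimal theta; 1 means efficient

for j in range(n):
    print(f"DMU {j + 1}: efficiency = {ccr_efficiency(j):.3f}")
```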
Volume 1 covers statistical methods related to unit roots, trend breaks and their interplay. Testing for unit roots has been a topic of wide interest, and the author was at the forefront of this research. The book covers important topics such as the Phillips-Perron unit root test and theoretical analyses of its properties, how this and other tests could be improved, the ingredients needed to achieve better tests, and the proposal of a new class of tests. Also included are theoretical studies related to time series models with unit roots and the effect of span versus sampling interval on the power of the tests. Moreover, this volume deals with the issue of trend breaks and their effect on unit root tests. The research agenda fostered by the author showed that trend breaks and unit roots can easily be confused; hence the need for the new testing procedures covered here. Volume 2 is about statistical methods related to structural change in time series models. The approach adopted is off-line, whereby one tests for structural change using a historical dataset and performs hypothesis testing. A distinctive feature is the allowance for multiple structural changes. The methods discussed have been, and continue to be, applied in a variety of fields including economics, finance, life science, physics and climate change. The articles included address issues of estimation, testing and/or inference in a variety of models: short-memory regressors and errors, trends with integrated and/or stationary errors, autoregressions, cointegrated models, multivariate systems of equations, endogenous regressors, long-memory series, among others. Other issues covered include the problem of non-monotonic power and the pitfalls of adopting a local asymptotic framework. Empirical analyses are provided for the US real interest rate, the US GDP, the volatility of asset returns and climate change.
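As a pointer to the kind of testing discussed in Volume 1, the following sketch runs an augmented Dickey-Fuller unit root test on a simulated random walk and a stationary AR(1) (illustrative data; statsmodels' adfuller is used here, while a Phillips-Perron implementation is available in, e.g., arch.unitroot):

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
T = 500
random_walk = np.cumsum(rng.normal(size=T))    # unit-root process
stationary = np.empty(T)
stationary[0] = 0.0
for t in range(1, T):                          # AR(1) with coefficient 0.5
    stationary[t] = 0.5 * stationary[t - 1] + rng.normal()

# The null of a unit root should not be rejected for the random walk,
# and should be rejected for the stationary AR(1).
for name, series in [("random walk", random_walk), ("AR(1)", stationary)]:
    stat, pvalue = adfuller(series)[:2]
    print(f"{name:12s} ADF stat = {stat:6.2f}, p-value = {pvalue:.3f}")
```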
This volume of Advances in Econometrics contains a selection of papers presented at the "Econometrics of Complex Survey Data: Theory and Applications" conference organized by the Bank of Canada, Ottawa, on October 19-20, 2017. The papers included in this volume span a range of methodological and practical topics, including survey collection comparisons, imputation mechanisms, the bootstrap, nonparametric techniques, specification tests, and empirical likelihood estimation using complex survey data. For academics and students with an interest in econometrics and the ways in which complex survey data can be used and evaluated, this volume is essential.
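To illustrate one of the volume's recurring tools, here is a minimal bootstrap sketch for a survey-weighted mean (synthetic data and weights; a genuine complex-survey bootstrap would resample within strata or clusters and rescale the weights, which is only noted here):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000
y = rng.normal(loc=50, scale=10, size=n)   # survey responses
w = rng.uniform(0.5, 2.0, size=n)          # illustrative sampling weights

point = np.average(y, weights=w)           # weighted point estimate

# Naive bootstrap of the weighted mean: resample respondents with replacement.
B = 2_000
idx = rng.integers(0, n, size=(B, n))
boot = np.array([np.average(y[i], weights=w[i]) for i in idx])

se = boot.std(ddof=1)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"estimate {point:.2f}, bootstrap SE {se:.3f}, 95% CI ({lo:.2f}, {hi:.2f})")
```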
The goal of this book is to assess the efficacy of India's financial deregulation programme by analysing the developments in cost efficiency and total factor productivity growth across different ownership types and size classes in the banking sector over the post-deregulation years. The work also gauges the impact of including or excluding a proxy for non-traditional activities on the cost efficiency estimates for Indian banks and on the ranking of distinct ownership groups. It also investigates the hitherto neglected question of the nature of returns-to-scale in the Indian banking industry. In addition, the work explores the key bank-specific factors that explain inter-bank variations in efficiency and productivity growth. Overall, the empirical results allow us to ascertain whether the gradualist approach to reforming the banking system in a developing economy like India has achieved the most significant policy goal: efficiency and productivity gains. The authors believe that the findings of this book could give useful policy directions and suggestions to other developing economies that have embarked on a deregulation path or are contemplating doing so.
Herbert Scarf is a highly esteemed and distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. Above all, however, the name of Scarf is remembered as a synonym for the computation of economic equilibria and fixed points: in the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. The present collection comprises all his research articles in four volumes; this volume collects Herbert Scarf's papers in the area of Applied Equilibrium Analysis.
This book presents a theory of general dynamic economic equilibrium that develops the static theory of Walras and Pareto. The work builds an analytical model of the effective, current movement of an economic system, founded on the logic of individuals' changing programmes - a basis for discovering the laws of all types of endogenous and exogenous movements of the economy. Indeed, the model can be used to treat the typical problems of dynamic economics by means of the author's method of variational dynamic analysis.
The book describes the structure of the Keynes-Leontief Model (KLM) of Japan and discusses how the Japanese economy can overcome the long-term deflation that has persisted since the mid-1990s. Large-scale econometric models and their analysis have been important for planning policy measures and examining the economic structure of a country; however, the development and maintenance of a KLM can be very costly. The book discusses how the KLM is developed and employed for policy analyses.
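The Leontief input-output core at the heart of any KLM can be sketched in a few lines (the coefficient matrix and demand vector below are illustrative, not taken from the Japanese model):

```python
import numpy as np

# Leontief input-output core: gross output x satisfies x = A x + d,
# so x = (I - A)^{-1} d, where A holds input coefficients and d is final demand.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])          # illustrative 2-sector coefficient matrix
d = np.array([100.0, 50.0])         # final demand by sector

x = np.linalg.solve(np.eye(2) - A, d)
print("gross output needed per sector:", x)

# A demand-side shock (e.g., stimulus to sector 1) propagates through the
# inter-industry structure via the Leontief inverse.
d_shock = d + np.array([10.0, 0.0])
print("output response to the shock:", np.linalg.solve(np.eye(2) - A, d_shock) - x)
```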
Technical analysis holds that the best source of information for beating the market is the price itself. Introducing readers to technical analysis in a succinct and practical way, Ramlall focuses on the key aspects, benefits, drawbacks, and main tools of technical analysis. Chart patterns, Point & Figure, stochastics, sentiment indicators, Elliott Wave Theory, RSI, R, candlesticks and more are covered, including both the concepts and their practical applications. The book also covers programming technical analysis tools, making it valuable for both researchers and practitioners.
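As an example of one covered indicator, here is a short sketch computing the RSI with Wilder's smoothing on a synthetic price series (an illustrative implementation, not code from the book):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
price = pd.Series(100 + np.cumsum(rng.normal(0, 1, 300)))  # synthetic prices

def rsi(price, window=14):
    """Relative Strength Index via Wilder's exponential smoothing of gains/losses."""
    delta = price.diff()
    gain = delta.clip(lower=0).ewm(alpha=1 / window, adjust=False).mean()
    loss = (-delta.clip(upper=0)).ewm(alpha=1 / window, adjust=False).mean()
    rs = gain / loss
    return 100 - 100 / (1 + rs)

r = rsi(price)
print(r.tail())
# Common (illustrative) reading: RSI above 70 flags overbought, below 30 oversold.
```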
It is impossible to understand modern economics without knowledge of the basic tools of game theory and mechanism design. This book provides a graduate-level introduction to the economic modeling of strategic behavior. The goal is to teach Economics doctoral students the tools of game theory and mechanism design that all economists should know.
A Guide to Modern Econometrics, Fifth Edition has become established as a highly successful textbook. It serves as a guide to alternative techniques in econometrics with an emphasis on intuition and the practical implementation of these approaches. This fifth edition builds upon the success of its predecessors. The text has been carefully checked and updated, taking into account recent developments and insights. It includes new material on causal inference, the use and limitations of p-values, instrumental variables estimation and its implementation, regression discontinuity design, standardized coefficients, and the presentation of estimation results.
Volumes 45a and 45b of Advances in Econometrics honor Joon Y. Park, Wisnewsky Professor of Human Studies and Professor of Economics at Indiana University. Professor Park has made numerous substantive contributions to the field of econometrics since beginning his academic career in the mid-1980s, and has held positions at Cornell University, University of Toronto, Seoul National University, Rice University, Texas A&M University, and Sungkyunkwan University. This first volume, Essays in Honor of Joon Y. Park: Econometric Theory, features contributions to econometric theory related to Professor Park's analysis of time series, particularly the research of the first two decades or so of his career.
Originally published in 1971, this is a rigorous analysis of the economic aspects of the efficiency of public enterprises at the time. The author first restates and extends the relevant parts of welfare economics, and then illustrates its application to particular cases, drawing on the work of the National Board for Prices and Incomes, of which he was Deputy Chairman. The analysis is developed stage by stage, with the emphasis on applicability and ease of comprehension, rather than on generality or mathematical elegance. Financial performance, the second-best, the optimal degree of complexity of price structures and problems of optimal quality are first discussed in a static framework. Time is next introduced, leading to a marginal cost concept derived from a multi-period optimizing model. The analysis is then related to urban transport, shipping, gas and coal. This is likely to become a standard work of more general scope than the author's earlier book on electricity supply. It rests, however, on a similar combination of economic theory and high-level experience of the real problems of public enterprises.
In this testament to the distinguished career of H.S. Houthakker, a number of Professor Houthakker's friends, former colleagues and former students offer essays which build upon and extend his many contributions to economics in aggregation, consumption, growth and trade. Among the many distinguished contributors are Paul Samuelson, Werner Hildenbrand, John Muellbauer and Lester Telser. The book also includes four previously unpublished papers and notes by its distinguished dedicatee.
The main purpose of this book is to resolve deficiencies and limitations that currently exist in the use of Technical Analysis (TA). In particular, TA is used either by academics as an "economic test" of the weak-form Efficient Market Hypothesis (EMH) or by practitioners as a main or supplementary tool for deriving trading signals. This book approaches TA in a systematic way, utilizing all the available estimation theory and tests. This is achieved through the development of novel rule-based pattern recognizers and the implementation of statistical tests for assessing the significance of realized returns. More emphasis is given to technical patterns where subjectivity in the identification process is apparent. The proposed methodology is based on algorithmic, and thus unbiased, pattern recognition. The unified methodological framework presented in this book can serve as a benchmark both for future academic studies that test the null hypothesis of the weak-form EMH and for practitioners who want to embed TA within their trading/investment decision-making processes.
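To suggest what an algorithmic, rule-based pattern recognizer looks like in its simplest form, here is a sketch that mechanically flags "double top" patterns from local maxima and tabulates realized returns after detection (the rule, tolerance, and horizon are illustrative assumptions, far simpler than the recognizers developed in the book):

```python
import numpy as np

rng = np.random.default_rng(7)
price = 100 + np.cumsum(rng.normal(0, 1, 500))   # synthetic price path

def local_maxima(p, order=5):
    """Indices that are maxima over a +/- `order` window: a simple mechanical criterion."""
    idx = []
    for t in range(order, len(p) - order):
        if p[t] == p[t - order:t + order + 1].max() and p[t] > p[t - 1]:
            idx.append(t)
    return np.array(idx)

def double_tops(p, order=5, tol=0.01):
    """Consecutive local maxima of (nearly) equal height: a mechanical 'double top' rule."""
    peaks = local_maxima(p, order)
    return [(a, b) for a, b in zip(peaks[:-1], peaks[1:])
            if abs(p[a] - p[b]) / p[a] < tol]

patterns = double_tops(price)

# Assess realized returns over a fixed horizon after each detected pattern.
horizon = 10
rets = [price[min(b + horizon, len(price) - 1)] / price[b] - 1 for _, b in patterns]
print(f"{len(patterns)} patterns; mean {horizon}-step return after detection:",
      round(float(np.mean(rets)), 4) if rets else "n/a")
```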
"Mathematical Optimization and Economic Analysis" is a self-contained introduction to various optimization techniques used in economic modeling and analysis such as geometric, linear, and convex programming and data envelopment analysis. Through a systematic approach, this book demonstrates the usefulness of these mathematical tools in quantitative and qualitative economic analysis. The book presents specific examples to demonstrate each technique's advantages and applicability as well as numerous applications of these techniques to industrial economics, regulatory economics, trade policy, economic sustainability, production planning, and environmental policy. Key Features include: - A detailed presentation of both single-objective and multiobjective optimization; - An in-depth exposition of various applied optimization problems; - Implementation of optimization tools to improve the accuracy of various economic models; - Extensive resources suggested for further reading. This book is intended for graduate and postgraduate students studying quantitative economics, as well as economics researchers and applied mathematicians. Requirements include a basic knowledge of calculus and linear algebra, and a familiarity with economic modeling.
An insightful and up-to-date study of the use of periodic models in the description and forecasting of economic data. Incorporating recent developments in the field, the authors investigate such areas as seasonal time series, periodic time series models, periodic integration, and periodic cointegration. The analysis benefits from the inclusion of many new empirical examples and results. Advanced Texts in Econometrics is a distinguished and rapidly expanding series in which leading econometricians assess recent developments in such areas as stochastic probability, panel and time series data analysis, modeling, and cointegration. Published in both hardback and affordable paperback, each volume explains the nature and applicability of a topic in greater depth than is possible in introductory textbooks or single journal articles. Each definitive work is formatted to be as accessible and convenient as possible for those who are not familiar with the detailed primary literature.
"In this book, Peter Bogetoft - THE expert on the theory and practice of benchmarking - provides an in-depth yet very accessible and readable explanation of the best way to do benchmarking, starting from the ground up." Rick Antle William S. Beinecke Professor of Accounting, Yale School of Management CFO, Compensation Valuation, Inc. "I highly recommend this well-written and comprehensive book on measuring and managing performance. Dr. Bogetoft summarizes the fundamental mathematical concepts in an elegant, intuitive, and understandable way." Jon A. Chilingerian Professor, Brandeis University and INSEAD "Bogetoft gives in his book Performance Benchmarking an excellent introduction to the methodological basis of benchmarking." Christian Parbol Director, DONG Energy "This book is the primer on benchmarking for performance management." Albert Birck Business Performance Manager, Maersk Oil "This excellent book provides a non technical introduction for performance management." Misja Mikkers, Director, Dutch Health Care Authority "With this very well written and comprehensive introduction to the many facets of benchmarking in hand, organizations have no excuse for not applying the best and cost effective benchmarking methods in their performance assessments." Stig P. Christensen Senior R&D Director, COWI
The theme of this book is health outcomes in India, in particular outcomes relating to its caste and religious groups and, within these groups, to their women and children. The book's tenor is analytical, based upon a rigorous examination of recent data from both government and non-government sources. The major areas covered are sanitation, mothers' use of the government's child development services, child malnutrition, deaths in families, gender discrimination, and the measurement of welfare.
You may like...
The Oxford Handbook of the Economics of… - Yann Bramoulle, Andrea Galeotti, … (Hardcover, R5,455)
The Oxford Handbook of Applied Bayesian… - Anthony O'Hagan, Mike West (Hardcover, R4,188)
The Handbook of Historical Economics - Alberto Bisin, Giovanni Federico (Paperback, R2,567)
Introduction to Computational Economics… - Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
Introductory Econometrics - A Modern… - Jeffrey Wooldridge (Hardcover)
Agent-Based Modeling and Network… - Akira Namatame, Shu-Heng Chen (Hardcover, R2,970)
Design and Analysis of Time Series… - Richard McCleary, David McDowall, … (Hardcover, R3,286)