These essays honor Professor Peter C.B. Phillips of Yale University and his many contributions to the field of econometrics. Professor Phillips's research spans many topics in econometrics, including non-stationary time series and panel models; partial identification and weak instruments; Bayesian model evaluation and prediction; financial econometrics; and finite-sample statistical methods and results. The papers in this volume reflect additions to and amplifications of many of Professor Phillips's research contributions. Some of the topics discussed in the volume include panel macro-econometric modeling, efficient estimation and inference in difference-in-difference models, limiting and empirical distributions of IV estimates when some of the instruments are endogenous, the use of stochastic dominance techniques to examine conditional wage distributions of incumbents and newly hired employees, long-horizon predictive tests in financial markets, new developments in information matrix testing, testing for co-integration in Markov switching error correction models, and deviation information criteria for comparing vector autoregressive models.
The volume contains articles that should appeal to readers with computational, modeling, theoretical, and applied interests. Methodological issues include parallel computation, Hamiltonian Monte Carlo, dynamic model selection, small sample comparison of structural models, Bayesian thresholding methods in hierarchical graphical models, adaptive reversible jump MCMC, LASSO estimators, parameter expansion algorithms, the implementation of parametric and nonparametric approaches to variable selection, a survey of key results in objective Bayesian model selection methodology, and a careful look at the modeling of endogeneity in discrete data settings. Important contemporary questions are examined in applications in macroeconomics, finance, banking, labor economics, industrial organization, and transportation, among others, in which model uncertainty is a central consideration.
Microsimulation modelling involves the application of simulation methods to micro data for the purposes of evaluating the effectiveness and improving the design of public policy. The field has existed for over 50 years, has been applied to many different policy areas, and is used within both government and academia. This handbook brings together leading authors in the field to describe and discuss the main current issues in microsimulation modelling. The handbook provides an overview of current developments across each of the sub-fields of microsimulation modelling, such as tax-benefit, pensions, spatial, health, labour, consumption, transport and land use policy, as well as macro-micro, environmental and demographic issues. It also focuses on the modelling of different micro units such as households, firms and farms. Each chapter discusses its sub-field under the following headings: the main methodologies of the sub-field; a survey of the literature in the area; a critique of that literature; and proposed future directions for research within the sub-field.
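Where microdata and a policy rule are available, the basic mechanics of the tax-benefit style of model the handbook surveys can be sketched in a few lines. The tax schedule, benefit amounts, and household records below are entirely hypothetical, chosen only to illustrate the simulate-then-aggregate pattern.

```python
# Minimal tax-benefit microsimulation sketch (hypothetical rules and data):
# apply a candidate tax/benefit rule to each micro unit, then aggregate.
from dataclasses import dataclass

@dataclass
class Household:
    income: float   # annual gross income
    children: int   # number of dependent children

def net_income(h: Household) -> float:
    """Hypothetical reform: 20% flat tax above a 10,000 allowance,
    plus a 1,500 per-child benefit."""
    tax = max(h.income - 10_000, 0.0) * 0.20
    benefit = 1_500.0 * h.children
    return h.income - tax + benefit

# Micro data would normally come from a survey; three toy records here.
sample = [Household(8_000, 2), Household(25_000, 1), Household(60_000, 0)]
totals = [net_income(h) for h in sample]
print("net incomes:", totals)
print("aggregate cost of reform:",
      sum(t - h.income for t, h in zip(totals, sample)))
```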
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on which model to apply in different contexts and how to implement it. Of particular appeal are the instructions on (i) how to write the code for different SFA models in STATA, (ii) how to write a VBA Macro for repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the code for Nonparametric Convex Frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
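As a companion to the volume's Excel Solver instructions, here is a minimal sketch of the input-oriented CCR DEA linear program, solved per production unit with scipy rather than the book's VBA/Excel approach. The input-output data are invented, and a real application would use multiple inputs and outputs.

```python
# Input-oriented CCR DEA: for each unit o, minimize theta subject to
# sum_j lambda_j * x_j <= theta * x_o  and  sum_j lambda_j * y_j >= y_o.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [4.0], [3.0]])   # inputs,  one row per unit
Y = np.array([[1.0], [2.0], [1.0]])   # outputs, one row per unit
n = X.shape[0]

for o in range(n):
    # decision variables: theta, lambda_1..lambda_n
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    A_in = np.c_[-X[o:o+1].T, X.T]                  # inputs:  sum λx - θx_o <= 0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # outputs: -sum λy <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1))
    print(f"unit {o}: efficiency = {res.fun:.3f}")
```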
Applied econometrics, known to aficionados as 'metrics, is the original data science. 'Metrics encompasses the statistical methods economists use to untangle cause and effect in human affairs. Through accessible discussion and with a dose of kung fu-themed humor, Mastering 'Metrics presents the essential tools of econometric research and demonstrates why econometrics is exciting and useful. The five most valuable econometric methods, or what the authors call the Furious Five--random assignment, regression, instrumental variables, regression discontinuity designs, and differences in differences--are illustrated through well-crafted real-world examples (vetted for awesomeness by Kung Fu Panda's Jade Palace). Does health insurance make you healthier? Randomized experiments provide answers. Are expensive private colleges and selective public high schools better than more pedestrian institutions? Regression analysis and a regression discontinuity design reveal the surprising truth. When private banks teeter, and depositors take their money and run, should central banks step in to save them? Differences-in-differences analysis of a Depression-era banking crisis offers a response. Could arresting O. J. Simpson have saved his ex-wife's life? Instrumental variables methods instruct law enforcement authorities in how best to respond to domestic abuse. Wielding econometric tools with skill and confidence, Mastering 'Metrics uses data and statistics to illuminate the path from cause to effect.
* Shows why econometrics is important
* Explains econometric research through humorous and accessible discussion
* Outlines empirical methods central to modern econometric practice
* Works through interesting and relevant real-world examples
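For readers wanting a feel for one of the Furious Five, the canonical two-by-two differences-in-differences comparison reduces to four group means. The numbers below are purely illustrative and are not the book's Depression-era banking data.

```python
# Canonical 2x2 differences-in-differences: compare the treated group's
# pre/post change against the control group's change over the same period.
y = {("treat", "pre"): 10.0, ("treat", "post"): 14.0,
     ("ctrl",  "pre"):  9.0, ("ctrl",  "post"): 11.0}

did = (y[("treat", "post")] - y[("treat", "pre")]) \
    - (y[("ctrl", "post")] - y[("ctrl", "pre")])
print("treatment effect estimate:", did)   # 4.0 - 2.0 = 2.0
```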
This book addresses the disparities that arise when measuring and modeling societal behavior and progress across the social sciences. It looks at why and how different disciplines and even researchers can use the same data and yet come to different conclusions about equality of opportunity, economic and social mobility, poverty and polarization, and conflict and segregation. Because societal behavior and progress exist only in the context of other key aspects, modeling becomes exponentially more complex as more of these aspects are factored into considerations. The content of this book transcends disciplinary boundaries, providing valuable information on measuring and modeling to economists, sociologists, and political scientists who are interested in data-based analysis of pressing social issues.
The estimation of the effects of treatments (endogenous variables representing everything from individual participation in a training program to national participation in a World Bank loan program) has occupied much of the theoretical and applied econometric research literature in recent years. This volume brings together a diverse collection of papers on this important topic by leaders in the field from around the world. Some of the papers offer new theoretical contributions on various estimation techniques and others provide timely empirical applications illustrating the benefits of these and other methods. All of the papers share two common themes. First, as different estimators estimate different treatment effect parameters, it is vital to know what you are estimating and to know to whom the estimate applies. Second, as different estimators require different identification assumptions, it is crucial to understand the assumptions underlying each estimator. In empirical applications, the researcher must also make the case that the assumptions hold based on the available data and the institutional context. The theoretical contributions range over a variety of different estimators drawn from both statistics and econometrics, including matching and other non-parametric methods, panel methods, instrumental variables, and methods based on hazard rate models and principal stratification, and they draw upon both the Bayesian and classical statistical traditions. The empirical contributions focus mainly on the evaluation of active labor market programs in Europe and the United States, but also examine the effect of parenthood on wages and of the number of children on child health.
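A small simulation makes the volume's two themes concrete: which parameter an estimator targets, and which assumptions it needs. The sketch below contrasts a naive mean difference with an inverse-probability-weighting estimator of the average treatment effect under unconfoundedness; the data-generating process is invented for illustration, and IPW is only one of the many estimators the volume covers.

```python
# Under unconfoundedness, weighting by the propensity score recovers the
# ATE; the naive treated-minus-control difference is biased whenever a
# confounder drives both treatment and outcome. Simulated data only.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
x = rng.normal(size=n)                      # observed confounder
p = 1 / (1 + np.exp(-x))                    # true propensity score
d = rng.binomial(1, p)                      # treatment assignment
y = 1.0 * d + 2.0 * x + rng.normal(size=n)  # true effect is 1.0

naive = y[d == 1].mean() - y[d == 0].mean()
ipw = np.mean(d * y / p) - np.mean((1 - d) * y / (1 - p))
print(f"naive difference: {naive:.2f}   IPW ATE estimate: {ipw:.2f}")
```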
The problem of disparities between different estimates of GDP is, according to this text, well-known and widely discussed. Here, the authors describe a method for examining the discrepancies using a technique that allocates them with reference to data reliability. The method enhances the reliability of the underlying data and leads to maximum-likelihood estimates. It is illustrated by application to the UK national accounts for the period 1920-1990. The book includes a full set of estimates for this period, including runs of industrial data for the period 1948-1990 which are longer than those available from any other source. The statistical technique allows estimates of standard errors of the data to be calculated and verified; these are presented both for data in levels and for changes in variables over one-, two- and five-year periods. A disk with the dataset in machine-readable form is available separately.
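The flavour of the balancing idea can be sketched briefly: treating each measure of GDP as a noisy reading of the same total, the maximum-likelihood estimate is the precision-weighted mean, and the implied adjustments fall most heavily on the least reliable series. The figures and standard errors below are invented, and the sketch ignores the accounting constraints and time-series structure the book handles in full.

```python
# Reconcile conflicting GDP estimates by allocating the discrepancy in
# proportion to each series' variance, the Gaussian ML solution.
import numpy as np

x = np.array([100.0, 103.0, 98.0])   # income, expenditure, output estimates
se = np.array([1.0, 2.0, 1.5])       # assumed standard errors (reliability)

w = 1 / se**2                        # precision weights
gdp_ml = np.sum(w * x) / np.sum(w)   # ML estimate of the single true total
adjust = gdp_ml - x                  # discrepancy absorbed by each series
print(f"balanced GDP: {gdp_ml:.2f}")
print("adjustments :", np.round(adjust, 2))
```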
In this compelling 1995 book, David Hendry and Mary Morgan bring together the classic papers of the pioneer econometricians. Together, these papers form the foundations of econometric thought. They are essential reading for anyone seeking to understand the aims, method and methodology of econometrics and the development of this statistical approach in economics. However, because they are technically straightforward, the book is also accessible to students and non-specialists. An editorial commentary places the readings in their historical context and indicates the continuing relevance of these early, yet highly sophisticated, works for current econometric analysis. While this book provides a companion volume to Mary Morgan's acclaimed The History of Econometric Ideas, the editors' commentary both adds to that earlier volume and provides a stand-alone, synthetic account of the development of econometrics.
For one-semester business statistics courses. A focus on using statistical methods to analyse and interpret results to make data-informed business decisions Statistics is essential for all business majors, and Business Statistics: A First Course helps students see the role statistics will play in their own careers by providing examples drawn from all functional areas of business. Guided by the principles set forth by major statistical and business science associations (ASA and DSI), plus the authors' diverse experiences, the 8th Edition, Global Edition, continues to innovate and improve the way this course is taught to all students. With new examples, case scenarios, and problems, the text continues its tradition of focusing on the interpretation of results, evaluation of assumptions, and discussion of next steps that lead to data-informed decision making. The authors feel that this approach, rather than a focus on manual calculations, better serves students in their future careers. This brief offering, created to fit the needs of a one-semester course, is part of the established Berenson/Levine series.
This two volume set is a comprehensive collection of historical and contemporary articles which highlight the theoretical foundations and the methods and models of long wave analysis. After examining the beginnings of long wave theory, the book includes discussions of time series methods and non-linear modelling, with an exploration of economic development in its historical context. It investigates the process of evolution and mutation in industrial capitalism over the last two hundred years. Contemporary reviews and critiques of long wave theory are also included. It makes available for the first time much important material that has hitherto been inaccessible. The book will be of immense value to all students and scholars interested in the history of economic thought, time series analysis and evolutionary or institutionalist analysis.
The combined efforts of physicists and economists in recent years in analyzing and modeling various dynamic phenomena in monetary and social systems have led to encouraging developments, generally classified under the title of Econophysics. These developments share a common ambition with the already established field of Quantitative Economics. This volume intends to offer the reader a glimpse of these two parallel initiatives by collecting review papers written by well-known experts in the respective research frontiers in one cover. This massive book presents a unique combination of research papers contributed almost equally by physicists and economists. Additional contributions from computer scientists and mathematicians are also included in this volume. It consists of two parts: the first part concentrates on the econophysics of games and social choices and is the proceedings of the Econophys-Kolkata IV workshop held at the Indian Statistical Institute and the Saha Institute of Nuclear Physics, both in Kolkata, during March 9-13, 2009. The second part consists of contributions to quantitative economics by experts in connection with the Platinum Jubilee celebration of the Indian Statistical Institute. In this connection a Foreword for the volume, written by Sankar K. Pal, Director of the Indian Statistical Institute, is put forth. Both parts specialize mostly in frontier problems in games and social choices. The first part of the book deals with several recent developments in econophysics. Game theory is integral to the formulation of modern economic analysis. Often games display a situation where the social optimum cannot be reached as a result of non-cooperation between different agents.
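The non-cooperation point is the standard prisoner's-dilemma observation, which a few lines of code can verify by enumeration. The payoff matrix below is the usual textbook example, not taken from the conference papers.

```python
# Enumerate all strategy profiles of a prisoner's dilemma and check which
# are Nash equilibria: no player gains by deviating unilaterally.
import itertools

payoff = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
actions = ("C", "D")

for a1, a2 in itertools.product(actions, repeat=2):
    u1, u2 = payoff[(a1, a2)]
    nash = (all(payoff[(b, a2)][0] <= u1 for b in actions)
            and all(payoff[(a1, b)][1] <= u2 for b in actions))
    print(f"{a1}{a2}: payoffs ({u1},{u2}), total {u1 + u2}, Nash: {nash}")
# Only DD is a Nash equilibrium (total 2), yet CC maximizes total payoff (6).
```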
This book brings together studies using different data types (panel data, cross-sectional data and time series data) and different methods (for example, panel regression, nonlinear time series, the chaos approach, deep learning, and machine learning techniques, among others), addressing selected applied econometrics topics developed in recent years and creating a source for those interested in these topics and methods. It offers a common meeting ground for the scientists engaged in econometrics education in Turkey and helps deliver the authors' knowledge to interested readers. The book can also serve as material for "Applied Economics and Econometrics" courses in postgraduate education.
This second volume of the late Julian Simon's articles and essays continues the theme of volume one in presenting unorthodox and controversial approaches to many fields in economics. The book features a wide range of papers divided into eight parts with a biographical introduction to the author's career and intellectual development as well as personal revelations about his background. Part One contains essays on statistics and probability which are developed in the second section on theoretical and applied econometrics. The third part considers individual behavior, including discussion of the effects of income on suicide rates and successive births, and foster care. Parts four and five present papers on population and migration, for which the author is best known. The sixth part contains Professor Simon's controversial discussion of natural resources and the articles in part seven relate to welfare analysis. In the final part some of the author's previously unpublished papers are presented, including discussions on duopoly and economists' thinking. Like the first volume this collection will be of interest to academics and students welcoming controversial and unorthodox approaches to a wide variety of theories and concepts in economics.
Standard methods for estimating empirical models in economics and many other fields rely on strong assumptions about functional forms and the distributions of unobserved random variables. Often, it is assumed that functions of interest are linear or that unobserved random variables are normally distributed. Such assumptions simplify estimation and statistical inference but are rarely justified by economic theory or other a priori considerations. Inference based on convenient but incorrect assumptions about functional forms and distributions can be highly misleading. Nonparametric and semiparametric statistical methods provide a way to reduce the strength of the assumptions required for estimation and inference, thereby reducing the opportunities for obtaining misleading results. These methods are applicable to a wide variety of estimation problems in empirical economics and other fields, and they are being used in applied research with increasing frequency. The literature on nonparametric and semiparametric estimation is large and highly technical. This book presents the main ideas underlying a variety of nonparametric and semiparametric methods. It is accessible to graduate students and applied researchers who are familiar with econometric and statistical theory at the level taught in graduate-level courses in leading universities. The book emphasizes ideas instead of technical details and provides as intuitive an exposition as possible. Empirical examples illustrate the methods that are presented. This book updates and greatly expands the author's previous book on semiparametric methods in econometrics. Nearly half of the material is new.
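To give a flavour of the approach, one of the simplest nonparametric estimators is Nadaraya-Watson kernel regression, which estimates E[y|x] by local averaging rather than by assuming a functional form. The data, kernel, and bandwidth below are illustrative choices, not the book's.

```python
# Nadaraya-Watson kernel regression: a weighted local average recovers a
# nonlinear regression function with no linearity assumption. Toy data.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 300)
y = np.sin(2 * x) + 0.3 * rng.normal(size=300)   # nonlinear truth

def nw(x0, x, y, h=0.2):
    """Gaussian-kernel local average around x0 with bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

grid = np.linspace(-2, 2, 9)
print([round(nw(g, x, y), 2) for g in grid])   # traces sin(2x) closely
```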
It is commonly believed that macroeconomic models are not useful for policy analysis because they do not take proper account of agents' expectations. Over the last decade, mainstream macroeconomic models in the UK and elsewhere have taken on board the 'Rational Expectations Revolution' by explicitly incorporating expectations of the future. In principle, one can perform the same technical exercises on a forward expectations model as on a conventional model, and more. Rational Expectations in Macroeconomic Models deals with the numerical methods necessary to carry out policy analysis and forecasting with these models. These methods are often passed on by word of mouth or confined to obscure journals. Rational Expectations in Macroeconomic Models brings them together with applications which are interesting in their own right. There is no comparable textbook in the literature. The specific subjects include: (i) solving for model-consistent expectations; (ii) the choice of terminal condition and time horizon; (iii) experimental design, i.e., the effect of temporary vs. permanent, anticipated vs. unanticipated shocks, and deterministic vs. stochastic, dynamic vs. static simulation; (iv) the role of the exchange rate; (v) optimal control and inflation-output tradeoffs. The models used are those of the Liverpool Research Group in Macroeconomics, the London Business School and the National Institute of Economic and Social Research.
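Items (i) and (ii) can be illustrated on a toy forward-looking model: a Cagan-style price equation under perfect foresight, solved by backward recursion from an assumed terminal condition. The model, parameters, and shock below are invented for illustration and are far simpler than the Liverpool, LBS, and National Institute models the book actually uses.

```python
# Solve p_t = a * p_{t+1} + m_t (perfect foresight) by iterating backward
# from a terminal steady-state condition at horizon T. Toy parameters.
a, T = 0.5, 40
m = [1.0] * 20 + [2.0] * (T + 1 - 20)   # anticipated permanent shock at t=20
p = [0.0] * (T + 1)
p[T] = m[T] / (1 - a)                   # terminal condition: steady state

for t in range(T - 1, -1, -1):          # backward recursion
    p[t] = a * p[t + 1] + m[t]

print([round(v, 2) for v in p[15:25]])  # prices move ahead of the shock
```

Note how the anticipated shock raises prices before it arrives, the hallmark of forward expectations that distinguishes these models from conventional backward-looking ones.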
First published in 1992, The Efficiency of New Issue Markets provides a comprehensive overview of under-pricing and, through this, assesses the efficiency of new issue markets. The book provides a further theoretical development of the adverse selection model of the new issue market and addresses the hypothesis that the method of distribution of new issues has an important bearing on the efficiency of these markets. In doing this, the book tests the efficiency of the Offer for Sale new issue market, demonstrating the validity of the adverse selection model and contradicting the monopsony power hypothesis. It then examines the relative efficiency of the new issue markets, demonstrating the importance of distribution in determining relative efficiency.
Suitable for statisticians, mathematicians, actuaries, and students interested in the problems of insurance and analysis of lifetimes, Statistical Methods with Applications to Demography and Life Insurance presents contemporary statistical techniques for analyzing life distributions and life insurance problems. It not only contains traditional material but also incorporates new problems and techniques not discussed in existing actuarial literature. The book mainly focuses on the analysis of an individual life and describes statistical methods based on empirical and related processes. Coverage ranges from analyzing the tails of distributions of lifetimes to modeling population dynamics with migrations. To help readers understand the technical points, the text covers topics such as the Stieltjes, Wiener, and Ito integrals. It also introduces other themes of interest in demography, including mixtures of distributions, analysis of longevity and extreme value theory, and the age structure of a population. In addition, the author discusses net premiums for various insurance policies. Mathematical statements are carefully and clearly formulated and proved while avoiding excessive technicalities as much as possible. The book illustrates how these statements help solve numerous statistical problems. It also includes more than 70 exercises.
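As a taste of the insurance material, the net single premium for a whole life insurance of 1 is the expected discounted benefit. The mortality rates below form a made-up toy table, not a published life table.

```python
# Net single premium for whole life insurance of 1:
# A = sum over k of v^(k+1) * P(death in year k+1). Toy mortality table.
q = [0.10, 0.15, 0.25, 0.40, 1.00]   # death probabilities q_{x+k}
v = 1 / 1.04                         # discount factor at 4% interest

A, survival = 0.0, 1.0
for k, qk in enumerate(q):
    A += v ** (k + 1) * survival * qk   # dies in year k+1, benefit paid then
    survival *= 1 - qk                  # probability of surviving year k+1

print(f"net single premium: {A:.4f}")
```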
This book presents estimates of the sources of economic growth in Canada. The experimental measures account for the reproducibility of capital inputs in an input-output framework and show that advances in technology are more important for economic growth than previously estimated. Traditional measures of multifactor productivity advance are also presented. Extensive comparisons relate the two approaches to each other and to labour productivity. The book will be of interest to macroeconomists studying economic growth, capital accumulation, technical advance, growth accounting, and input-output analysis.
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics & stochastics, international macroeconomics and financial econometrics. Financial Mathematics, Volatility and Covariance Modelling: Volume 2 provides a key repository on the current state of knowledge, the latest debates and recent literature on financial mathematics, volatility and covariance modelling. The first section is devoted to mathematical finance, stochastic modelling and control optimization. Chapters explore the recent financial crisis, the increase of uncertainty and volatility, and propose an alternative approach to deal with these issues. The second section covers financial volatility and covariance modelling and explores proposals for dealing with recent developments in financial econometrics. This book will be useful to students and researchers in applied econometrics and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
This book explores the possibility of using social media data for detecting socio-economic recovery activities. In the last decade, there have been intensive research activities focusing on social media during and after disasters. This approach, which views people's communication on social media as a sensor for real-time situations, has been widely adopted as the "people as sensors" approach. Furthermore, to improve recovery efforts after large-scale disasters, detecting communities' real-time recovery situations is essential, since conventional socio-economic recovery indicators, such as governmental statistics, are not published in real time. Thanks to its timeliness, using social media data can fill the gap. Motivated by this possibility, this book especially focuses on the relationships between people's communication on Twitter and Facebook pages, and socio-economic recovery activities as reflected in the used-car market data and the housing market data in the case of two major disasters: the Great East Japan Earthquake and Tsunami of 2011 and Hurricane Sandy in 2012. The book pursues an interdisciplinary approach, combining disaster recovery studies, crisis informatics, and economics, among other fields. In terms of its contributions, firstly, the book sheds light on the "people as sensors" approach for detecting socio-economic recovery activities, which has not been thoroughly studied to date but has the potential to improve situation awareness during the recovery phase. Secondly, the book proposes new socio-economic recovery indicators: used-car market data and housing market data. Thirdly, in the context of using social media during the recovery phase, the results demonstrate the importance of distinguishing between social media data posted both by people who are at or near disaster-stricken areas and by those who are farther away.
Prepares readers to analyze data and interpret statistical results using the increasingly popular R more quickly than other texts, through lessR extensions which remove the need to program. By introducing R through lessR, readers learn how to organize data for analysis, read the data into R, and produce output without performing numerous functions and programming first. Readers can select the necessary procedure and change the relevant variables without programming. Quick Starts introduce readers to the concepts and commands reviewed in the chapters. Margin notes define, illustrate, and cross-reference the key concepts. When readers encounter a term previously discussed, the margin notes identify the page number of the initial introduction. Scenarios highlight the use of a specific analysis followed by the corresponding R/lessR input and an interpretation of the resulting output. Numerous examples of output from psychology, business, education, and other social sciences demonstrate how to interpret results, and worked problems help readers test their understanding. The www.lessRstats.com website features the lessR program, the book's two data sets referenced in standard text and SPSS formats so readers can practice using R/lessR by working through the text examples and worked problems, PDF slides for each chapter, solutions to the book's worked problems, links to R/lessR videos to help readers better understand the program, and more. New to this edition:
o upgraded functionality and data visualizations of the lessR package, which is now aesthetically equal to the ggplot2 R standard
o new features to replace and extend previous content, such as aggregating data with pivot tables with a simple lessR function call
The contents of this volume comprise the proceedings of the International Symposia in Economic Theory and Econometrics conference held in 1987 at the IC² (Innovation, Creativity, and Capital) Institute at the University of Texas at Austin. The essays present fundamental new research on the analysis of complicated outcomes in relatively simple macroeconomic models. The book covers econometric modelling and time series analysis techniques in five parts. Part I focuses on sunspot equilibria, the study of uncertainty generated by nonstochastic economic models. Part II examines the more traditional examples of deterministic chaos: bubbles, instability, and hyperinflation. Part III contains the most current literature dealing with empirical tests for chaos and strange attractors. Part IV deals with chaos and informational complexity. Part V, Nonlinear Econometric Modelling, includes tests for and applications of nonlinearity.
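The canonical object behind Parts II and III is the logistic map, a deterministic recursion whose orbits look random and whose positive Lyapunov exponent is the usual empirical signature of chaos. The parameter choice below is the standard textbook one, not taken from the conference papers.

```python
# Iterate the logistic map x' = r*x*(1-x) and estimate its Lyapunov
# exponent as the average of log|f'(x)| along the orbit.
import math

r, x = 4.0, 0.2
n, lyap = 10_000, 0.0
for _ in range(n):
    lyap += math.log(abs(r * (1 - 2 * x)))   # log |f'(x)| at the current point
    x = r * x * (1 - x)                      # iterate the map

print(f"Lyapunov exponent ~ {lyap / n:.3f}")  # approx ln 2 > 0: chaotic
```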
Do economics and statistics succeed in explaining human social behaviour? To answer this question, Leland Gerson Neuberg studies some pioneering controlled social experiments. Starting in the late 1960s, economists and statisticians sought to improve social policy formation with random assignment experiments such as those that provided income guarantees in the form of a negative income tax. This book explores anomalies in the conceptual basis of such experiments and in the foundations of statistics and economics more generally. Scientific inquiry always faces certain philosophical problems. Controlled experiments of human social behaviour, however, cannot avoid some methodological difficulties not evident in physical science experiments. Drawing upon several examples, the author argues that methodological anomalies prevent microeconomics and statistics from explaining human social behaviour as coherently as the physical sciences explain nature. He concludes that controlled social experiments are a frequently overrated tool for social policy improvement.
This report is a partial result of the China Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has been focusing on China's economic forecasting and macroeconomic policy analysis, and it started to develop the CQMM for the purposes of short-term forecasting, policy analysis, and simulation in 2005. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019. The conference was jointly held by Xiamen University and the Economic Information Daily of Xinhua News Agency.
You may like...
Advanced Introduction to Spatial… by Daniel A. Griffith, Bin Li (Hardcover)
Patterns of Economic Change by State and… by Hannah Anderson Krog (Paperback)
Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover)
Operations and Supply Chain Management by James Evans, David Collier (Hardcover)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Kwantitatiewe statistiese tegnieke by Swanepoel Swanepoel, Vivier Vivier, … (Book)
Statistics for Business and Economics… by Paul Newbold, William Carlson, … (Paperback)