This is Volume 24 of the monograph series International Symposia in Economic Theory and Econometrics. ISETE publishes proceedings of conferences and symposia, as well as research monographs of the highest quality and importance. All articles published in these volumes are refereed to the standards of the best journals; as a result, not all papers presented at related symposia appear in these proceedings volumes. The topics are chosen for their particular research importance at the time of selection.
Using data from the World Values Survey, this book sheds light on the link between happiness and the social group to which one belongs. The work is based on a rigorous statistical analysis of differences in the probability of happiness and life satisfaction between the predominant social group and subordinate groups. The cases of India and South Africa receive deep attention in dedicated chapters on caste and race, with other chapters considering issues such as cultural bias, religion, patriarchy, and gender. An additional chapter offers a global perspective. Moreover, the longitudinal nature of the data facilitates an examination of how world happiness evolved between 1994 and 2014. This book will be a valuable reference for advanced students, scholars and policymakers involved in development economics, well-being, development geography, and sociology.
The 2008 credit crisis started with the failure of one large bank: Lehman Brothers. Since then, the focus of both politicians and regulators has been on stabilising the economy and preventing future financial instability. At this juncture, we are at the last stage of future-proofing the financial sector by raising capital requirements and tightening financial regulation. Now the policy agenda needs to concentrate on transforming the banking sector into an engine for growth. Reviving competition in the banking sector after the state interventions of the past years is a key step in this process. This book introduces and explains a relatively new concept in competition measurement: the performance-conduct-structure (PCS) indicator. The key idea behind this measure is that the stronger the competitive pressure, the more highly a firm's efficiency is rewarded in terms of market share and profit. The book begins by explaining the financial market's fundamental obstacles to competition, presenting a brief survey of the complex relationship between financial stability and competition. The theoretical contributions of Hay and Liu, and of Boone, provide the theoretical underpinning for the PCS indicator, while its application to banking and insurance illustrates its empirical qualities. Finally, this book presents a systematic comparison between the results of this approach and (all) existing methods as applied to 46 countries over the same sample period. The result is a comprehensive overview of the knowns and unknowns of financial sector competition for commercial and central bankers, policy-makers, supervisors and academics alike.
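As a rough, self-contained illustration of the idea behind such performance-based competition measures, the sketch below estimates a generic Boone-style elasticity of profit with respect to marginal cost on simulated data; it is an assumption-laden stand-in, not the book's PCS estimator.

```python
# Illustrative Boone-style competition measure (generic sketch, not the
# book's PCS indicator): regress log profit on log marginal cost. A more
# negative slope means efficiency is rewarded more strongly, i.e. the
# market is more competitive.
import numpy as np

rng = np.random.default_rng(0)
n = 200
log_mc = rng.normal(0.0, 0.3, n)                             # hypothetical log marginal costs
log_profit = 1.0 - 2.5 * log_mc + rng.normal(0.0, 0.2, n)    # simulated profits

X = np.column_stack([np.ones(n), log_mc])                    # add an intercept
beta, *_ = np.linalg.lstsq(X, log_profit, rcond=None)
print(f"Boone-style elasticity (more negative = more competition): {beta[1]:.2f}")
```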
This book aims to fill the gap between panel data econometrics textbooks and the latest developments in 'big data', especially large-dimensional panel data econometrics. It introduces important research questions in large panels, including testing for cross-sectional dependence, estimation of factor-augmented panel data models, structural breaks in panels, and group patterns in panels. To tackle these high-dimensional issues, some techniques used in machine learning approaches are also illustrated. Moreover, Monte Carlo experiments and empirical examples are used to show how to implement these new inference methods. Large-Dimensional Panel Data Econometrics: Testing, Estimation and Structural Changes also introduces new research questions and results from the recent literature in this field.
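For a flavour of the first topic listed above, the following minimal Python sketch computes Pesaran's CD statistic for cross-sectional dependence from a panel of residuals; the simulated input data are an assumption for illustration, not code from the book.

```python
# Minimal sketch of Pesaran's CD test for cross-sectional dependence:
# scale the sum of pairwise residual correlations across the N units.
import numpy as np

def pesaran_cd(resid):
    """resid: T x N array of per-unit regression residuals."""
    T, N = resid.shape
    rho = np.corrcoef(resid, rowvar=False)        # N x N correlation matrix
    upper = rho[np.triu_indices(N, k=1)]          # rho_ij for i < j
    return np.sqrt(2.0 * T / (N * (N - 1))) * upper.sum()  # approx N(0,1) under independence

rng = np.random.default_rng(1)
print(pesaran_cd(rng.normal(size=(100, 10))))     # near 0 for independent units
```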
Co-integration, equilibrium and equilibrium correction are key concepts in modern applications of econometrics to real world problems. This book provides direction and guidance to the now vast literature facing students and graduate economists. Econometric theory is linked to practical issues such as how to identify equilibrium relationships, how to deal with structural breaks associated with regime changes and what to do when variables are of different orders of integration.
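A minimal Python sketch of the equilibrium-correction idea, using the Engle-Granger two-step procedure on simulated data; this illustrates the concept only and is not material from the book.

```python
# Engle-Granger two-step sketch: (1) estimate the long-run relation by OLS,
# (2) regress the differenced series on the lagged equilibrium error.
import numpy as np

rng = np.random.default_rng(2)
T = 500
x = np.cumsum(rng.normal(size=T))                 # a random walk, I(1)
y = 0.5 * x + rng.normal(scale=0.5, size=T)       # cointegrated with x

# Step 1: cointegrating regression y_t = a + b*x_t + u_t
X = np.column_stack([np.ones(T), x])
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - a - b * x                                 # equilibrium error

# Step 2: equilibrium-correction regression dy_t = c + g*u_{t-1} + e_t
dy = np.diff(y)
Z = np.column_stack([np.ones(T - 1), u[:-1]])
c, g = np.linalg.lstsq(Z, dy, rcond=None)[0]
print(f"adjustment speed g (should be negative): {g:.2f}")
```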
This book addresses one of the most important research activities in empirical macroeconomics. It provides a course of advanced but intuitive methods and tools enabling the spatial and temporal disaggregation of basic macroeconomic variables and the assessment of the statistical uncertainty of the outcomes of disaggregation. The empirical analysis focuses mainly on GDP and its growth in the context of Poland. However, all of the methods discussed can be easily applied to other countries. The approach used in the book views spatial and temporal disaggregation as a special case of the estimation of missing observations (a topic in missing-data analysis). The book presents an econometric treatment of Seemingly Unrelated Regression Equations (SURE) models. The main advantage of the SURE specification in tackling this research problem is that it allows for heterogeneity in the parameters describing relations between macroeconomic indicators. The book contains model specification, as well as descriptions of stochastic assumptions and resulting procedures of estimation and testing. The method also addresses uncertainty in the estimates produced. All of the necessary tests and assumptions are presented in detail. The results are designed to serve as a source of invaluable information, making regional analyses more convenient and, more importantly, comparable. They will create a solid basis for conclusions and recommendations concerning regional economic policy in Poland, particularly regarding the assessment of the economic situation. This is essential reading for academics, researchers, and economists with regional analysis as their field of expertise, as well as central bankers and policymakers.
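The mechanics of the SURE specification can be sketched with a toy two-equation system estimated by feasible GLS; everything below (data, coefficients) is simulated for illustration and is not one of the book's models.

```python
# Two-equation SUR by feasible GLS: per-equation OLS, estimate the
# cross-equation error covariance, then stacked GLS.
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(3)
T = 200
x1, x2 = rng.normal(size=T), rng.normal(size=T)
e = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=T)
y1 = 1.0 + 2.0 * x1 + e[:, 0]
y2 = -0.5 + 1.5 * x2 + e[:, 1]

X1 = np.column_stack([np.ones(T), x1])
X2 = np.column_stack([np.ones(T), x2])
b1 = np.linalg.lstsq(X1, y1, rcond=None)[0]
b2 = np.linalg.lstsq(X2, y2, rcond=None)[0]
R = np.column_stack([y1 - X1 @ b1, y2 - X2 @ b2])
Sigma = R.T @ R / T                               # cross-equation error covariance

# Stacked GLS: y = X b + e with Cov(e) = Sigma kron I_T (fine for a toy T)
X = block_diag(X1, X2)
y = np.concatenate([y1, y2])
W = np.kron(np.linalg.inv(Sigma), np.eye(T))
b = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(b)                                          # [a1, b1, a2, b2]
```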
Market Analysis for Real Estate is a comprehensive introduction to how real estate markets work and the analytical tools and techniques that can be used to identify and interpret market signals. The markets for space and varied property assets, including residential, office, retail, and industrial, are presented, analyzed, and integrated into a complete understanding of the role of real estate markets within the workings of contemporary urban economies. Unlike other books on market analysis, the economic and financial theory in this book is rigorous and well integrated with the specifics of the real estate market. Furthermore, it is thoroughly explained as it assumes no previous coursework in economics or finance on the part of the reader. The theoretical discussion is backed up with numerous real estate case study examples and problems, which are presented throughout the text to assist both student and teacher. Including discussion questions, exercises, several web links, and online slides, this textbook is suitable for use on a variety of degree programs in real estate, finance, business, planning, and economics at undergraduate and MSc/MBA level. It is also a useful primer for professionals in these disciplines.
This open access book focuses on the concepts, tools and techniques needed to successfully model ever-changing time-series data. It emphasizes the need for general models to account for the complexities of the modern world and how these can be applied to a range of issues facing Earth, from modelling volcanic eruptions, carbon dioxide emissions and global temperatures, to modelling unemployment rates, wage inflation and population growth. Except where otherwise noted, this book is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0.
Emphasizing the impact of computer software and computational technology on econometric theory and development, this text presents recent advances in the application of computerized tools to econometric techniques and practices, focusing on current innovations in Monte Carlo simulation, computer-aided testing, model selection, and Bayesian methodology for improved econometric analyses.
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on the appropriate model to apply in different contexts and how to implement them. Of particular appeal are the instructions on (i) how to write the codes for different SFA models in Stata, (ii) how to write a VBA Macro for repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the codes for the Nonparametric Convex Frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
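By way of illustration, the sketch below solves the input-oriented, constant-returns (CCR) DEA linear program for each unit of a tiny made-up data set. The volume's own instructions use Stata and Excel Solver VBA, so this Python/scipy version is only a stand-in for the same mathematical programme.

```python
# Input-oriented CCR DEA for one unit: minimise theta such that a composite
# peer uses at most theta times unit j0's inputs while producing at least
# its outputs.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0],     # inputs: rows = input types, cols = units
              [3.0, 1.0, 4.0]])
Y = np.array([[3.0, 2.0, 5.0]])    # outputs: rows = output types

def dea_ccr_input(X, Y, j0):
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lambda_1..n]
    A_in = np.c_[-X[:, [j0]], X]                   # X @ lam <= theta * x0
    A_out = np.c_[np.zeros((s, 1)), -Y]            # Y @ lam >= y0
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(m), -Y[:, j0]]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    return res.x[0]                                # efficiency score in (0, 1]

for j in range(3):
    print(f"unit {j}: efficiency = {dea_ccr_input(X, Y, j):.3f}")
```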
The estimation of the effects of treatments (endogenous variables representing everything from individual participation in a training program to national participation in a World Bank loan program) has occupied much of the theoretical and applied econometric research literatures in recent years. This volume brings together a diverse collection of papers on this important topic by leaders in the field from around the world. Some of the papers offer new theoretical contributions on various estimation techniques and others provide timely empirical applications illustrating the benefits of these and other methods. All of the papers share two common themes. First, as different estimators estimate different treatment effect parameters, it is vital to know what you are estimating and to know to whom the estimate applies. Second, as different estimators require different identification assumptions, it is crucial to understand the assumptions underlying each estimator. In empirical applications, the researcher must also make the case that the assumptions hold based on the available data and the institutional context. The theoretical contributions range over a variety of different estimators drawn from both statistics and econometrics, including matching and other non-parametric methods, panel methods, instrumental variables, methods based on hazard rate models and principal stratification, and they draw upon both the Bayesian and classical statistical traditions. The empirical contributions focus mainly on the evaluation of active labor market programs in Europe and the United States, but also examine the effect of parenthood on wages and of the number of children on child health.
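As a concrete, if toy, example of the first theme (knowing which parameter an estimator targets), the sketch below computes a nearest-neighbour matching estimate of the average treatment effect on the treated (ATT) on simulated data. It is a generic illustration, not any contributor's method.

```python
# Nearest-neighbour matching on a confounder x: match each treated unit to
# its closest control, then average the outcome differences. This targets
# the ATT, not the population-average effect.
import numpy as np

rng = np.random.default_rng(4)
n = 1000
x = rng.normal(size=n)                            # confounder
d = (x + rng.normal(size=n) > 0).astype(int)      # selection into treatment
y = 2.0 * d + x + rng.normal(size=n)              # true ATT = 2

treated = np.where(d == 1)[0]
controls = np.where(d == 0)[0]
idx = np.abs(x[treated][:, None] - x[controls][None, :]).argmin(axis=1)
att = np.mean(y[treated] - y[controls[idx]])
print(f"matching ATT estimate: {att:.2f}")        # close to 2 if overlap holds
```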
This book addresses the disparities that arise when measuring and modeling societal behavior and progress across the social sciences. It looks at why and how different disciplines and even researchers can use the same data and yet come to different conclusions about equality of opportunity, economic and social mobility, poverty and polarization, and conflict and segregation. Because societal behavior and progress exist only in the context of other key aspects, modeling becomes exponentially more complex as more of these aspects are factored into considerations. The content of this book transcends disciplinary boundaries, providing valuable information on measuring and modeling to economists, sociologists, and political scientists who are interested in data-based analysis of pressing social issues.
Tools to improve decision making in an imperfect world. The publication has been developed and fine-tuned through a decade of classroom experience, and readers will find the author's approach very engaging and accessible. There are nearly 200 examples and exercises to help readers see how effective use of Bayesian statistics enables them to make optimal decisions. MATLAB and R computer programs are integrated throughout the book. An accompanying Web site provides readers with computer code for many examples and datasets.
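In the same spirit as the book's MATLAB and R programs, here is a minimal Python stand-in: a Beta-Binomial posterior update followed by a choice between two actions by posterior expected loss. The prior, data, threshold, and loss values are all made up for illustration.

```python
# Bayesian decision sketch: update a Beta prior with binomial data, then
# pick the action with the lowest posterior expected loss.
import numpy as np
from scipy import stats

a0, b0 = 1, 1                  # uniform Beta prior on a success probability p
successes, failures = 27, 13   # hypothetical data
post = stats.beta(a0 + successes, b0 + failures)

# Decision rule: "launch" is right when p >= 0.6; losses below are assumed.
p = post.rvs(size=100_000, random_state=0)
loss_launch = np.mean(np.where(p < 0.6, 10.0, 0.0))   # cost of launching when p < 0.6
loss_wait = np.mean(np.where(p >= 0.6, 4.0, 0.0))     # cost of waiting when p >= 0.6
print("launch" if loss_launch < loss_wait else "wait",
      f"(E[loss] launch={loss_launch:.2f}, wait={loss_wait:.2f})")
```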
Written to complement the second edition of best-selling textbook Introductory Econometrics for Finance, this book provides a comprehensive introduction to the use of the Regression Analysis of Time Series (RATS) software for modelling in finance and beyond. It provides numerous worked examples with carefully annotated code and detailed explanations of the outputs, giving readers the knowledge and confidence to use the software for their own research and to interpret their own results. A wide variety of important modelling approaches are covered, including such topics as time-series analysis and forecasting, volatility modelling, limited dependent variable and panel methods, switching models and simulation methods. The book is supported by an accompanying website containing freely downloadable data and RATS instructions.
This two volume set is a comprehensive collection of historical and contemporary articles which highlight the theoretical foundations and the methods and models of long wave analysis. After examining the beginnings of long wave theory, the book includes discussions of time series methods and non-linear modelling, with an exploration of economic development in its historical context. It investigates the process of evolution and mutation in industrial capitalism over the last two hundred years. Contemporary reviews and critiques of long wave theory are also included. It makes available for the first time much important material that has hitherto been inaccessible. The book will be of immense value to all students and scholars interested in the history of economic thought, time series analysis and evolutionary or institutionalist analysis.
The combined efforts of the Physicists and the Economists in recent years in analyzing and modeling various dynamic phenomena in monetary and social systems have led to encouraging developments, generally classified under the title of Econophysics. These developments share a common ambition with the already established field of Quantitative Economics. This volume intends to offer the reader a glimpse of these two parallel initiatives by collecting review papers written by well-known experts in the respective research frontiers in one cover. This massive book presents a unique combination of research papers contributed almost equally by Physicists and Economists. Additional contributions from Computer Scientists and Mathematicians are also included in this volume. It consists of two parts: The first part concentrates on econophysics of games and social choices and is the proceedings of the Econophys-Kolkata IV workshop held at the Indian Statistical Institute and the Saha Institute of Nuclear Physics, both in Kolkata, during March 9-13, 2009. The second part consists of contributions to quantitative economics by experts in connection with the Platinum Jubilee celebration of the Indian Statistical Institute. In this connection a Foreword for the volume, written by Sankar K. Pal, Director of the Indian Statistical Institute, is put forth. Both parts specialize mostly in frontier problems in games and social choices. The first part of the book deals with several recent developments in econophysics. Game theory is integral to the formulation of modern economic analysis. Often games display a situation where the social optimum could not be reached as a result of non-cooperation between different agents.
This second volume of the late Julian Simon's articles and essays continues the theme of volume one in presenting unorthodox and controversial approaches to many fields in economics. The book features a wide range of papers divided into eight parts with a biographical introduction to the author's career and intellectual development as well as personal revelations about his background. Part One contains essays on statistics and probability which are developed in the second section on theoretical and applied econometrics. The third part considers individual behavior, including discussion of the effects of income on suicide rates and successive births, and foster care. Parts four and five present papers on population and migration, for which the author is best known. The sixth part contains Professor Simon's controversial discussion of natural resources and the articles in part seven relate to welfare analysis. In the final part some of the author's previously unpublished papers are presented, including discussions on duopoly and economists' thinking. Like the first volume this collection will be of interest to academics and students welcoming controversial and unorthodox approaches to a wide variety of theories and concepts in economics.
Standard methods for estimating empirical models in economics and many other fields rely on strong assumptions about functional forms and the distributions of unobserved random variables. Often, it is assumed that functions of interest are linear or that unobserved random variables are normally distributed. Such assumptions simplify estimation and statistical inference but are rarely justified by economic theory or other a priori considerations. Inference based on convenient but incorrect assumptions about functional forms and distributions can be highly misleading. Nonparametric and semiparametric statistical methods provide a way to reduce the strength of the assumptions required for estimation and inference, thereby reducing the opportunities for obtaining misleading results. These methods are applicable to a wide variety of estimation problems in empirical economics and other fields, and they are being used in applied research with increasing frequency. The literature on nonparametric and semiparametric estimation is large and highly technical. This book presents the main ideas underlying a variety of nonparametric and semiparametric methods. It is accessible to graduate students and applied researchers who are familiar with econometric and statistical theory at the level taught in graduate-level courses in leading universities. The book emphasizes ideas instead of technical details and provides as intuitive an exposition as possible. Empirical examples illustrate the methods that are presented. This book updates and greatly expands the author's previous book on semiparametric methods in econometrics. Nearly half of the material is new.
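One of the simplest estimators in this family is the Nadaraya-Watson kernel regression, which estimates a conditional mean without assuming a functional form. The sketch below, on simulated data with an assumed bandwidth, is a generic illustration rather than code from the book.

```python
# Nadaraya-Watson kernel regression: a Gaussian-kernel weighted local
# average estimates E[y|x] with no parametric assumption on its shape.
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(-3, 3, 300)
y = np.sin(x) + rng.normal(scale=0.3, size=300)    # nonlinear truth

def nw_regression(x0, x, y, h=0.3):
    """Kernel-weighted average of y at evaluation points x0, bandwidth h."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(-3, 3, 7)
print(np.round(nw_regression(grid, x, y), 2))      # tracks sin(x) on the grid
```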
It is commonly believed that macroeconomic models are not useful for policy analysis because they do not take proper account of agents' expectations. Over the last decade, mainstream macroeconomic models in the UK and elsewhere have taken on board the 'Rational Expectations Revolution' by explicitly incorporating expectations of the future. In principle, one can perform the same technical exercises on a forward expectations model as on a conventional model -- and more! Rational Expectations in Macroeconomic Models deals with the numerical methods necessary to carry out policy analysis and forecasting with these models. These methods are often passed on by word of mouth or confined to obscure journals. Rational Expectations in Macroeconomic Models brings them together with applications which are interesting in their own right. There is no comparable textbook in the literature. The specific subjects include: (i) solving for model-consistent expectations; (ii) the choice of terminal condition and time horizon; (iii) experimental design, i.e., the effects of temporary vs. permanent, anticipated vs. unanticipated shocks, and of deterministic vs. stochastic, dynamic vs. static simulation; (iv) the role of the exchange rate; (v) optimal control and inflation-output tradeoffs. The models used are those of the Liverpool Research Group in Macroeconomics, the London Business School and the National Institute of Economic and Social Research.
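A toy Python sketch of items (i) and (ii): solving a scalar forward-looking equation under perfect foresight by backward recursion from an assumed terminal condition. The equation and parameter values are invented for illustration and are not one of the models used in the book.

```python
# Solve y_t = a*y_{t+1} + b*x_t under model-consistent (perfect-foresight)
# expectations: fix y at the horizon, then recurse backwards. Note how an
# anticipated shock moves y *before* the shock arrives.
import numpy as np

a, b, T = 0.9, 1.0, 40
x = np.zeros(T + 1)
x[5] = 1.0                     # one-period shock at t=5, anticipated at t=0
y_terminal = 0.0               # assumed terminal condition at the horizon

y = np.zeros(T + 2)
y[T + 1] = y_terminal
for t in range(T, -1, -1):     # backward recursion from the horizon
    y[t] = a * y[t + 1] + b * x[t]
print(np.round(y[:8], 3))      # response builds up ahead of the shock
```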
A Guide to Modern Econometrics, Fifth Edition has become established as a highly successful textbook. It serves as a guide to alternative techniques in econometrics with an emphasis on intuition and the practical implementation of these approaches. This fifth edition builds upon the success of its predecessors. The text has been carefully checked and updated, taking into account recent developments and insights. It includes new material on causal inference, the use and limitations of p-values, instrumental variables estimation and its implementation, regression discontinuity design, standardized coefficients, and the presentation of estimation results.
This ambitious book looks 'behind the model' to reveal how economists use formal models to generate insights into the economy. Drawing on recent work in the philosophy of science and economic methodology, the book presents a novel framework for understanding the logic of economic modeling. It also reveals the ways in which economic models can mislead rather than illuminate. Importantly, the book goes beyond purely negative critique, proposing a concrete program of methodological reform to better equip economists to detect potential mismatches between their models and the targets of their inquiry. Ranging across economics, philosophy, and social science methods, and drawing on a variety of examples, including the recent financial crisis, Behind the Model will be of interest to anyone who has wondered how economics works - and why it sometimes fails so spectacularly.
Financial crises often transmit across geographical borders and different asset classes. Modeling these interactions is empirically challenging, and many of the proposed methods give different results when applied to the same data sets. In this book the authors set out their work on a general framework for modeling the transmission of financial crises using latent factor models. They show how their framework encompasses a number of other empirical contagion models and why the results between the models differ. The book builds a framework which begins from considering contagion in the bond markets during 1997-1998 across a number of countries, and culminates in a model which encompasses multiple assets across multiple countries through over a decade of crisis events, from East Asia in 1997-1998 to the subprime crisis of 2008. Program code to support implementation of similar models is available.
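The latent-factor intuition can be sketched with principal components on simulated returns whose common-factor loadings rise in a "crisis" subsample. This is a generic illustration of the idea, not the authors' framework or their program code.

```python
# Extract a common factor by principal components and compare its loadings
# across calm and crisis subsamples; rising loadings mimic contagion.
import numpy as np

rng = np.random.default_rng(7)
T, N = 400, 6
f = rng.normal(size=T)                                   # latent common factor
lam = np.where(np.arange(T)[:, None] < 200, 0.3, 0.9)    # loadings rise in "crisis"
r = lam * f[:, None] + 0.5 * rng.normal(size=(T, N))     # simulated returns

def first_pc_loadings(block):
    """First principal-component loadings, scaled by the component's scale."""
    u, s, vt = np.linalg.svd(block - block.mean(0), full_matrices=False)
    return vt[0] * s[0] / np.sqrt(len(block))

print(np.round(np.abs(first_pc_loadings(r[:200])), 2))   # calm-period loadings
print(np.round(np.abs(first_pc_loadings(r[200:])), 2))   # larger in the crisis period
```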
First published in 1992, The Efficiency of New Issue Markets provides a comprehensive overview of under-pricing and, through this, assesses the efficiency of new issue markets. The book provides a further theoretical development of the adverse selection model of the new issue market and addresses the hypothesis that the method of distribution of new issues has an important bearing on the efficiency of these markets. In doing this, the book tests the efficiency of the Offer for Sale new issue market, which demonstrates the validity of the adverse selection model and contradicts the monopsony power hypothesis. It then examines the relative efficiency of the new issue markets, demonstrating the importance of distribution in determining relative efficiency.
This book explores the possibility of using social media data for detecting socio-economic recovery activities. In the last decade, there have been intensive research activities focusing on social media during and after disasters. This view of people's communication on social media as a sensor for real-time situations has been widely adopted as the "people as sensors" approach. Furthermore, to improve recovery efforts after large-scale disasters, detecting communities' real-time recovery situations is essential, since conventional socio-economic recovery indicators, such as governmental statistics, are not published in real time. Thanks to its timeliness, using social media data can fill the gap. Motivated by this possibility, this book especially focuses on the relationships between people's communication on Twitter and Facebook pages, and socio-economic recovery activities as reflected in the used-car market data and the housing market data in the case of two major disasters: the Great East Japan Earthquake and Tsunami of 2011 and Hurricane Sandy in 2012. The book pursues an interdisciplinary approach, combining e.g. disaster recovery studies, crisis informatics, and economics. In terms of its contributions, firstly, the book sheds light on the "people as sensors" approach for detecting socio-economic recovery activities, which has not been thoroughly studied to date but has the potential to improve situation awareness during the recovery phase. Secondly, the book proposes new socio-economic recovery indicators: used-car market data and housing market data. Thirdly, in the context of using social media during the recovery phase, the results demonstrate the importance of distinguishing between social media data posted both by people who are at or near disaster-stricken areas and by those who are farther away.
The volatility of financial returns changes over time and, for the last thirty years, Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models have provided the principal means of analyzing, modeling, and monitoring such changes. Taking into account that financial returns typically exhibit heavy tails (that is, extreme values can occur from time to time), Andrew Harvey's new book shows how a small but radical change in the way GARCH models are formulated leads to a resolution of many of the theoretical problems inherent in the statistical theory. The approach can also be applied to other aspects of volatility, such as those arising from data on the range of returns and the time between trades. Furthermore, the more general class of Dynamic Conditional Score models extends to robust modeling of outliers in the levels of time series and to the treatment of time-varying relationships. As such, there are applications not only to financial data but also to macroeconomic time series and to time series in other disciplines. The statistical theory draws on basic principles of maximum likelihood estimation and, by doing so, leads to an elegant and unified treatment of nonlinear time-series modeling. The practical value of the proposed models is illustrated by fitting them to real data sets.
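For reference, the benchmark that Harvey's score-driven approach modifies is the standard GARCH(1,1) variance recursion, sketched below on simulated heavy-tailed returns. This is an illustrative filter with assumed parameter values, not one of the book's Dynamic Conditional Score models.

```python
# GARCH(1,1) variance filter: sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}.
# The conditional variance responds to last period's squared return.
import numpy as np

def garch_variance(returns, omega=0.05, alpha=0.08, beta=0.9):
    """Filter conditional variances given returns and (assumed) parameters."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = returns.var()                     # initialise at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(6)
r = rng.standard_t(df=5, size=1000) * 0.01        # heavy-tailed toy returns
print(garch_variance(r)[:5])
```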
You may like...
Evolutionary and Deterministic Methods… - Esther Andres Perez, Leo M. Gonzalez, … (Hardcover)
Introduction to the Theory of Games… - Ferenc Forgo, Jeno Szep, … (Hardcover) - R4,442
International Conference on Artificial… - Garima Mathur, Harish Sharma, … (Hardcover) - R5,637
Game Theory and Its Applications - Akio Matsumoto, Ferenc Szidarovszky (Hardcover) - R3,762