The volume contains articles that should appeal to readers with computational, modeling, theoretical, and applied interests. Methodological issues include parallel computation, Hamiltonian Monte Carlo, dynamic model selection, small-sample comparison of structural models, Bayesian thresholding methods in hierarchical graphical models, adaptive reversible jump MCMC, LASSO estimators, parameter expansion algorithms, the implementation of parametric and nonparametric approaches to variable selection, a survey of key results in objective Bayesian model selection methodology, and a careful look at the modeling of endogeneity in discrete data settings. Important contemporary questions are examined in applications in macroeconomics, finance, banking, labor economics, industrial organization, and transportation, among others, in which model uncertainty is a central consideration.
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on which model to apply in different contexts and how to implement it. Of particular appeal are the instructions on (i) how to write the code for different SFA models in Stata, (ii) how to write a VBA macro for repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the code for nonparametric convex frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
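For readers curious about the mechanics, the following is a minimal sketch of the input-oriented CCR DEA model that such a macro would solve once per production unit, written here in Python with SciPy's linear programming routine rather than Excel Solver; the input-output data are hypothetical.

```python
# Minimal input-oriented CCR DEA sketch: for each unit o, minimise theta
# subject to a nonnegative combination of observed units using at most
# theta * o's inputs while producing at least o's outputs. Data are made up.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 3.0, 5.0],    # inputs: rows = inputs, cols = units
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 2.0, 1.5, 2.5]])   # outputs: rows = outputs, cols = units
m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(o):
    """Solve the envelopment LP for unit o; decision vector is [theta, lambdas]."""
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta
    A_in = np.c_[-X[:, [o]], X]                 # sum_j lam_j x_ij <= theta x_io
    A_out = np.c_[np.zeros((s, 1)), -Y]         # sum_j lam_j y_rj >= y_ro
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(n):
    print(f"unit {o}: efficiency = {ccr_efficiency(o):.3f}")
```

Efficient units score 1.0; a score below 1.0 is the proportion to which all inputs could be shrunk while staying feasible.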
Emphasizing the impact of computer software and computational technology on econometric theory and development, this text presents recent advances in the application of computerized tools to econometric techniques and practices, focusing on current innovations in Monte Carlo simulation, computer-aided testing, model selection, and Bayesian methodology for improved econometric analyses.
Tools to improve decision making in an imperfect world: the publication has been developed and fine-tuned through a decade of classroom experience, and readers will find the author's approach very engaging and accessible. There are nearly 200 examples and exercises to help readers see how effective use of Bayesian statistics enables them to make optimal decisions. MATLAB and R computer programs are integrated throughout the book. An accompanying website provides readers with computer code for many examples and datasets.
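As a flavour of the material (the book's own programs are in MATLAB and R), here is a minimal Python sketch of the kind of Bayesian decision problem described: compute a posterior and choose the action with the lowest posterior expected loss. The prior, data, and loss values are hypothetical.

```python
# Beta-binomial posterior plus a two-action decision under linear loss.
from scipy import stats

# Beta(2, 2) prior on a success probability; observe 7 successes in 10 trials.
prior_a, prior_b = 2, 2
successes, trials = 7, 10
posterior = stats.beta(prior_a + successes, prior_b + trials - successes)

# Action "accept" loses (1 - p); "reject" loses p. With linear losses the
# posterior mean is all we need to compute posterior expected loss.
p_mean = posterior.mean()
expected_loss = {"accept": 1 - p_mean, "reject": p_mean}
best = min(expected_loss, key=expected_loss.get)
print(expected_loss, "-> optimal action:", best)
```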
The 2008 credit crisis started with the failure of one large bank: Lehman Brothers. Since then the focus of both politicians and regulators has been on stabilising the economy and preventing future financial instability. At this juncture, we are at the last stage of future-proofing the financial sector by raising capital requirements and tightening financial regulation. Now the policy agenda needs to concentrate on transforming the banking sector into an engine for growth. Reviving competition in the banking sector after the state interventions of the past years is a key step in this process. This book introduces and explains a relatively new concept in competition measurement: the performance-conduct-structure (PCS) indicator. The key idea behind this measure is that the stronger the competitive pressure, the more highly a firm's efficiency is rewarded in terms of market share and profit. The book begins by explaining the financial market's fundamental obstacles to competition, presenting a brief survey of the complex relationship between financial stability and competition. The theoretical contributions of Hay and Liu and of Boone provide the underpinning for the PCS indicator, while its application to banking and insurance illustrates its empirical qualities. Finally, the book presents a systematic comparison between the results of this approach and (all) existing methods as applied to 46 countries over the same sample period. It offers a comprehensive overview of the knowns and unknowns of financial sector competition for commercial and central bankers, policy-makers, supervisors and academics alike.
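To illustrate the idea behind such efficiency-based competition measures, here is a minimal Python sketch in the spirit of the Boone approach cited above: regress log profit on log marginal cost, and read a more negative slope as stronger competitive pressure. The data are simulated and the specification is a simplification, not the book's full PCS indicator.

```python
# Boone-style competition measure on simulated bank data: the elasticity of
# profit with respect to marginal cost, estimated by OLS.
import numpy as np

rng = np.random.default_rng(0)
n = 200
log_mc = rng.normal(0.0, 0.3, n)                   # log marginal cost per bank
beta_true = -3.0                                   # assumed profit elasticity
log_profit = 1.0 + beta_true * log_mc + rng.normal(0.0, 0.5, n)

# OLS slope via least squares on [constant, log_mc]
Xmat = np.column_stack([np.ones(n), log_mc])
beta_hat = np.linalg.lstsq(Xmat, log_profit, rcond=None)[0]
print(f"estimated elasticity: {beta_hat[1]:.2f} (more negative = more competition)")
```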
This is the perfect (and essential) supplement for all econometrics classes, from a rigorous first undergraduate course, to a first master's, to a PhD course.
To fully function in today's global real estate industry, students and professionals increasingly need to understand how to implement essential and cutting-edge quantitative techniques. This book presents an easy-to-read guide to applying quantitative analysis in real estate aimed at non-cognate undergraduate and masters students, and meets the requirements of modern professional practice. Through case studies and examples illustrating applications using data sourced from dedicated real estate information providers and major firms in the industry, the book provides an introduction to the foundations underlying statistical data analysis, common data manipulations and understanding descriptive statistics, before gradually building up to more advanced quantitative analysis, modelling and forecasting of real estate markets. Our examples and case studies within the chapters have been specifically compiled for this book and explicitly designed to help the reader acquire a better understanding of the quantitative methods addressed in each chapter. Our objective is to equip readers with the skills needed to confidently carry out their own quantitative analysis and be able to interpret empirical results from academic work and practitioner studies in the field of real estate and in other asset classes. Both undergraduate and masters level students, as well as real estate analysts in the professions, will find this book to be essential reading.
This open access book focuses on the concepts, tools and techniques needed to successfully model ever-changing time-series data. It emphasizes the need for general models to account for the complexities of the modern world and how these can be applied to a range of issues facing Earth, from modelling volcanic eruptions, carbon dioxide emissions and global temperatures, to modelling unemployment rates, wage inflation and population growth. Except where otherwise noted, this book is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0.
This textbook articulates the elements of good craftsmanship in applied microeconomic research and demonstrates its effectiveness with multiple examples from economic literature. Empirical economic research is a combination of several elements: theory, econometric modelling, institutional analysis, data handling, estimation, inference, and interpretation. A large body of work demonstrates how to do many of these things correctly, but to date, there is no central resource available which articulates the essential principles involved and ties them together. In showing how these research elements can be best blended to maximize the credibility and impact of the findings that result, this book presents a basic framework for thinking about craftsmanship. This framework lays out the proper context within which the researcher should view the analysis, involving institutional factors, complementary policy instruments, and competing hypotheses that can influence or explain the phenomena being studied. It also emphasizes the interconnectedness of theory, econometric modeling, data, estimation, inference, and interpretation, arguing that good craftsmanship requires strong links between each. Once the framework has been set, the book devotes a chapter to each element of the analysis, providing robust instruction for each case. Assuming a working knowledge of econometrics, this text is aimed at graduate students and early-career academic researchers as well as empirical economists looking to improve their technique.
The combined efforts of physicists and economists in recent years in analyzing and modeling various dynamic phenomena in monetary and social systems have led to encouraging developments, generally classified under the title of Econophysics. These developments share a common ambition with the already established field of Quantitative Economics. This volume intends to offer the reader a glimpse of these two parallel initiatives by collecting review papers written by well-known experts in the respective research frontiers in one cover. This massive book presents a unique combination of research papers contributed almost equally by physicists and economists. Additional contributions from computer scientists and mathematicians are also included in this volume. It consists of two parts: the first part concentrates on the econophysics of games and social choices and is the proceedings of the Econophys-Kolkata IV workshop held at the Indian Statistical Institute and the Saha Institute of Nuclear Physics, both in Kolkata, during March 9-13, 2009. The second part consists of contributions to quantitative economics by experts in connection with the Platinum Jubilee celebration of the Indian Statistical Institute. In this connection a Foreword for the volume, written by Sankar K. Pal, Director of the Indian Statistical Institute, is put forth. Both parts specialize mostly in frontier problems in games and social choices. The first part of the book deals with several recent developments in econophysics. Game theory is integral to the formulation of modern economic analysis. Often games display a situation where the social optimum could not be reached as a result of non-cooperation between different agents.
Using data from the World Values Survey, this book sheds light on the link between happiness and the social group to which one belongs. The work is based on a rigorous statistical analysis of differences in the probability of happiness and life satisfaction between the predominant social group and subordinate groups. The cases of India and South Africa receive deep attention in dedicated chapters on caste and race, with other chapters considering issues such as cultural bias, religion, patriarchy, and gender. An additional chapter offers a global perspective. On top of this, the longitudinal nature of the data facilitates an examination of how world happiness has evolved between 1994 and 2014. This book will be a valuable reference for advanced students, scholars and policymakers involved in development economics, well-being, development geography, and sociology.
This two volume set is a comprehensive collection of historical and contemporary articles which highlight the theoretical foundations and the methods and models of long wave analysis. After examining the beginnings of long wave theory, the book includes discussions of time series methods and non-linear modelling, with an exploration of economic development in its historical context. It investigates the process of evolution and mutation in industrial capitalism over the last two hundred years. Contemporary reviews and critiques of long wave theory are also included. It makes available for the first time much important material that has hitherto been inaccessible. The book will be of immense value to all students and scholars interested in the history of economic thought, time series analysis and evolutionary or institutionalist analysis.
Structural vector autoregressive (VAR) models are important tools for empirical work in macroeconomics, finance, and related fields. This book not only reviews the many alternative structural VAR approaches discussed in the literature, but also highlights their pros and cons in practice. It provides guidance to empirical researchers on the most appropriate modeling choices and on methods of estimating and evaluating structural VAR models. The book traces the evolution of the structural VAR methodology and contrasts it with other common methodologies, including dynamic stochastic general equilibrium (DSGE) models. It is intended as a bridge between the often quite technical econometric literature on structural VAR modeling and the needs of empirical researchers. The focus is not on providing the most rigorous theoretical arguments, but on enhancing the reader's understanding of the methods in question and their assumptions. Empirical examples are provided for illustration.
It is commonly believed that macroeconomic models are not useful for policy analysis because they do not take proper account of agents' expectations. Over the last decade, mainstream macroeconomic models in the UK and elsewhere have taken on board the 'Rational Expectations Revolution' by explicitly incorporating expectations of the future. In principle, one can perform the same technical exercises on a forward expectations model as on a conventional model, and more. Rational Expectations in Macroeconomic Models deals with the numerical methods necessary to carry out policy analysis and forecasting with these models. These methods are often passed on by word of mouth or confined to obscure journals. Rational Expectations in Macroeconomic Models brings them together with applications which are interesting in their own right. There is no comparable textbook in the literature. The specific subjects include: (i) solving for model-consistent expectations; (ii) the choice of terminal condition and time horizon; (iii) experimental design, i.e. the effect of temporary vs. permanent shocks, anticipated vs. unanticipated shocks, deterministic vs. stochastic simulation, and dynamic vs. static simulation; (iv) the role of the exchange rate; (v) optimal control and inflation-output tradeoffs. The models used are those of the Liverpool Research Group in Macroeconomics, the London Business School and the National Institute of Economic and Social Research.
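As a taste of subjects (i) and (ii), the sketch below solves a hypothetical forward-looking model under perfect foresight by backward iteration from a terminal condition; the model x_t = a*x_{t+1} + b*z_t and its parameter values are illustrative, not taken from the book.

```python
# Backward iteration from a terminal condition for a forward-looking model.
# An anticipated shock at t = 5 moves x *before* the shock arrives, because
# the solution is consistent with agents' expectations of the future.
import numpy as np

a, b = 0.9, 1.0
T = 40                         # time horizon
z = np.zeros(T + 1)
z[5] = 1.0                     # anticipated temporary shock at t = 5

def solve(terminal):
    x = np.zeros(T + 1)
    x[T] = terminal            # the terminal condition pins down the path
    for t in range(T - 1, -1, -1):
        x[t] = a * x[t + 1] + b * z[t]
    return x

path = solve(terminal=0.0)
print(path[:8])                # x rises from t = 0 and peaks at the shock date
```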
This second volume of the late Julian Simon's articles and essays continues the theme of volume one in presenting unorthodox and controversial approaches to many fields in economics. The book features a wide range of papers divided into eight parts with a biographical introduction to the author's career and intellectual development as well as personal revelations about his background. Part One contains essays on statistics and probability which are developed in the second section on theoretical and applied econometrics. The third part considers individual behavior, including discussion of the effects of income on suicide rates and successive births, and foster care. Parts four and five present papers on population and migration, for which the author is best known. The sixth part contains Professor Simon's controversial discussion of natural resources and the articles in part seven relate to welfare analysis. In the final part some of the author's previously unpublished papers are presented, including discussions on duopoly and economists' thinking. Like the first volume this collection will be of interest to academics and students welcoming controversial and unorthodox approaches to a wide variety of theories and concepts in economics.
This book aims to fill the gap between panel data econometrics textbooks and the latest developments in 'big data', especially large-dimensional panel data econometrics. It introduces important research questions in large panels, including testing for cross-sectional dependence, estimation of factor-augmented panel data models, structural breaks in panels and group patterns in panels. To tackle these high-dimensional issues, some techniques used in machine learning approaches are also illustrated. Moreover, Monte Carlo experiments and empirical examples are used to show how to implement these new inference methods. Large-Dimensional Panel Data Econometrics: Testing, Estimation and Structural Changes also introduces new research questions and results from the recent literature in this field.
Various imperfections in existing market systems prevent the free market from serving as a truly efficient allocation mechanism, but optimization of economic activities provides an effective remedial measure. Cooperative optimization claims that socially optimal and individually rational solutions to decision problems involving strategic action over time exist. To ensure that cooperation will last throughout the agreement period, however, the stringent condition of subgame consistency is required. This textbook presents a study of subgame consistent economic optimization, developing game-theoretic optimization techniques to establish the foundation for an effective policy menu to tackle the suboptimal behavior that the conventional market mechanism fails to resolve.
Vector autoregressive (VAR) models are among the most widely used econometric tools in the fields of macroeconomics and financial economics. Much of what we know about the response of the economy to macroeconomic shocks and about how various shocks have contributed to the evolution of macroeconomic and financial aggregates is based on VAR models. VAR models also have been used successfully for economic and business forecasting, for modelling risk and volatility, and for the construction of forecast scenarios. Since the introduction of VAR models by C.A. Sims in 1980, the VAR methodology has continuously evolved. Even today important extensions and reinterpretations of the VAR framework are being developed. Examples include VAR models for mixed-frequency data, VAR models as approximations to DSGE models, factor-augmented VAR models, new tools for the identification of structural shocks in VAR models, panel VAR approaches, and time-varying parameter VAR models. This volume collects contributions from some of the leading VAR experts in the world on VAR methods and applications. Each chapter highlights and synthesizes a new development in this literature in a way that is accessible to practitioners, to graduate students, and to readers in other fields.
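For orientation, here is a minimal Python sketch of the reduced-form VAR workflow that underlies these extensions: fit a VAR(1) by ordinary least squares and iterate it forward to forecast. The two simulated series are hypothetical stand-ins for macroeconomic aggregates.

```python
# Simulate a bivariate VAR(1), estimate it by OLS, and forecast h steps ahead.
import numpy as np

rng = np.random.default_rng(1)
T = 200
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.4]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(0.0, 1.0, 2)

# Equation-by-equation OLS: regress y_t on y_{t-1} (intercept omitted for brevity)
Y, X = y[1:], y[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print("estimated coefficient matrix:\n", A_hat.round(2))

# Iterate the fitted model forward for an h-step-ahead point forecast
h, f = 4, y[-1]
for _ in range(h):
    f = A_hat @ f
print(f"{h}-step-ahead forecast:", f.round(2))
```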
Market Analysis for Real Estate is a comprehensive introduction to how real estate markets work and the analytical tools and techniques that can be used to identify and interpret market signals. The markets for space and varied property assets, including residential, office, retail, and industrial, are presented, analyzed, and integrated into a complete understanding of the role of real estate markets within the workings of contemporary urban economies. Unlike other books on market analysis, the economic and financial theory in this book is rigorous and well integrated with the specifics of the real estate market. Furthermore, everything is thoroughly explained, since the book assumes no previous coursework in economics or finance on the part of the reader. The theoretical discussion is backed up with numerous real estate case study examples and problems, which are presented throughout the text to assist both student and teacher. Including discussion questions, exercises, several web links, and online slides, this textbook is suitable for use on a variety of degree programs in real estate, finance, business, planning, and economics at undergraduate and MSc/MBA level. It is also a useful primer for professionals in these disciplines.
Standard methods for estimating empirical models in economics and many other fields rely on strong assumptions about functional forms and the distributions of unobserved random variables. Often, it is assumed that functions of interest are linear or that unobserved random variables are normally distributed. Such assumptions simplify estimation and statistical inference but are rarely justified by economic theory or other a priori considerations. Inference based on convenient but incorrect assumptions about functional forms and distributions can be highly misleading. Nonparametric and semiparametric statistical methods provide a way to reduce the strength of the assumptions required for estimation and inference, thereby reducing the opportunities for obtaining misleading results. These methods are applicable to a wide variety of estimation problems in empirical economics and other fields, and they are being used in applied research with increasing frequency. The literature on nonparametric and semiparametric estimation is large and highly technical. This book presents the main ideas underlying a variety of nonparametric and semiparametric methods. It is accessible to graduate students and applied researchers who are familiar with econometric and statistical theory at the level taught in graduate-level courses in leading universities. The book emphasizes ideas instead of technical details and provides as intuitive an exposition as possible. Empirical examples illustrate the methods that are presented. This book updates and greatly expands the author's previous book on semiparametric methods in econometrics. Nearly half of the material is new.
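As a concrete example of the methods in question, the sketch below implements the Nadaraya-Watson kernel regression estimator, one of the classic nonparametric tools, which fits a conditional mean without assuming a functional form; the data and bandwidth are illustrative.

```python
# Nadaraya-Watson kernel regression: a locally weighted average of y,
# with weights from a Gaussian kernel centred at the evaluation point.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 2 * np.pi, 300)
y = np.sin(x) + rng.normal(0.0, 0.3, 300)     # true regression is nonlinear

def nw_estimate(x0, x, y, h=0.3):
    """Kernel-weighted average with bandwidth h at evaluation point x0."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

grid = np.linspace(0.5, 5.5, 6)
fitted = [nw_estimate(x0, x, y) for x0 in grid]
print(np.round(fitted, 2))                    # tracks sin(x) with no model assumed
```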
Designed to promote students' understanding of econometrics and to build a more operational knowledge of economics through a meaningful combination of words, symbols and ideas. Each chapter commences the way economists begin new empirical projects, with a question and an economic model, then proceeds to develop a statistical model, select an estimator and outline inference procedures. Contains copious problems, experimental exercises and case studies.
This ambitious book looks 'behind the model' to reveal how economists use formal models to generate insights into the economy. Drawing on recent work in the philosophy of science and economic methodology, the book presents a novel framework for understanding the logic of economic modeling. It also reveals the ways in which economic models can mislead rather than illuminate. Importantly, the book goes beyond purely negative critique, proposing a concrete program of methodological reform to better equip economists to detect potential mismatches between their models and the targets of their inquiry. Ranging across economics, philosophy, and social science methods, and drawing on a variety of examples, including the recent financial crisis, Behind the Model will be of interest to anyone who has wondered how economics works - and why it sometimes fails so spectacularly.
This proceedings volume presents the latest scientific research and trends in experimental economics, with particular focus on neuroeconomics. Derived from the 2016 Computational Methods in Experimental Economics (CMEE) conference held in Szczecin, Poland, this book features research and analysis of novel computational methods in neuroeconomics. Neuroeconomics is an interdisciplinary field that combines neuroscience, psychology and economics to build a comprehensive theory of decision making. At its core, neuroeconomics analyzes the decision-making process not only in terms of external conditions or psychological aspects, but also from the neuronal point of view by examining the cerebral conditions of decision making. The application of IT enhances the possibilities of conducting such analyses. Such studies are now supported by software that allows all the participants to interact and registers their reactions more accurately. This book examines some of these applications and methods. Featuring contributions on both theory and application, this book is of interest to researchers, students, academics and professionals interested in experimental economics, neuroeconomics and behavioral economics.
This book addresses the disparities that arise when measuring and modeling societal behavior and progress across the social sciences. It looks at why and how different disciplines and even researchers can use the same data and yet come to different conclusions about equality of opportunity, economic and social mobility, poverty and polarization, and conflict and segregation. Because societal behavior and progress exist only in the context of other key aspects, modeling becomes exponentially more complex as more of these aspects are factored into considerations. The content of this book transcends disciplinary boundaries, providing valuable information on measuring and modeling to economists, sociologists, and political scientists who are interested in data-based analysis of pressing social issues.
Financial crises often transmit across geographical borders and different asset classes. Modeling these interactions is empirically challenging, and many of the proposed methods give different results when applied to the same data sets. In this book the authors set out their work on a general framework for modeling the transmission of financial crises using latent factor models. They show how their framework encompasses a number of other empirical contagion models and why the results between the models differ. The framework begins by considering contagion in the bond markets of a number of countries during 1997-1998 and culminates in a model which encompasses multiple assets across multiple countries through over a decade of crisis events, from East Asia in 1997-1998 to the subprime crisis of 2008. Program code to support implementation of similar models is available.
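The sketch below illustrates the latent factor idea in miniature: returns load on a common factor, and contagion appears as an extra loading on the crisis-country factor that switches on during the crisis window, raising cross-market correlation. It is an illustration only, not the authors' model; all numbers are simulated.

```python
# Latent factor contagion in miniature: a common world factor, a
# crisis-country factor, and a contagion loading active only in the crisis.
import numpy as np

rng = np.random.default_rng(3)
T, crisis = 500, slice(300, 400)            # crisis window within the sample

world = rng.normal(0.0, 1.0, T)             # latent common factor
local = rng.normal(0.0, 1.0, T)             # crisis-country factor

r_source = 0.8 * world + local + 0.3 * rng.normal(0.0, 1.0, T)
r_other = 0.8 * world + 0.3 * rng.normal(0.0, 1.0, T)
r_other[crisis] += 0.9 * local[crisis]      # the contagion channel switches on

# Rising cross-market correlation in the crisis window is the fingerprint:
calm = np.corrcoef(r_source[:300], r_other[:300])[0, 1]
crisis_corr = np.corrcoef(r_source[crisis], r_other[crisis])[0, 1]
print(f"correlation, calm: {calm:.2f}  crisis: {crisis_corr:.2f}")
```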
You may like...
Handbook of Experimental Game Theory, C. M. Capra, Rachel T. A. Croson, … (Hardcover), R7,224 (Discovery Miles 72 240)
Design and Analysis of Time Series…, Richard McCleary, David McDowall, … (Hardcover), R3,286 (Discovery Miles 32 860)
Introduction to Computational Economics…, Hans Fehr, Fabian Kindermann (Hardcover), R4,258 (Discovery Miles 42 580)
Introductory Econometrics - A Modern…, Jeffrey Wooldridge (Hardcover)
Pricing Decisions in the Euro Area - How…, Silvia Fabiani, Claire Loupias, … (Hardcover), R2,160 (Discovery Miles 21 600)
Agent-Based Modeling and Network…, Akira Namatame, Shu-Heng Chen (Hardcover), R2,970 (Discovery Miles 29 700)
Financial and Macroeconomic…, Francis X. Diebold, Kamil Yilmaz (Hardcover), R3,567 (Discovery Miles 35 670)