In the last 20 years, econometric theory on panel data has developed rapidly, particularly for analyzing common behaviors among individuals over time. Meanwhile, the statistical methods employed by applied researchers have not kept pace. This book attempts to fill that gap by teaching researchers how to use the latest panel estimation methods correctly. Almost all applied economics articles use panel data or panel regressions, yet many empirical results from typical panel data analyses are not correctly executed. This book aims to help applied researchers run panel regressions correctly and avoid common mistakes. It explains how to model cross-sectional dependence, how to estimate a few key common variables, and how to identify them. It also provides guidance on how to separate the long-run relationship, the common dynamics, and the idiosyncratic dynamics in a set of panel data. Aimed at applied researchers who want to learn panel data econometrics by running statistical software, the book provides clear guidance and is supported by a full range of online teaching and learning materials. It includes practice sections on MATLAB, Stata, and GAUSS throughout, along with short and simple treatments of the econometric theory of basic panel regressions for readers unfamiliar with traditional panel econometrics.
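To make the fixed-effects idea behind such panel regressions concrete, here is a minimal Python sketch of the within estimator on synthetic data (the book itself uses MATLAB, Stata, and GAUSS); the variable names, data-generating process, and seed are illustrative assumptions, not material from the book.

```python
# Minimal sketch of a within (fixed-effects) panel estimator on synthetic data.
# All variable names and the data-generating process are illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_units, n_periods = 50, 20
unit = np.repeat(np.arange(n_units), n_periods)
alpha = rng.normal(size=n_units)[unit]             # unit-specific intercepts
x = rng.normal(size=n_units * n_periods) + alpha   # regressor correlated with the effects
y = 2.0 * x + alpha + rng.normal(size=n_units * n_periods)

df = pd.DataFrame({"unit": unit, "x": x, "y": y})

# Within transformation: demean x and y inside each unit to sweep out alpha_i.
demeaned = df[["x", "y"]] - df.groupby("unit")[["x", "y"]].transform("mean")

# OLS on the demeaned data recovers the slope consistently.
beta_hat = np.linalg.lstsq(
    demeaned[["x"]].to_numpy(), demeaned["y"].to_numpy(), rcond=None
)[0][0]
print(f"within estimate of beta (true 2.0): {beta_hat:.3f}")
```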
This book explores how econometric modelling can be used to provide valuable insight into international housing markets. Initially describing the role of econometric modelling in real estate market research and how it has developed in recent years, the book goes on to compare and contrast the impact of various macroeconomic factors on developed and developing housing markets. Explaining the similarities and differences in the impact of financial crises on housing markets around the world, the author's econometric analysis of housing markets across the world provides a broad and nuanced perspective on the impact of both international financial markets and the local macroeconomy on housing markets. With discussion of countries such as China, Germany, the UK, the US, and South Africa, the lessons learned will be of interest to scholars of real estate economics around the world.
In light of better and more detailed administrative databases, this open access book provides statistical tools for evaluating the effects of public policies advocated by governments and public institutions. Experts from academia, national statistics offices, and various research centers present modern econometric methods for efficient data-driven policy evaluation and monitoring, assess the causal effects of policy measures, and report on best practices of successful data management and usage. Topics include data confidentiality, data linkage, and national practices in policy areas such as public health, education, and employment. The book offers scholars as well as practitioners from public administrations, consultancy firms, and nongovernmental organizations insights into counterfactual impact evaluation methods and the potential of data-based policy and program evaluation.
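As a toy illustration of the counterfactual reasoning such methods formalize, the following sketch computes a difference-in-differences estimate on simulated two-period data; the group structure and true effect size are assumptions for illustration only, not taken from the book.

```python
# Back-of-envelope difference-in-differences on simulated two-period data;
# group sizes and the true effect are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(11)
n = 1_000
treated_group = rng.random(n) < 0.5
effect = 0.7                                       # true policy effect

# Outcomes before and after; the treated group gains `effect` post-policy.
y_pre = 1.0 + 0.3 * treated_group + rng.normal(size=n)
y_post = 1.5 + 0.3 * treated_group + effect * treated_group + rng.normal(size=n)

# DiD: change in the treated group minus change in the control group.
did = ((y_post[treated_group].mean() - y_pre[treated_group].mean())
       - (y_post[~treated_group].mean() - y_pre[~treated_group].mean()))
print(f"difference-in-differences estimate (true {effect}): {did:.3f}")
```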
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text, presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max-weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality- and efficiency-driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. Since its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques that place "space" at the core of the analysis. This edited volume addresses the complex issue of Spatial Economics from an applied point of view. It is part of a larger project that includes a companion volume (Spatial Economics Volume I: Theory) collecting original papers which address Spatial Economics from a theoretical perspective.
The volume examines the state of the art in productivity and efficiency analysis, bringing together a selection of the best papers from the 10th North American Productivity Workshop. By analyzing worldwide perspectives on the challenges that local economies and institutions may face when changes in productivity are observed, readers can quickly assess the impact of productivity measurement, productivity growth, dynamics of productivity change, measures of labor productivity, measures of technical efficiency in different sectors, frontier analysis, measures of performance, industry instability, and spillover effects. The contributions focus on the theory and application of economics, econometrics, statistics, management science, and operational research to problems of productivity and efficiency measurement. Popular techniques and methodologies, including stochastic frontier analysis and data envelopment analysis, are represented. Chapters also cover broader issues related to measuring, understanding, incentivizing, and improving the productivity and performance of firms, public services, and industries.
Get up to speed on the application of machine learning approaches in macroeconomic research. This book brings together economics and data science. Author Tshepo Chris Nokeri begins by introducing you to covariance analysis, correlation analysis, cross-validation, hyperparameter optimization, regression analysis, and residual analysis, and presents an approach for dealing with multicollinearity. He then unpacks the additive time series model, and shows a technique for binarizing an economic feature so that classification analysis can be performed with logistic regression. He brings in the Hidden Markov Model, used to discover hidden patterns and growth in the world economy. The author demonstrates unsupervised machine learning techniques such as principal component analysis and cluster analysis, and explores key deep learning concepts and ways of structuring artificial neural networks, training them, and assessing their performance. The Monte Carlo simulation technique is applied to simulate the purchasing power of money in an economy. Lastly, the structural equation model (SEM) is used to integrate correlation analysis, factor analysis, multivariate analysis, causal analysis, and path analysis. After reading this book, you should be able to recognize the connection between econometrics and data science, apply machine learning approaches to modeling complex economic problems, and diagnose and enhance model performance while understanding the practical implications of machine learning in econometrics.

What You Will Learn:
- Examine complex, multivariate, linear-causal structures through path and structural analysis, including non-linearity and hidden states
- Become familiar with practical applications of machine learning and deep learning in econometrics
- Understand theoretical frameworks and hypothesis development, and techniques for selecting appropriate models
- Develop, test, validate, and improve key supervised (regression and classification) and unsupervised (dimension reduction and cluster analysis) machine learning models, alongside neural networks, Markov models, and SEM models
- Represent and interpret data and models

Who This Book Is For: Beginning and intermediate data scientists, economists, machine learning engineers, statisticians, and business executives.
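To give a flavor of the Monte Carlo exercise mentioned above, here is a minimal simulation of the purchasing power of money under uncertain inflation; the inflation distribution and horizon are illustrative assumptions, not the author's settings.

```python
# Toy Monte Carlo simulation of the purchasing power of money under uncertain
# inflation; the inflation distribution below is assumed for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_years = 10_000, 10
# Assume annual inflation drawn from Normal(mean 3%, sd 2%).
inflation = rng.normal(loc=0.03, scale=0.02, size=(n_paths, n_years))

# Purchasing power of 1 unit of currency after compounding inflation each year.
purchasing_power = np.prod(1.0 / (1.0 + inflation), axis=1)

print(f"mean purchasing power after {n_years} years: {purchasing_power.mean():.3f}")
print(f"5th-95th percentile: {np.percentile(purchasing_power, [5, 95])}")
```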
This book presents the principles and methods for the practical analysis and prediction of economic and financial time series. It covers decomposition methods, autocorrelation methods for univariate time series, volatility and duration modeling for financial time series, and multivariate time series methods, such as cointegration and recursive state space modeling. It also includes numerous practical examples to demonstrate the theory using real-world data, as well as exercises at the end of each chapter to aid understanding. This book serves as a reference text for researchers, students and practitioners interested in time series, and can also be used for university courses on econometrics or computational finance.
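To illustrate the kind of cointegration analysis such a text covers, here is a small sketch using the Engle-Granger test from statsmodels on simulated series that share a stochastic trend; the data-generating process is an assumption for demonstration, not an example from the book.

```python
# Sketch of an Engle-Granger cointegration check on simulated series; the
# random-walk data below stands in for real price or macro series.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)
n = 500
common_trend = np.cumsum(rng.normal(size=n))       # shared stochastic trend
y1 = common_trend + rng.normal(scale=0.5, size=n)
y2 = 0.8 * common_trend + rng.normal(scale=0.5, size=n)

t_stat, p_value, _ = coint(y1, y2)                 # Engle-Granger two-step test
print(f"cointegration t-stat: {t_stat:.2f}, p-value: {p_value:.4f}")
```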
This Handbook takes an econometric approach to the foundations of economic performance analysis. The focus is on the measurement of efficiency, productivity, growth, and performance: concepts that are commonly measured residually and are difficult to quantify in practice. In real-life applications, efficiency and productivity estimates are often quite sensitive to the models used in the performance assessment and the methodological approaches adopted by the analyst. The Palgrave Handbook of Performance Analysis discusses the two basic techniques of performance measurement, deterministic benchmarking and stochastic benchmarking, in detail, and addresses the statistical techniques that connect them. All chapters include applications and explore topics ranging from the output/input ratio to productivity indexes and national statistics.
This book systematically presents a prospective, integrated approach to complexity social science viewed through statistical physics and mathematics, with an impressive collection of the knowledge and expertise of leading researchers from all over the world. It mainly covers finitary methods for statistical equilibrium and data-driven analysis in econophysics. The late Professor Masanao Aoki of UCLA, who passed away at the end of July 2018, dedicated his later years to reconstructing macroeconomics mainly in terms of statistical physics. Professor Aoki, already an IEEE fellow, was also named an Econometric Society Fellow in 1979. Until the early 1990s, however, his contributions focused on the development of a novel algorithm for the time series model and its applications to economic data, work on a par with Granger's Nobel Prize-winning cointegration method. After the publication of his New Approaches to Macroeconomic Modeling and Modeling Aggregate Behavior and Fluctuations in Economics (Cambridge University Press, 1996 and 2002, respectively), his contributions rapidly became known and spread throughout the field. In short, these works challenged econophysicists to develop evolutionary stochastic dynamics, multiple equilibria, and externalities as field effects, and revolutionized the stochastic view of interacting agents. In particular, the publication of Reconstructing Macroeconomics (Cambridge University Press, 2007), written with Hiroshi Yoshikawa, further sharpened the project of embodying "a perspective from statistical physics and combinatorial stochastic processes" in economic modeling. Almost concurrently with Prof. Aoki's newest developments, similar approaches were appearing, and those working in the same context around the world came together, exchanging results over the past decade. In memory of Prof. Aoki, this book has been planned by authors who followed him, to present the most advanced outcomes of his legacy.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. Since its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques that place "space" at the core of the analysis. This edited volume addresses the complex issue of Spatial Economics from a theoretical point of view. It is part of a larger project that includes a companion volume (Spatial Economics Volume II: Applications) collecting original papers which address Spatial Economics from an applied perspective.
Many problems in statistics and econometrics lend themselves naturally to solution by optimization heuristics. The book opens with an overview of optimization in statistics and econometrics, followed by a detailed discussion of a relatively new and very powerful optimization heuristic, threshold accepting. The final part presents numerous applications of the methods described earlier, encompassing experimental design, model selection, aggregation of time series, and censored quantile regression models. Researchers and practitioners in econometrics, statistics, and operations research are given the tools to apply optimization heuristic methods in their work, and postgraduate students of statistics and econometrics will find the book a good introduction to optimization heuristic methods.
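Since threshold accepting is the centerpiece of the book, a minimal sketch of the heuristic on a toy minimization problem may help fix ideas; the objective function and tuning constants below are illustrative assumptions, not drawn from the text.

```python
# Minimal threshold accepting sketch on a toy continuous minimization problem.
import numpy as np

rng = np.random.default_rng(7)

def objective(x):
    # Toy multimodal objective with its global minimum near the origin.
    return np.sum(x**2) + 2.0 * np.sum(np.abs(np.sin(3.0 * x)))

def threshold_accepting(x0, n_rounds=20, n_steps=200, step=0.5):
    x, fx = x0.copy(), objective(x0)
    # Decreasing threshold sequence: unlike simulated annealing, acceptance is
    # deterministic: accept any move that worsens f by less than tau.
    thresholds = np.linspace(1.0, 0.0, n_rounds)
    for tau in thresholds:
        for _ in range(n_steps):
            candidate = x + rng.uniform(-step, step, size=x.shape)
            f_cand = objective(candidate)
            if f_cand - fx < tau:      # threshold acceptance rule
                x, fx = candidate, f_cand
    return x, fx

x_best, f_best = threshold_accepting(rng.uniform(-5, 5, size=3))
print(f"best objective found: {f_best:.4f} at {np.round(x_best, 3)}")
```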
This book surveys the state-of-the-art in efficiency and productivity analysis, examining advances in the analytical foundations and empirical applications. The analytical techniques developed in this book for efficiency provide alternative ways of defining optimum outcome sets, typically as a (technical) production frontier or as an (economic) cost, revenue or profit frontier, and alternative ways of measuring efficiency relative to an appropriate frontier. Simultaneously, the analytical techniques developed for efficiency analysis extend directly to productivity analysis, thereby providing alternative methods for estimating productivity levels, and productivity change through time or productivity variation across producers. This book includes chapters using data envelopment analysis (DEA) or stochastic frontier analysis (SFA) as quantitative techniques capable of measuring efficiency and productivity. Across the book's 15 chapters, it broadly extends into popular application areas including agriculture, banking and finance, and municipal performance, and relatively new application areas including corporate social responsibility, the value of intangible assets, land consolidation, and the measurement of economic well-being. The chapters also cover topics such as permutation tests for production frontier shifts, new indices of total factor productivity, and also randomized controlled trials and production frontiers.
This textbook for master programs in economics offers a comprehensive overview of microeconomics. It employs a carefully graded approach where basic game theory concepts are already explained within the simpler decision framework. The unavoidable mathematical content is supplied when needed, not in an appendix. The book covers a lot of ground, from decision theory to game theory, from bargaining to auction theory, from household theory to oligopoly theory, and from the theory of general equilibrium to regulation theory. Additionally, cooperative game theory is introduced. This textbook has been recommended and developed for university courses in Germany, Austria and Switzerland.
The book comprises three chapters, each devoted to a different type of data: time series, cross-sectional, and panel data. The purpose of the book is to explore the economic and social determinants of fertility. Unlike many previous empirical analyses of fertility and related demographic events, this research has three characteristics. First, the relationship between fertility and female labor participation is thoroughly considered, with extensive discussion of the structural change between those factors. Second, time series techniques such as the Bayesian vector autoregressive (BVAR) model and cointegration analysis are applied to explore the determinants of fertility. Third, the effectiveness of public policies intended to improve fertility is assessed. In recent years, micro-econometric analysis has become popular; this book instead takes a macro- or semi-macro-econometric perspective.
Gary Madden was a renaissance man with respect to the nexus between information and communications technology (ICT) and economics. He contributed to a variety of fields in ICT: applied econometrics, forecasting, internet governance, and policy. This series of essays, two of which were co-authored by Professor Madden prior to his untimely death, covers the range of his research interests. While the essays focus on a number of ICT issues, they are on the frontier of research in the sector. Gerard Faulhaber provides a broad overview of how we have reached the digital age and its implications. The applied econometric section brings the latest research in the area; for example, Lester Taylor illustrates how own-price, cross-price, and income elasticities can be calculated from survey data and translated into real income effects. The forecasting section ranges from forecasting online political participation to broadband's impact on economic growth. The final section covers aspects of governance and regulation of the ICT sector.
This second edition of Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. An observational study is an empirical investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science, and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is organized into five parts. Chapters 2, 3, and 5 of Part I cover concisely many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates, and includes an updated chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV is new to this edition; it discusses evidence factors and the computerized construction of more than one comparison group. Part V discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies: "make your theories elaborate." This new edition features updated exploration of causal inference, with four new chapters, a new R package, DOS2, designed as a companion for the book, and discussion of several of the latest matching packages for R. In particular, DOS2 allows readers to reproduce many analyses from Design of Observational Studies.
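To give a feel for the matching tools discussed in Part II, here is a rough Python sketch of one-to-one nearest-neighbour matching on an estimated propensity score; the simulated data and effect size are assumptions, and the book's own R-based workflow (DOS2) is not reproduced here.

```python
# Rough sketch of one-to-one nearest-neighbour matching on an estimated
# propensity score; data and effect size are simulated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2_000
x = rng.normal(size=(n, 2))                        # observed covariates
p_treat = 1 / (1 + np.exp(-(x[:, 0] + 0.5 * x[:, 1])))
treated = rng.random(n) < p_treat                  # treatment depends on covariates
y = x[:, 0] + 1.0 * treated + rng.normal(size=n)   # true treatment effect = 1.0

# Step 1: estimate propensity scores from the covariates.
ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

# Step 2: match each treated unit to the control with the closest score.
t_idx, c_idx = np.where(treated)[0], np.where(~treated)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

att = np.mean(y[t_idx] - y[matches])
print(f"matched estimate of the treatment effect (true 1.0): {att:.3f}")
```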
"The adoption of stable modeling in finance and econometrics is undoubtedly one of the most interesting and promising ideas which has arisen in these fields. It is now widely accepted that classical models for the description of the dynamics of financial and economic variable suffer form major structural weaknesses, as they fail to explain important features of the empirical data. Therefore, the search for new more powerful models is a fundamental and fascinating topic of research. In this book, Rachev and Mittnik, two of the most prominent experts in so-called Stable Finance, present a wealth of convincing arguments to support the claim that stable models offer the right approach to the subject. Their monograph, which collects a large part of the authors' work in sable financial modeling, brings together innovative insights as well as new elegant explanations financial and economic phenomena..."
This book presents the state of the art in extreme value theory, with a collection of articles related to a seminal paper on the bivariate extreme value distribution written by Professor Masaaki Sibuya in 1960, demonstrating various developments of the original idea over the last half-century. Written by active researchers, the unique combination of articles allows readers to gain a sense of the excellence of the field, ranging from theory to practice, and the tradition of theoretical developments motivated by practically important issues such as tsunamis and financial crises. The contributions discuss a range of topics, including the parameter estimation of the generalized beta distribution, resampling with the empirical beta copula, and regression analysis on imbalanced binary data, as well as the semiparametric estimation of the upper bound of extrema, the long-term analysis of extreme precipitation over Japanese river basins, and various rules of thumb in hydrology.
This report is a partial result of the China Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has been focusing on China's economic forecasts and macroeconomic policy analysis, and it started to develop the CQMM in 2005 for the purposes of short-term forecasting, policy analysis, and simulation. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019, a conference jointly held by Xiamen University and the Economic Information Daily of Xinhua News Agency.
In nonparametric and high-dimensional statistical models, the classical Gauss-Fisher-Le Cam theory of the optimality of maximum likelihood estimators and Bayesian posterior inference does not apply, and new foundations and ideas have been developed in the past several decades. This book gives a coherent account of the statistical theory in infinite-dimensional parameter spaces. The mathematical foundations include self-contained 'mini-courses' on the theory of Gaussian and empirical processes, approximation and wavelet theory, and the basic theory of function spaces. The theory of statistical inference in such models - hypothesis testing, estimation and confidence sets - is presented within the minimax paradigm of decision theory. This includes the basic theory of convolution kernel and projection estimation, but also Bayesian nonparametrics and nonparametric maximum likelihood estimation. In a final chapter the theory of adaptive inference in nonparametric models is developed, including Lepski's method, wavelet thresholding, and adaptive inference for self-similar functions. Winner of the 2017 PROSE Award for Mathematics.
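As a small taste of the kernel estimation theory the book formalizes, here is a Nadaraya-Watson kernel regression sketch in plain NumPy; the bandwidth and test function are arbitrary illustrative choices, not examples from the text.

```python
# Minimal Nadaraya-Watson kernel regression sketch in plain NumPy.
import numpy as np

rng = np.random.default_rng(5)
n = 300
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=n)

def nw_estimate(x_eval, x_obs, y_obs, h=0.05):
    # Gaussian kernel weights; h is the smoothing bandwidth.
    w = np.exp(-0.5 * ((x_eval[:, None] - x_obs[None, :]) / h) ** 2)
    return (w * y_obs).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(0, 1, 50)
fitted = nw_estimate(grid, x, y)
print(f"max abs error vs true curve: {np.abs(fitted - np.sin(2*np.pi*grid)).max():.3f}")
```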
Computable general equilibrium (CGE) models play an important role in supporting public-policy making on such issues as trade, climate change and taxation. This significantly revised volume, keeping pace with the next-generation standard CGE model, is the only undergraduate-level introduction of its kind. The volume utilizes a graphical approach to explain the economic theory underlying a CGE model, and provides results from simple, small-scale CGE models to illustrate the links between theory and model outcomes. Its eleven hands-on exercises introduce modelling techniques that are applied to real-world economic problems. Students learn how to integrate their separate fields of economic study into a comprehensive, general equilibrium perspective as they develop their skills as producers or consumers of CGE-based analysis.
This proceedings volume presents new methods and applications in applied economics, with special interest in advanced cross-section data estimation methodology. Featuring select contributions from the 2019 International Conference on Applied Economics (ICOAE 2019), held in Milan, Italy, the book covers areas such as applied macroeconomics, applied microeconomics, applied financial economics, applied international economics, applied agricultural economics, applied marketing, and applied managerial economics. ICOAE is an annual conference, started in 2008, designed to bring together economists from different fields of applied economic research in order to share methods and ideas. Applied economics is a rapidly growing field that combines economic theory with econometrics to analyze economic problems of the real world, usually with economic policy interest, and there is growing interest within the field in cross-section data estimation methods, tests, and techniques. Featuring country-specific studies, this volume will interest academics, students, researchers, practitioners, and policy makers in applied economics, econometrics, and economic policy.
This Brief discusses impacts of the COVID-19 pandemic on the Portuguese tourism sector. Taking into account real-world conditions and the importance of the tourism sector for the Portuguese economy, this book highlights the economic contexts of tourism in Portugal at the regional and municipal levels, discussing pre-pandemic economic frameworks and projecting potential implications for the future. Using data provided by Statistics Portugal, the Brief performs econometric analysis on three cases: new paradigms for overnight stays and guests, changes in tourism revenues and prospective alternatives, and a comparison of effects on changes in number of guests and overnight stays at the regional level. Providing cutting edge analysis of a dynamic global situation, this Brief will be useful for researchers interested in tourism economics and European economics as well as policymakers and industry professionals.
This book surveys big data tools used in macroeconomic forecasting and addresses related econometric issues, including how to capture dynamic relationships among variables; how to select parsimonious models; how to deal with model uncertainty, instability, non-stationarity, and mixed frequency data; and how to evaluate forecasts, among others. Each chapter is self-contained with references, and provides solid background information, while also reviewing the latest advances in the field. Accordingly, the book offers a valuable resource for researchers, professional forecasters, and students of quantitative economics.
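To illustrate the forecast-evaluation theme, here is a sketch of a rolling-origin comparison of two placeholder forecasting rules on a simulated AR(1) series; both the data-generating process and the competing "models" are assumptions for demonstration.

```python
# Rolling-origin (expanding-window) one-step forecast comparison on simulated
# AR(1) data; the random walk and historical mean stand in for real models.
import numpy as np

rng = np.random.default_rng(21)
n = 400
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):                  # AR(1) data-generating process
    y[t] = 0.6 * y[t - 1] + rng.normal()

errors_rw, errors_mean = [], []
for origin in range(200, n - 1):       # expanding-window one-step forecasts
    train = y[: origin + 1]
    errors_rw.append(y[origin + 1] - train[-1])        # random-walk forecast
    errors_mean.append(y[origin + 1] - train.mean())   # historical-mean forecast

rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
print(f"RMSE random walk: {rmse(errors_rw):.3f}, historical mean: {rmse(errors_mean):.3f}")
```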
You may like...
Financial and Macroeconomic… | Francis X. Diebold, Kamil Yilmaz | Hardcover | R3,567 (Discovery Miles 35 670)
The Oxford Handbook of the Economics of… | Yann Bramoulle, Andrea Galeotti, … | Hardcover | R5,455 (Discovery Miles 54 550)
Applied Econometric Analysis - Emerging… | Brian W Sloboda, Yaya Sissoko | Hardcover | R5,351 (Discovery Miles 53 510)
Spatial Analysis Using Big Data… | Yoshiki Yamagata, Hajime Seya | Paperback | R3,021 (Discovery Miles 30 210)
Handbook of Experimental Game Theory | C. M. Capra, Rachel T. A. Croson, … | Hardcover | R7,224 (Discovery Miles 72 240)
Design and Analysis of Time Series… | Richard McCleary, David McDowall, … | Hardcover | R3,286 (Discovery Miles 32 860)
Introduction to Computational Economics… | Hans Fehr, Fabian Kindermann | Hardcover | R4,258 (Discovery Miles 42 580)