Gary Madden was a renaissance man with respect to the nexus between information and communications technology (ICT) and economics. He contributed to a variety of fields in ICT: applied econometrics, forecasting, internet governance and policy. This series of essays, two of which were co-authored by Professor Madden prior to his untimely death, covers the range of his research interests. The essays address a number of ICT issues at the frontier of research in the sector. Gerard Faulhaber provides a broad overview of how we have reached the digital age and its implications. The applied econometric section brings the latest research in the area; for example, Lester Taylor illustrates how own-price, cross-price and income elasticities can be calculated from survey data and translated into real income effects. The forecasting section ranges from forecasting online political participation to broadband's impact on economic growth. The final section covers aspects of governance and regulation of the ICT sector.
The Regression Discontinuity (RD) design is one of the most popular and credible research designs for program evaluation and causal inference. This volume 38 of Advances in Econometrics collects twelve innovative and thought-provoking contributions to the RD literature, covering a wide range of methodological and practical topics. Some chapters touch on foundational methodological issues such as identification, interpretation, implementation, falsification testing, estimation and inference, while others focus on more recent and related topics such as identification and interpretation in a discontinuity-in-density framework, empirical structural estimation, comparative RD methods, and extrapolation. These chapters not only give new insights for current methodological and empirical research, but also provide new bases and frameworks for future work in this area. This volume contributes to the rapidly expanding RD literature by bringing together theoretical and applied econometricians, statisticians, and social, behavioural and biomedical scientists, in the hope that these interactions will further spark innovative practical developments in this important and active research area.
In this book, different quantitative approaches to the study of electoral systems have been developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization ones. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool to detect inconsistencies or poor performance in actual systems. Applications to concrete settings such as EU, American Congress, regional, and committee voting are discussed.
As one of the first texts to take a behavioral approach to macroeconomic expectations, this book introduces a new way of doing economics. Roetheli uses cognitive psychology in a bottom-up method of modeling macroeconomic expectations. His research is based on laboratory experiments and historical data, which he extends to real-world situations. Pattern extrapolation is shown to be the key to understanding expectations of inflation and income. The quantitative model of expectations is used to analyze the course of inflation and nominal interest rates in a range of countries and historical periods. The model of expected income is applied to the analysis of business cycle phenomena such as the great recession in the United States. Data and spreadsheets are provided for readers to do their own computations of macroeconomic expectations. This book offers new perspectives in many areas of macro and financial economics.
This proceedings volume presents new methods and applications in applied economics, with special interest in advanced cross-section data estimation methodology. Featuring select contributions from the 2019 International Conference on Applied Economics (ICOAE 2019), held in Milan, Italy, this book explores areas such as applied macroeconomics, applied microeconomics, applied financial economics, applied international economics, applied agricultural economics, applied marketing and applied managerial economics. The International Conference on Applied Economics (ICOAE) is an annual conference, started in 2008, designed to bring together economists from different fields of applied economic research in order to share methods and ideas. Applied economics is a rapidly growing field that combines economic theory with econometrics to analyze real-world economic problems, usually with economic policy interest. In addition, there is growing interest in the field in cross-section data estimation methods, tests and techniques. This volume contributes to applied economic research by presenting the most current work in the area. With its country-specific studies, this book is of interest to academics, students, researchers, practitioners, and policy makers in applied economics, econometrics and economic policy.
Virtually any random process developing chronologically can be viewed as a time series. In economics, closing prices of stocks, the cost of money, the jobless rate, and retail sales are just a few examples of many. Developed from course notes and extensively classroom-tested, Applied Time Series Analysis with R, Second Edition includes examples across a variety of fields, develops theory, and provides an R-based software package to aid in addressing time series problems in a broad spectrum of fields. The material is organized in an optimal format for graduate students in statistics as well as in the natural and social sciences to learn to use and understand the tools of applied time series analysis. Features:
- Gives readers the ability to actually solve significant real-world problems
- Addresses many types of nonstationary time series and cutting-edge methodologies
- Promotes understanding of the data and associated models rather than viewing them as the output of a "black box"
- Provides the R package tswge, available on CRAN, which contains functions and over 100 real and simulated data sets to accompany the book; extensive help regarding the use of tswge functions is provided in appendices and on an associated website
- Over 150 exercises and extensive support for instructors
The second edition includes additional real-data examples and uses R-based code that helps students easily analyze data, generate realizations from models, and explore the associated characteristics. It also adds discussion of new advances in the analysis of long-memory data and data with time-varying frequencies (TVF).
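The book's software is R-based (tswge); as a language-neutral illustration of the simplest model in the ARIMA family it covers, here is a minimal Python sketch, not taken from the book: it simulates an AR(1) process and recovers the autoregressive coefficient by least squares.

```python
import numpy as np

rng = np.random.default_rng(42)
phi, n = 0.7, 5000

# simulate an AR(1) process: y_t = phi * y_{t-1} + e_t
y = np.zeros(n)
e = rng.normal(0.0, 1.0, n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t]

# OLS estimate of phi from regressing y_t on y_{t-1}
phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
print(round(phi_hat, 2))  # close to the true value 0.7
```

With a series of this length the estimate lands within about a hundredth of the true coefficient; tswge's R functions perform the analogous estimation (and much more) for general ARMA models.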
This selection of Professor Dhrymes's major papers combines important contributions to econometric theory with a series of well-thought-out, skilfully-executed empirical studies. The theoretical papers focus on such issues as the general linear model, simultaneous equations models, distributed lags and ancillary topics. Most of these papers originated with problems encountered in empirical research. The applied studies deal with production function and productivity topics, demand for labour, arbitrage pricing theory, demand for housing and related issues. Featuring careful exposition of key techniques combined with relevant theory and illustrations of possible applications, this book will be welcomed by academic and professional economists concerned with the use of econometric techniques and their underlying theory.
This volume, edited by Jeffrey Racine, Liangjun Su, and Aman Ullah, contains the latest research on nonparametric and semiparametric econometrics and statistics. These data-driven models seek to replace the "classical" parametric models of the past, which were rigid and often linear. Chapters by leading international econometricians and statisticians highlight the interface between econometrics and statistical methods for nonparametric and semiparametric procedures. They provide a balanced view of new developments in the analysis and modeling of applied sciences with cross-section, time series, panel, and spatial data sets. The major topics of the volume include: the methodology of semiparametric models and special regressor methods; inverse, ill-posed, and well-posed problems; different methodologies related to additive models; sieve regression estimators, nonparametric and semiparametric regression models, and the true error of competing approximate models; support vector machines and their modeling of default probability; series estimation of stochastic processes and some of their applications in econometrics; identification, estimation, and specification problems in a class of semilinear time series models; nonparametric and semiparametric techniques applied to nonstationary or near nonstationary variables; the estimation of a set of regression equations; and a new approach to the analysis of nonparametric models with exogenous treatment assignment.
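As a concrete example of the kind of data-driven estimator this literature studies, the following Python sketch (an illustration, not drawn from the volume) implements the Nadaraya-Watson kernel regression estimator, which replaces a rigid parametric fit with a locally weighted average of the responses.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Nadaraya-Watson kernel regression: estimate E[y|x] at each point
    of x_eval as a Gaussian-kernel-weighted average with bandwidth h."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

# recover a smooth curve from noisy observations of sin(2*pi*x)
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 201)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, x.size)
fhat = nadaraya_watson(x, y, np.array([0.25]), h=0.05)
print(fhat)  # near sin(pi/2) = 1, up to smoothing bias and noise
```

The bandwidth h here is fixed by hand; in practice, data-driven bandwidth selection is itself a major theme of the nonparametric literature the volume surveys.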
This book explores new topics in modern research on empirical corporate finance and applied accounting, especially the econometric analysis of microdata. Dubbed "financial microeconometrics" by the author, this concept unites both methodological and applied approaches. The book examines how quantitative methods can be applied in corporate finance and accounting research in order to predict companies getting into financial distress. Presented in a clear and straightforward manner, it also suggests methods for linking corporate governance to financial performance, and discusses what the determinants of accounting disclosures are. Exploring these questions by way of numerous practical examples, this book is intended for researchers, practitioners and students who are not yet familiar with the variety of approaches available for data analysis and microeconometrics. "This book on financial microeconometrics is an excellent starting point for research in corporate finance and accounting. In my view, the text is positioned between a narrative and a scientific treatise. It is based on a vast amount of literature but is not overloaded with formulae. My appreciation of financial microeconometrics has very much increased. The book is well organized and properly written. I enjoyed reading it." Wolfgang Marty, Senior Investment Strategist, AgaNola AG
'Experiments in Organizational Economics' highlights the importance of replicating previous economic experiments. Replication enables experimental findings to be subjected to rigorous scrutiny. Despite this obvious advantage, direct replication remains relatively scant in economics. One possible explanation for this situation is that publication outlets favor novel work over tests of robustness. Readers will gain a better understanding of the role that replication plays in economic discovery as well as valuable insights into the robustness of previously reported findings.
Technical analysis holds that the best source of information for beating the market is the price itself. Introducing readers to technical analysis in a succinct and practical way, Ramlall focuses on the key aspects, benefits, drawbacks, and main tools of technical analysis. Chart Patterns, Point & Figure, Stochastics, Sentiment indicators, Elliott Wave Theory, RSI, R, Candlesticks and more are covered, including both the concepts and their practical applications. Also covering the programming of technical analysis tools, this book is a valuable resource for both researchers and practitioners.
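As an example of programming a technical analysis tool of the kind the book discusses, here is a minimal Python sketch of an RSI calculation. It is illustrative only and not taken from the book; it uses plain averages over the lookback window rather than Wilder's exponential smoothing.

```python
import numpy as np

def rsi(prices, period=14):
    """Relative Strength Index over the last `period` price changes
    (simple-average variant, not Wilder's smoothed version)."""
    deltas = np.diff(np.asarray(prices, dtype=float))
    gains = np.where(deltas > 0, deltas, 0.0)
    losses = np.where(deltas < 0, -deltas, 0.0)
    avg_gain = gains[-period:].mean()
    avg_loss = losses[-period:].mean()
    if avg_loss == 0:
        return 100.0  # no down moves in the window: maximally "overbought"
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)

print(rsi(range(1, 21)))  # a steadily rising series pins RSI at 100
```

A series that alternates up and down moves of equal size yields an RSI of 50, the indicator's neutral midpoint.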
This book presents selected peer-reviewed contributions from the International Conference on Time Series and Forecasting, ITISE 2018, held in Granada, Spain, on September 19-21, 2018. The first three parts of the book focus on the theory of time series analysis and forecasting, and discuss statistical methods, modern computational intelligence methodologies, econometric models, financial forecasting, and risk analysis. In turn, the last three parts are dedicated to applied topics and include papers on time series analysis in the earth sciences, energy time series forecasting, and time series analysis and prediction in other real-world problems. The book offers readers valuable insights into the different aspects of time series analysis and forecasting, allowing them to benefit both from its sophisticated and powerful theory, and from its practical applications, which address real-world problems in a range of disciplines. The ITISE conference series provides a valuable forum for scientists, engineers, educators and students to discuss the latest advances and implementations in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
This book surveys big data tools used in macroeconomic forecasting and addresses related econometric issues, including how to capture dynamic relationships among variables; how to select parsimonious models; how to deal with model uncertainty, instability, non-stationarity, and mixed frequency data; and how to evaluate forecasts, among others. Each chapter is self-contained with references, and provides solid background information, while also reviewing the latest advances in the field. Accordingly, the book offers a valuable resource for researchers, professional forecasters, and students of quantitative economics.
Contemporary economists, when analyzing economic behavior of people, need to use the diversity of research methods and modern ways of discovering knowledge. The increasing popularity of using economic experiments requires the use of IT tools and quantitative methods that facilitate the analysis of the research material obtained as a result of the experiments and the formulation of correct conclusions. This proceedings volume presents problems in contemporary economics and provides innovative solutions using a range of quantitative and experimental tools. Featuring selected contributions presented at the 2018 Computational Methods in Experimental Economics Conference (CMEE 2018), this book provides a modern economic perspective on such important issues as: sustainable development, consumption, production, national wealth, the silver economy, behavioral finance, economic and non-economic factors determining the behavior of household members, consumer preferences, social campaigns, and neuromarketing. International case studies are also offered.
In this Element and its accompanying second Element, A Practical Introduction to Regression Discontinuity Designs: Extensions, Matias Cattaneo, Nicolas Idrobo, and Rocío Titiunik provide an accessible and practical guide for the analysis and interpretation of regression discontinuity (RD) designs that encourages the use of a common set of practices and facilitates the accumulation of RD-based empirical evidence. In this Element, the authors discuss the foundations of the canonical Sharp RD design, which has the following features: (i) the score is continuously distributed and has only one dimension, (ii) there is only one cutoff, and (iii) compliance with the treatment assignment is perfect. In the second Element, the authors discuss practical and conceptual extensions to this basic RD setup.
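The canonical Sharp RD design described above can be sketched numerically. The toy Python example below is not taken from the Element; the data-generating process and the fixed bandwidth are assumptions for illustration. It simulates a sharp cutoff with perfect compliance and estimates the treatment effect as the jump between two local linear fits at the cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)
n, cutoff, effect = 2000, 0.0, 2.0

x = rng.uniform(-1.0, 1.0, n)          # running variable (the score)
treated = x >= cutoff                  # sharp RD: treatment assigned exactly at the cutoff
y = 1.0 + 0.5 * x + effect * treated + rng.normal(0.0, 0.3, n)

h = 0.25                               # bandwidth, fixed by hand (not data-driven)
left = (x < cutoff) & (x >= cutoff - h)
right = treated & (x <= cutoff + h)

# local linear fits on each side; np.polyfit returns [slope, intercept],
# and centering at the cutoff makes each intercept the fit's value there
b_left = np.polyfit(x[left] - cutoff, y[left], 1)
b_right = np.polyfit(x[right] - cutoff, y[right], 1)
tau_hat = b_right[1] - b_left[1]       # jump in the conditional mean at the cutoff
print(round(tau_hat, 2))               # close to the true effect of 2.0
```

In practice the bandwidth would be chosen by a data-driven procedure and inference would use robust bias-corrected methods, both of which are central topics of the Element.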
A unique and comprehensive source of information, this book is the only international publication providing economists, planners, policymakers and business people with worldwide statistics on current performance and trends in the manufacturing sector. The Yearbook is designed to facilitate international comparisons relating to manufacturing activity and industrial development and performance. It provides data which can be used to analyse patterns of growth and related long term trends, structural change and industrial performance in individual industries. Statistics on employment patterns, wages, consumption and gross output and other key indicators are also presented.
Bjørn Lomborg, a former member of Greenpeace, challenges widely held beliefs that the world environmental situation is getting worse and worse in his new book, The Skeptical Environmentalist. Using statistical information from internationally recognized research institutes, Lomborg systematically examines a range of major environmental issues that feature prominently in headline news around the world, including pollution, biodiversity, fear of chemicals, and the greenhouse effect, and documents that the world has actually improved. He supports his arguments with over 2500 footnotes, allowing readers to check his sources. Lomborg criticizes the way many environmental organizations make selective and misleading use of scientific evidence and argues that we are making decisions about the use of our limited resources based on inaccurate or incomplete information. Concluding that there are more reasons for optimism than pessimism, he stresses the need for clear-headed prioritization of resources to tackle real, not imagined, problems. The Skeptical Environmentalist offers readers a non-partisan evaluation that serves as a useful corrective to the more alarmist accounts favored by campaign groups and the media. Bjørn Lomborg is an associate professor of statistics in the Department of Political Science at the University of Aarhus. When he started to investigate the statistics behind the current gloomy view of the environment, he was genuinely surprised. He published four lengthy articles in the leading Danish newspaper, including statistics documenting an ever-improving world, and unleashed the biggest post-war debate with more than 400 articles in all the major papers. Since then, Lomborg has been a frequent participant in the European debate on environmentalism on television, radio, and in newspapers.
This state-of-the-art account unifies material developed in journal articles over the last 35 years, with two central thrusts: It describes a broad class of system models that the authors call 'stochastic processing networks' (SPNs), which include queueing networks and bandwidth sharing networks as prominent special cases; and in that context it explains and illustrates a method for stability analysis based on fluid models. The central mathematical result is a theorem that can be paraphrased as follows: If the fluid model derived from an SPN is stable, then the SPN itself is stable. Two topics discussed in detail are (a) the derivation of fluid models by means of fluid limit analysis, and (b) stability analysis for fluid models using Lyapunov functions. With regard to applications, there are chapters devoted to max-weight and back-pressure control, proportionally fair resource allocation, data center operations, and flow management in packet networks. Geared toward researchers and graduate students in engineering and applied mathematics, especially in electrical engineering and computer science, this compact text gives readers full command of the methods.
A comprehensive account of economic size distributions around the world and throughout the years In the course of the past 100 years, economists and applied statisticians have developed a remarkably diverse variety of income distribution models, yet no single resource convincingly accounts for all of these models, analyzing their strengths and weaknesses, similarities and differences. Statistical Size Distributions in Economics and Actuarial Sciences is the first collection to systematically investigate a wide variety of parametric models that deal with income, wealth, and related notions. Christian Kleiber and Samuel Kotz survey, complement, compare, and unify all of the disparate models of income distribution, highlighting at times a lack of coordination between them that can result in unnecessary duplication. Considering models from eight languages and all continents, the authors discuss the social and economic implications of each as well as distributions of size of loss in actuarial applications. Specific models covered include:
Three appendices provide brief biographies of some of the leading players along with the basic properties of each of the distributions. Actuaries, economists, market researchers, social scientists, and physicists interested in econophysics will find Statistical Size Distributions in Economics and Actuarial Sciences to be a truly one-of-a-kind addition to the professional literature.
This book examines whether continuous-time models in frictionless financial economies can be well approximated by discrete-time models. It specifically looks to answer the question: in what sense and to what extent does the famous Black-Scholes-Merton (BSM) continuous-time model of financial markets idealize more realistic discrete-time models of those markets? While it is well known that the BSM model is an idealization of discrete-time economies where the stock price process is driven by a binomial random walk, it is less known that the BSM model idealizes discrete-time economies whose stock price process is driven by more general random walks. Starting with the basic foundations of discrete-time and continuous-time models, David M. Kreps takes the reader through to this important insight with the goal of lowering the entry barrier for many mainstream financial economists, thus bringing less-technical readers to a better understanding of the connections between BSM and nearby discrete-time economies.
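The binomial-to-BSM convergence mentioned above can be illustrated with a small sketch. The Python code below is an illustration, not Kreps's construction: it prices a European call both with the BSM formula and with a Cox-Ross-Rubinstein binomial random walk, and the discrete-time price approaches the continuous-time one as the number of steps grows.

```python
import math

def bsm_call(S, K, r, sigma, T):
    """Black-Scholes-Merton price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def crr_call(S, K, r, sigma, T, n):
    """Cox-Ross-Rubinstein binomial-tree price with n steps."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))          # up factor
    d = 1.0 / u                                  # down factor
    p = (math.exp(r * dt) - d) / (u - d)         # risk-neutral up probability
    price = 0.0
    for k in range(n + 1):                       # expected discounted payoff
        prob = math.comb(n, k) * p**k * (1.0 - p) ** (n - k)
        price += prob * max(S * u**k * d ** (n - k) - K, 0.0)
    return math.exp(-r * T) * price

bs = bsm_call(100, 100, 0.05, 0.2, 1.0)
tree = crr_call(100, 100, 0.05, 0.2, 1.0, 500)
print(round(bs, 2), round(tree, 2))  # the two prices agree to about a cent
```

The binomial price converges to the BSM price at rate O(1/n) in the number of steps, which is the elementary case of the idealization the book generalizes to broader classes of random walks.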
An accessible, contemporary introduction to the methods for determining cause and effect in the social sciences "Causation versus correlation has been the basis of arguments-economic and otherwise-since the beginning of time. Causal Inference: The Mixtape uses legit real-world examples that I found genuinely thought-provoking. It's rare that a book prompts readers to expand their outlook; this one did for me."-Marvin Young (Young MC) Causal inference encompasses the tools that allow social scientists to determine what causes what. In a messy world, causal inference is what helps establish the causes and effects of the actions being studied-for example, the impact (or lack thereof) of increases in the minimum wage on employment, the effects of early childhood education on incarceration later in life, or the influence on economic growth of introducing malaria nets in developing regions. Scott Cunningham introduces students and practitioners to the methods necessary to arrive at meaningful answers to the questions of causation, using a range of modeling techniques and coding instructions for both the R and the Stata programming languages.
Quantile regression constitutes an ensemble of statistical techniques intended to estimate and draw inferences about conditional quantile functions. Median regression, as introduced in the 18th century by Boscovich and Laplace, is a special case. In contrast to conventional mean regression, which minimizes sums of squared residuals, median regression minimizes sums of absolute residuals; quantile regression simply replaces the symmetric absolute loss by an asymmetric linear loss. Since its introduction in the 1970s by Koenker and Bassett, quantile regression has been gradually extended to a wide variety of data analytic settings including time series, survival analysis, and longitudinal data. By focusing attention on local slices of the conditional distribution of response variables, it is capable of providing a more complete, more nuanced view of heterogeneous covariate effects. Applications of quantile regression can now be found throughout the sciences, including astrophysics, chemistry, ecology, economics, finance, genomics, medicine, and meteorology. Software for quantile regression is now widely available in all the major statistical computing environments. The objective of this volume is to provide a comprehensive review of recent developments of quantile regression methodology, illustrating its applicability in a wide range of scientific settings. The intended audience of the volume is researchers and graduate students across a diverse set of disciplines.
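The asymmetric linear ("pinball") loss described above can be demonstrated in a few lines. The Python sketch below is illustrative and not from the volume: it shows that minimizing the summed pinball loss over a constant recovers the corresponding sample quantile, with tau = 0.5 giving the median.

```python
import numpy as np

def pinball_loss(u, tau):
    """Asymmetric linear loss: tau * u for u >= 0, (tau - 1) * u for u < 0.
    At tau = 0.5 this is half the absolute loss of median regression."""
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

def fit_constant_quantile(y, tau):
    """Minimize the summed pinball loss over a constant; the minimizer
    is always attained at one of the observed values, so search those."""
    grid = np.sort(y)
    losses = [pinball_loss(y - c, tau).sum() for c in grid]
    return grid[int(np.argmin(losses))]

y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
print(fit_constant_quantile(y, 0.5))  # 3.0, the sample median
```

Replacing the constant with a linear predictor and the grid search with linear programming yields the Koenker-Bassett regression quantile estimator; note how the outlier at 100 leaves the median fit untouched, the robustness property that motivates the whole approach.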
New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, and signal extraction. They then move on to advanced topics, focusing on heteroscedastic models, nonlinear time series models, Bayesian time series analysis, nonparametric time series analysis, and neural networks. Multivariate time series coverage includes presentations on vector ARMA models, cointegration, and multivariate linear systems. Special features include:
Requiring no previous knowledge of the subject, A Course in Time Series Analysis is an important reference and a highly useful resource for researchers and practitioners in statistics, economics, business, engineering, and environmental analysis.