Welcome to Loot.co.za!
Books > Business & Economics > Economics > Econometrics
From the 1980s onward, income inequality increased in many advanced countries. It is very difficult to account for the rise in income inequality using the standard labour supply/demand explanation. Fiscal redistribution has become less effective in compensating increasing inequalities since the 1990s. Some of the basic features of redistribution can be explained through the optimal tax framework developed by J. A. Mirrlees in 1971. This Element surveys some of the earlier results in linear and nonlinear taxation and produces some new numerical results. Given the key role of capital income in overall income inequality, it also considers the optimal taxation of capital income. It examines empirically the relationship between the extent of redistribution and the components of the Mirrlees framework. The redistributive roles of factors such as publicly provided private goods, public employment, endogenous wages in the overlapping generations model, and income uncertainty are also analysed.
In How to Make the World Add Up, Tim Harford draws on his experience as both an economist and presenter of the BBC's radio show 'More or Less' to take us deep into the world of disinformation and obfuscation, bad research and misplaced motivation to find those priceless jewels of data and analysis that make communicating with numbers so rewarding. Through vivid storytelling he reveals how we can evaluate the claims that surround us with confidence, curiosity and a healthy level of scepticism. It is a must-read for anyone who cares about understanding the world around them.
In recent years, interest in rigorous impact evaluation has grown tremendously in policy-making, economics, public health, social sciences and international relations. Evidence-based policy-making has become a recurring theme in public policy, alongside greater demands for accountability in public policies and public spending, and requests for independent and rigorous impact evaluations for policy evidence. Froelich and Sperlich offer a comprehensive and up-to-date approach to quantitative impact evaluation analysis, also known as causal inference or treatment effect analysis, illustrating the main approaches for identification and estimation: experimental studies, randomization inference and randomized control trials (RCTs), matching and propensity score matching and weighting, instrumental variable estimation, difference-in-differences, regression discontinuity designs, quantile treatment effects, and evaluation of dynamic treatments. The book is designed for economics graduate courses but can also serve as a manual for professionals in research institutes, governments, and international organizations, evaluating the impact of a wide range of public policies in health, environment, transport and economic development.
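Among the identification strategies listed above, difference-in-differences is perhaps the simplest to state: the treatment effect is estimated as the change in the treated group's mean outcome minus the change in the control group's mean outcome. A minimal sketch (with illustrative toy data and function names, not code from the book):

```python
# Difference-in-differences on a toy panel.  The estimator subtracts the
# control group's pre/post change (the common trend) from the treated
# group's pre/post change, leaving the treatment effect.
# Real applications would typically run a regression with group and
# period fixed effects; this is only the two-by-two means version.

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Each argument is a list of outcomes for one group-period cell."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

# Toy data: both groups trend upward by 1; treatment adds 2 on top.
treat_pre  = [10.0, 11.0, 9.0]
treat_post = [13.0, 14.0, 12.0]   # +1 common trend, +2 treatment effect
ctrl_pre   = [8.0, 9.0, 10.0]
ctrl_post  = [9.0, 10.0, 11.0]    # +1 common trend only

print(diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post))  # → 2.0
```

The key identifying assumption, which the book's treatment makes precise, is that absent treatment both groups would have followed the same trend.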
Focusing on Bayesian approaches and computations using analytic and simulation-based methods for inference, Time Series: Modeling, Computation, and Inference, Second Edition integrates mainstream approaches for time series modeling with significant recent developments in methodology and applications of time series analysis. It encompasses a graduate-level account of Bayesian time series modeling, analysis and forecasting, a broad range of references to state-of-the-art approaches to univariate and multivariate time series analysis, and engages with research frontiers in multivariate time series modeling and forecasting. It presents overviews of several classes of models and related methodology for inference, statistical computation for model fitting and assessment, and forecasting. It explores the connections between time- and frequency-domain approaches and develops various models and analyses using Bayesian formulations and computation, including computations based on Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) methods. It illustrates the models and methods with examples and case studies from a variety of fields, including signal processing, biomedicine, environmental science, and finance. Along with core models and methods, the book represents state-of-the-art approaches to analysis and forecasting in challenging time series problems. It also demonstrates the growth of time series analysis into new application areas in recent years, and connects with recent and relevant modeling developments and research challenges. New in the second edition: Expanded coverage of core model theory and methodology. Multiple new examples and exercises. Detailed development of dynamic factor models. Updated discussion and connections with recent and current research frontiers.
Modern economies are full of uncertainties and risk. Economics studies resource allocations in an uncertain market environment. As a generally applicable quantitative analytic tool for uncertain events, probability and statistics have been playing an important role in economic research. Econometrics is statistical analysis of economic and financial data. In the past four decades or so, economics has witnessed a so-called 'empirical revolution' in its research paradigm, and as the main methodology in empirical studies in economics, econometrics has been playing an important role. It has become an indispensable part of training in modern economics, business and management. This book develops a coherent set of econometric theory, methods and tools for economic models. It is written as a textbook for graduate students in economics, business, management, statistics, applied mathematics, and related fields. It can also be used as a reference book on econometric theory by scholars who may be interested in both theoretical and applied econometrics.
The "Theory of Macrojustice", introduced by S.-C. Kolm, is a stimulating contribution to the debate on the macroeconomic income distribution. The solution called "Equal Labour Income Equalisation" (ELIE) is the result of a three-stage construction: collective agreement on the scheme of labour income redistribution, collective agreement on the degree of equalisation to be chosen in that framework, and individual freedom to exploit his or her personal productive capacities (the source of labour income and the sole basis for taxation). This book is organised as a discussion around four complementary themes: philosophical aspects of macrojustice, economic analysis of macrojustice, combination of ELIE with other targeted transfers, and econometric evaluations of ELIE.
Bayesian econometric methods have enjoyed an increase in popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. This handbook is a single source for researchers and policymakers wanting to learn about Bayesian methods in specialized fields, and for graduate students seeking to make the final step from textbook learning to the research frontier. It contains contributions by leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing. It reviews the state of the art in Bayesian econometric methodology, with chapters on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools used by Bayesian time series econometricians such as state space models and particle filtering. It also includes chapters on Bayesian principles and methodology.
Packed with insights, Lorenzo Bergomi's Stochastic Volatility Modeling explains how stochastic volatility is used to address issues arising in the modeling of derivatives, including: Which trading issues do we tackle with stochastic volatility? How do we design models and assess their relevance? How do we tell which models are usable and when does calibration make sense? This manual covers the practicalities of modeling local volatility, stochastic volatility, local-stochastic volatility, and multi-asset stochastic volatility. In the course of this exploration, the author, Risk's 2009 Quant of the Year and a leading contributor to volatility modeling, draws on his experience as head quant in Societe Generale's equity derivatives division. Clear and straightforward, the book takes readers through various modeling challenges, all originating in actual trading/hedging issues, with a focus on the practical consequences of modeling choices.
Born of a belief that economic insights should not require much mathematical sophistication, this book proposes novel and parsimonious methods to incorporate ignorance and uncertainty into economic modeling, without complex mathematics. Economics has made great strides over the past several decades in modeling agents' decisions when they are incompletely informed, but many economists believe that there are aspects of these models that are less than satisfactory. Among the concerns are that ignorance is not captured well in most models, that agents' presumed cognitive ability is implausible, and that derived optimal behavior is sometimes driven by the fine details of the model rather than the underlying economics. Compte and Postlewaite lay out a tractable way to address these concerns, and to incorporate plausible limitations on agents' sophistication. A central aspect of the proposed methodology is to restrict the strategies assumed available to agents.
Over the past two decades, experimental economics has moved from a fringe activity to become a standard tool for empirical research. With experimental economics now regarded as part of the basic tool-kit for applied economics, this book demonstrates how controlled experiments can be useful in providing evidence relevant to economic research. Professors Jacquemet and L'Haridon take the standard model in applied econometrics as a basis for the methodology of controlled experiments. Methodological discussions are illustrated with standard experimental results. This book provides future experimental practitioners with the means to construct experiments that fit their research question, and newcomers with an understanding of the strengths and weaknesses of controlled experiments. Graduate students and academic researchers working in the field of experimental economics will be able to learn how to undertake, understand and criticise empirical research based on lab experiments, and to refer to specific experiments, results or designs, complemented with case study applications.
Quantile regression constitutes an ensemble of statistical techniques intended to estimate and draw inferences about conditional quantile functions. Median regression, as introduced in the 18th century by Boscovich and Laplace, is a special case. In contrast to conventional mean regression that minimizes sums of squared residuals, median regression minimizes sums of absolute residuals; quantile regression simply replaces symmetric absolute loss by asymmetric linear loss. Since its introduction in the 1970s by Koenker and Bassett, quantile regression has been gradually extended to a wide variety of data analytic settings including time series, survival analysis, and longitudinal data. By focusing attention on local slices of the conditional distribution of response variables it is capable of providing a more complete, more nuanced view of heterogeneous covariate effects. Applications of quantile regression can now be found throughout the sciences, including astrophysics, chemistry, ecology, economics, finance, genomics, medicine, and meteorology. Software for quantile regression is now widely available in all the major statistical computing environments. The objective of this volume is to provide a comprehensive review of recent developments of quantile regression methodology illustrating its applicability in a wide range of scientific settings. The intended audience of the volume is researchers and graduate students across a diverse set of disciplines.
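The asymmetric linear ("pinball") loss described above can be sketched directly. In the minimal illustration below (illustrative names, no covariates, so this recovers a sample quantile rather than a full quantile regression), minimizing the summed loss over a constant yields the tau-th sample quantile; tau = 0.5 reduces to least absolute deviations, i.e. the median:

```python
# Quantile estimation via the asymmetric linear ("pinball") loss:
#   rho_tau(u) = u * (tau - 1{u < 0})
# Minimizing sum_i rho_tau(y_i - c) over a constant c yields the tau-th
# sample quantile.  Since the objective is piecewise linear and convex,
# a minimizer is attained at one of the data points, so searching over
# the observed values suffices for this sketch.

def pinball_loss(u, tau):
    return u * (tau - (1.0 if u < 0 else 0.0))

def quantile_by_loss(ys, tau):
    """Return the observed value minimizing the summed pinball loss."""
    return min(ys, key=lambda c: sum(pinball_loss(y - c, tau) for y in ys))

data = [1.0, 2.0, 3.0, 4.0, 100.0]  # the outlier barely moves the median
print(quantile_by_loss(data, 0.5))   # → 3.0 (the sample median)
print(quantile_by_loss(data, 0.75))  # → 4.0 (the upper quartile)
```

Tilting the loss (tau above or below 0.5) is exactly what lets the method target different slices of the conditional distribution.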
To fully function in today's global real estate industry, students and professionals increasingly need to understand how to implement essential and cutting-edge quantitative techniques. This book presents an easy-to-read guide to applying quantitative analysis in real estate aimed at non-cognate undergraduate and masters students, and meets the requirements of modern professional practice. Through case studies and examples illustrating applications using data sourced from dedicated real estate information providers and major firms in the industry, the book provides an introduction to the foundations underlying statistical data analysis, common data manipulations and understanding descriptive statistics, before gradually building up to more advanced quantitative analysis, modelling and forecasting of real estate markets. Our examples and case studies within the chapters have been specifically compiled for this book and explicitly designed to help the reader acquire a better understanding of the quantitative methods addressed in each chapter. Our objective is to equip readers with the skills needed to confidently carry out their own quantitative analysis and be able to interpret empirical results from academic work and practitioner studies in the field of real estate and in other asset classes. Both undergraduate and masters level students, as well as real estate analysts in the professions, will find this book to be essential reading.
This is a thorough exploration of the models and methods of financial econometrics by one of the world's leading financial econometricians and is for students in economics, finance, statistics, mathematics, and engineering who are interested in financial applications. Based on courses taught around the world, the up-to-date content covers developments in econometrics and finance over the last twenty years while ensuring a solid grounding in the fundamental principles of the field. Care has been taken to link theory and application to provide real-world context for students. Worked exercises and empirical examples have also been included to make sure complicated concepts are solidly explained and understood.
This second edition retains the positive features of being clearly written, well organized, and incorporating calculus in the text, while adding expanded coverage on game theory, experimental economics, and behavioural economics. It remains more focused and manageable than similar textbooks, and provides a concise yet comprehensive treatment of the core topics of microeconomics, including theories of the consumer and of the firm, market structure, partial and general equilibrium, and market failures caused by public goods, externalities and asymmetric information. The book includes helpful solved problems in all the substantive chapters, as well as over seventy new mathematical exercises and enhanced versions of the ones in the first edition. The authors make use of the book's full color with sharp and helpful graphs and illustrations. This mathematically rigorous textbook is meant for students at the intermediate level who have already had an introductory course in microeconomics, and a calculus course.
Computational finance is increasingly important in the financial industry, as a necessary instrument for applying theoretical models to real-world challenges. Indeed, many models used in practice involve complex mathematical problems, for which an exact or a closed-form solution is not available. Consequently, we need to rely on computational techniques and specific numerical algorithms. This book combines theoretical concepts with practical implementation. Furthermore, the numerical solution of models is exploited, both to enhance the understanding of some mathematical and statistical notions, and to acquire sound programming skills in MATLAB (R), which is useful for several other programming languages also. The material assumes the reader has a relatively limited knowledge of mathematics, probability, and statistics. Hence, the book contains a short description of the fundamental tools needed to address the two main fields of quantitative finance: portfolio selection and derivatives pricing. Both fields are developed here, with a particular emphasis on portfolio selection, where the author includes an overview of recent approaches. The book gradually takes the reader from a basic to medium level of expertise by using examples and exercises to simplify the understanding of complex models in finance, giving them the ability to place financial models in a computational setting. The book is ideal for courses focusing on quantitative finance, asset management, mathematical methods for economics and finance, investment banking, and corporate finance.
Algorithmic Trading and Quantitative Strategies provides an in-depth overview of this growing field with a unique mix of quantitative rigor and practitioner's hands-on experience. The focus on empirical modeling and practical know-how makes this book a valuable resource for students and professionals. The book starts with the often overlooked context of why and how we trade via a detailed introduction to market structure and quantitative microstructure models. The authors then present the necessary quantitative toolbox including more advanced machine learning models needed to successfully operate in the field. They next discuss the subject of quantitative trading, alpha generation, active portfolio management and more recent topics like news and sentiment analytics. The last main topic of execution algorithms is covered in detail with emphasis on the state of the field and critical topics including the elusive concept of market impact. The book concludes with a discussion of the technology infrastructure necessary to implement algorithmic strategies in large-scale production settings. A GitHub repository includes data sets and explanatory/exercise Jupyter notebooks. The exercises involve adding the correct code to solve the particular analysis/problem.
Game theory has revolutionised our understanding of industrial organisation and the traditional theory of the firm. Despite these advances, industrial economists have tended to rely on a restricted set of tools from game theory, focusing on static and repeated games to analyse firm structure and behaviour. Luca Lambertini, a leading expert on the application of differential game theory to economics, argues that many dynamic phenomena in industrial organisation (such as monopoly, oligopoly, advertising, R&D races) can be better understood and analysed through the use of differential games. After illustrating the basic elements of the theory, Lambertini guides the reader through the main models, spanning from optimal control problems describing the behaviour of a monopolist through to oligopoly games in which firms' strategies include prices, quantities and investments. This approach will be of great value to students and researchers in economics and those interested in advanced applications of game theory.
Random set theory is a fascinating branch of mathematics that amalgamates techniques from topology, convex geometry, and probability theory. Social scientists routinely conduct empirical work with data and modelling assumptions that reveal a set to which the parameter of interest belongs, but not its exact value. Random set theory provides a coherent mathematical framework to conduct identification analysis and statistical inference in this setting and has become a fundamental tool in econometrics and finance. This is the first book dedicated to the use of the theory in econometrics, written to be accessible for readers without a background in pure mathematics. Molchanov and Molinari define the basics of the theory and illustrate the mathematical concepts by their application in the analysis of econometric models. The book includes sets of exercises to accompany each chapter as well as examples to help readers apply the theory effectively.
Interest in the skew-normal and related families of distributions has grown enormously over recent years, as theory has advanced, challenges of data have grown, and computational tools have made substantial progress. This comprehensive treatment, blending theory and practice, will be the standard resource for statisticians and applied researchers. Assuming only basic knowledge of (non-measure-theoretic) probability and statistical inference, the book is accessible to the wide range of researchers who use statistical modelling techniques. Guiding readers through the main concepts and results, it covers both the probability and the statistics sides of the subject, in the univariate and multivariate settings. The theoretical development is complemented by numerous illustrations and applications to a range of fields including quantitative finance, medical statistics, environmental risk studies, and industrial and business efficiency. The author's freely available R package sn, available from CRAN, equips readers to put the methods into action with their own data.
In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, labour markets, and economic growth.
Structural vector autoregressive (VAR) models are important tools for empirical work in macroeconomics, finance, and related fields. This book not only reviews the many alternative structural VAR approaches discussed in the literature, but also highlights their pros and cons in practice. It provides guidance to empirical researchers as to the most appropriate modeling choices, methods of estimating, and evaluating structural VAR models. The book traces the evolution of the structural VAR methodology and contrasts it with other common methodologies, including dynamic stochastic general equilibrium (DSGE) models. It is intended as a bridge between the often quite technical econometric literature on structural VAR modeling and the needs of empirical researchers. The focus is not on providing the most rigorous theoretical arguments, but on enhancing the reader's understanding of the methods in question and their assumptions. Empirical examples are provided for illustration.
Now in its fifth edition, this book offers a detailed yet concise introduction to the growing field of statistical applications in finance. The reader will learn the basic methods for evaluating option contracts, analyzing financial time series, selecting portfolios and managing risks based on realistic assumptions about market behavior. The focus is both on the fundamentals of mathematical finance and financial time series analysis, and on applications to specific problems concerning financial markets, thus making the book the ideal basis for lectures, seminars and crash courses on the topic. All numerical calculations are transparent and reproducible using quantlets. For this new edition the book has been updated and extensively revised and now includes several new aspects such as neural networks, deep learning, and crypto-currencies. Both R and Matlab code, together with the data, can be downloaded from the book's product page and the Quantlet platform. The Quantlet platform quantlet.de, quantlet.com, quantlet.org is an integrated QuantNet environment consisting of different types of statistics-related documents and program codes. Its goal is to promote reproducibility and offer a platform for sharing validated knowledge native to the social web. QuantNet and the corresponding Data-Driven Documents-based visualization allow readers to reproduce the tables, pictures and calculations inside this Springer book. "This book provides an excellent introduction to the tools from probability and statistics necessary to analyze financial data. Clearly written and accessible, it will be very useful to students and practitioners alike." Yacine Ait-Sahalia, Otto Hack 1903 Professor of Finance and Economics, Princeton University
This is the first of two volumes containing papers and commentaries presented at the Eleventh World Congress of the Econometric Society, held in Montreal, Canada in August 2015. These papers provide state-of-the-art guides to the most important recent research in economics. The book includes surveys and interpretations of key developments in economics and econometrics, and discussion of future directions for a wide variety of topics, covering both theory and application. These volumes provide a unique, accessible survey of progress in the discipline, written by leading specialists in their fields. The first volume includes theoretical and applied papers addressing topics such as dynamic mechanism design, agency problems, and networks.
Factor models have become the most successful tool in the analysis and forecasting of high-dimensional time series. This monograph provides an extensive account of the so-called General Dynamic Factor Model methods. The topics covered include: asymptotic representation problems, estimation, forecasting, identification of the number of factors, identification of structural shocks, volatility analysis, and applications to macroeconomic and financial data.