Books > Business & Economics > Economics > Econometrics > General
Coverage has been extended to include recent topics. The book again presents a unified treatment of econometric theory, with the method of maximum likelihood playing a key role in both estimation and testing. Exercises are included, and the book is suitable as a general text for final-year undergraduate and postgraduate students.
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
Decision-theoretic ideas can structure the process of inference together with the decision-making that inference supports. Statistical decision theory is the sub-discipline of statistics which explores and develops this structure. Typically, discussion of decision theory within one discipline does not recognise that other disciplines may have considered the same or similar problems. This text, Volume 9 in the prestigious Kendall's Library of Statistics, provides an overview of the main ideas and concepts of statistical decision theory and sets it within the broader context of decision theory, decision analysis and decision support as they are practised in many disciplines beyond statistics - including artificial intelligence, economics, operational research, philosophy and psychology.
In nonparametric and high-dimensional statistical models, the classical Gauss-Fisher-Le Cam theory of the optimality of maximum likelihood estimators and Bayesian posterior inference does not apply, and new foundations and ideas have been developed in the past several decades. This book gives a coherent account of the statistical theory in infinite-dimensional parameter spaces. The mathematical foundations include self-contained 'mini-courses' on the theory of Gaussian and empirical processes, approximation and wavelet theory, and the basic theory of function spaces. The theory of statistical inference in such models - hypothesis testing, estimation and confidence sets - is presented within the minimax paradigm of decision theory. This includes the basic theory of convolution kernel and projection estimation, but also Bayesian nonparametrics and nonparametric maximum likelihood estimation. In a final chapter the theory of adaptive inference in nonparametric models is developed, including Lepski's method, wavelet thresholding, and adaptive inference for self-similar functions. Winner of the 2017 PROSE Award for Mathematics.
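As a rough illustration of the convolution kernel estimators mentioned in this blurb, the short Python sketch below implements a Gaussian kernel density estimate. The function name kde, the simulated data and the bandwidth h = 0.3 are our own illustrative choices, not taken from the book.

```python
import numpy as np

def kde(x_grid, data, h):
    """Gaussian convolution kernel density estimate on a grid:
    f_hat(x) = (1/(n*h)) * sum_i K((x - X_i)/h), K the standard normal pdf."""
    u = (x_grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
sample = rng.normal(size=500)            # toy data: 500 standard normal draws
grid = np.linspace(-4, 4, 81)
density = kde(grid, sample, h=0.3)       # estimated density on the grid
```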
Computable general equilibrium (CGE) models play an important role in supporting public-policy making on such issues as trade, climate change and taxation. This significantly revised volume, keeping pace with the next-generation standard CGE model, is the only undergraduate-level introduction of its kind. The volume utilizes a graphical approach to explain the economic theory underlying a CGE model, and provides results from simple, small-scale CGE models to illustrate the links between theory and model outcomes. Its eleven hands-on exercises introduce modelling techniques that are applied to real-world economic problems. Students learn how to integrate their separate fields of economic study into a comprehensive, general equilibrium perspective as they develop their skills as producers or consumers of CGE-based analysis.
Including contributions spanning a variety of theoretical and applied topics in econometrics, this volume of Advances in Econometrics is published in honour of Cheng Hsiao. In the first few chapters of this book, new theoretical panel and time series results are presented, exploring JIVE estimators, HAC, HAR and various sandwich estimators, as well as asymptotic distributions for using information criteria to distinguish between the unit root model and explosive models. Other chapters address topics such as structural breaks or growth empirics; auction models; and semiparametric methods testing for common vs. individual trends. Three chapters provide novel empirical approaches to applied problems, such as estimating the impact of survey mode on responses, or investigating how cross-sectional and spatial dependence of mortgages varies by default rates and geography. In the final chapters, Cheng Hsiao offers a forward-focused discussion of the role of big data in economics. For any researcher of econometrics, this is an unmissable volume of the most current and engaging research in the field.
Nanak Kakwani and Hyun Hwa Son make use of social welfare functions to derive indicators of development relevant to specific social objectives, such as poverty- and inequality-reduction. Arguing that the measurement of development cannot be value-free, the authors assert that if indicators of development are to have policy relevance, they must be assessed on the basis of the social objectives in question. This study develops indicators that are sensitive to both the level and the distribution of individuals' capabilities. The idea of the social welfare function, defined in income space, is extended to the concept of the social well-being function, defined in capability space. Through empirical analysis from selected developing countries, with a particular focus on Brazil, the authors shape techniques appropriate to the analysis of development in different dimensions. The focus of this evidence-based policy analysis is to evaluate alternative policies affecting the capacities of people to enjoy a better life.
This book examines whether continuous-time models in frictionless financial economies can be well approximated by discrete-time models. It specifically looks to answer the question: in what sense and to what extent does the famous Black-Scholes-Merton (BSM) continuous-time model of financial markets idealize more realistic discrete-time models of those markets? While it is well known that the BSM model is an idealization of discrete-time economies where the stock price process is driven by a binomial random walk, it is less well known that the BSM model idealizes discrete-time economies whose stock price process is driven by more general random walks. Starting with the basic foundations of discrete-time and continuous-time models, David M. Kreps takes the reader through to this important insight with the goal of lowering the entry barrier for many mainstream financial economists, thus bringing less-technical readers to a better understanding of the connections between BSM and nearby discrete-time economies.
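To make the binomial-to-BSM idealization concrete, here is a minimal sketch (our own construction, not code from the book) comparing an n-step Cox-Ross-Rubinstein binomial-tree price with the BSM closed-form price of a European call; as n grows, the tree price converges to the BSM value. All parameter values are arbitrary.

```python
import numpy as np
from math import comb, erf, exp, log, sqrt

def bsm_call(S, K, r, sigma, T):
    # Black-Scholes-Merton closed-form price of a European call
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))   # standard normal CDF
    return S * N(d1) - K * exp(-r * T) * N(d2)

def crr_call(S, K, r, sigma, T, n):
    # n-step Cox-Ross-Rubinstein binomial tree for the same call
    dt = T / n
    u, d = exp(sigma * sqrt(dt)), exp(-sigma * sqrt(dt))
    p = (exp(r * dt) - d) / (u - d)                  # risk-neutral up-probability
    j = np.arange(n + 1)
    terminal = S * u**j * d**(n - j)                 # terminal stock prices
    payoff = np.maximum(terminal - K, 0.0)
    probs = np.array([comb(n, k) for k in j]) * p**j * (1 - p)**(n - j)
    return exp(-r * T) * float(probs @ payoff)

print(bsm_call(100, 100, 0.05, 0.2, 1.0))        # about 10.45
print(crr_call(100, 100, 0.05, 0.2, 1.0, 500))   # close to the BSM price
```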
This book offers a useful combination of probabilistic and statistical tools for analyzing nonlinear time series. Key features of the book include a study of the extremal behavior of nonlinear time series and a comprehensive list of nonlinear models that address different aspects of nonlinearity. Several inferential methods, including quasi-likelihood methods, sequential Markov chain Monte Carlo methods and particle filters, are also included so as to provide an overall view of the available tools for parameter estimation for nonlinear models. A chapter on integer-valued time series models based on several thinning operations, which brings together all recent advances made in this area, is also included. Readers should have attended a prior course on linear time series, and a good grasp of simulation-based inferential methods is recommended. This book offers a valuable resource for second-year graduate students and researchers in statistics and other scientific areas who need a basic understanding of nonlinear time series.
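The thinning-based integer-valued models mentioned above can be illustrated in a few lines of Python: a Poisson INAR(1) process, in which binomial thinning plays the role that multiplication plays in an ordinary AR(1). The parameter values below are our own illustrative choices.

```python
import numpy as np

# INAR(1) via binomial thinning: X_t = alpha ∘ X_{t-1} + eps_t, where
# alpha ∘ X counts "survivors" (each of the X counts kept with prob. alpha)
# and eps_t are i.i.d. Poisson(lam) innovations.
rng = np.random.default_rng(0)
alpha, lam, T = 0.6, 2.0, 300
x = np.zeros(T, dtype=int)
for t in range(1, T):
    survivors = rng.binomial(x[t - 1], alpha)   # binomial thinning step
    x[t] = survivors + rng.poisson(lam)         # add integer innovations
print(x.mean())   # should be near the stationary mean lam / (1 - alpha) = 5
```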
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. From its beginning in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques with "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from an applied point of view. This volume is part of a wider project including another edited volume (Spatial Economics Volume I: Theory) collecting original papers which address Spatial Economics from a theoretical perspective.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. From its beginning in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques with "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from a theoretical point of view. This volume is part of a wider project including another edited volume (Spatial Economics Volume II: Applications) collecting original papers which address Spatial Economics from an applied perspective.
The book offers comprehensive coverage of the application of econometric methods to the empirical analysis of economic issues. It uncovers the missing link between textbooks on economic theory and econometrics, and highlights the powerful connection between economic theory and empirical analysis through examples of rigorous experimental design. The use of data sets for estimation generated by the Monte Carlo method helps facilitate the understanding of the role of hypothesis testing applied to economic models. Topics covered in the book are: consumer behavior, producer behavior, market equilibrium, macroeconomic models, qualitative-response models, panel data analysis and time-series analysis. Key econometric models are introduced, specified, estimated and evaluated. The treatment of methods of estimation in econometrics and the discipline of hypothesis testing makes it a must-have for graduate students of economics and econometrics, and aids their understanding of how to estimate economic models and evaluate the results in terms of policy implications.
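As a sketch of the kind of Monte Carlo exercise described here (our own construction, not the book's code), the snippet below simulates data from a linear model whose true slope is zero and checks that a nominal 5% t-test rejects roughly 5% of the time.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps, beta1 = 100, 2000, 0.0          # true slope is zero under H0
rejections = 0
for _ in range(reps):
    x = rng.normal(size=n)
    y = 1.0 + beta1 * x + rng.normal(size=n)     # DGP: y = 1 + 0*x + e
    X = np.column_stack([np.ones(n), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS estimates
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)                 # residual variance
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    rejections += abs(b[1] / se) > 1.96          # two-sided 5% t-test
print(f"empirical size: {rejections / reps:.3f}")  # should be near 0.05
```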
From the 1980s onward, income inequality increased in many advanced countries. It is very difficult to account for the rise in income inequality using the standard labour supply/demand explanation. Fiscal redistribution has become less effective in compensating for increasing inequalities since the 1990s. Some of the basic features of redistribution can be explained through the optimal tax framework developed by J. A. Mirrlees in 1971. This Element surveys some of the earlier results in linear and nonlinear taxation and produces some new numerical results. Given the key role of capital income in overall income inequality, it also considers the optimal taxation of capital income. It examines empirically the relationship between the extent of redistribution and the components of the Mirrlees framework. The redistributive role of factors such as publicly provided private goods, public employment, endogenous wages in the overlapping generations model and income uncertainty is also analysed.
The "Theory of Macrojustice", introduced by S.-C. Kolm, is a stimulating contribution to the debate on the macroeconomic income distribution. The solution called "Equal Labour Income Equalisation" (ELIE) is the result of a three stages construction: collective agreement on the scheme of labour income redistribution, collective agreement on the degree of equalisation to be chosen in that framework, individual freedom to exploit his--her personal productive capicities (the source of labour income and the sole basis for taxation). This book is organised as a discussion around four complementary themes: philosophical aspects of macrojustice, economic analysis of macrojustice, combination of ELIE with other targeted tranfers, econometric evaluations of ELIE.
Bayesian econometric methods have enjoyed an increase in popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. This handbook is a single source for researchers and policymakers wanting to learn about Bayesian methods in specialized fields, and for graduate students seeking to make the final step from textbook learning to the research frontier. It contains contributions by leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing. It reviews the state of the art in Bayesian econometric methodology, with chapters on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools used by Bayesian time series econometricians such as state space models and particle filtering. It also includes chapters on Bayesian principles and methodology.
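Posterior simulation of the sort surveyed in this handbook can be sketched in a few lines. Below is a textbook random-walk Metropolis sampler for the mean of normal data under a flat prior; the tuning choices (proposal scale, burn-in length) and all names are our own illustrative assumptions, not the handbook's code.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(loc=1.5, scale=1.0, size=50)   # toy data set

def log_post(mu):
    # log posterior: normal likelihood with known unit variance, flat prior on mu
    return -0.5 * np.sum((data - mu) ** 2)

draws, mu = [], 0.0
for _ in range(5000):
    prop = mu + rng.normal(scale=0.5)            # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                                # accept; otherwise keep mu
    draws.append(mu)
post = np.array(draws[1000:])                    # discard burn-in
print(post.mean(), post.std())                   # posterior mean near 1.5
```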
To fully function in today's global real estate industry, students and professionals increasingly need to understand how to implement essential and cutting-edge quantitative techniques. This book presents an easy-to-read guide to applying quantitative analysis in real estate aimed at non-cognate undergraduate and masters students, and meets the requirements of modern professional practice. Through case studies and examples illustrating applications using data sourced from dedicated real estate information providers and major firms in the industry, the book provides an introduction to the foundations underlying statistical data analysis, common data manipulations and understanding descriptive statistics, before gradually building up to more advanced quantitative analysis, modelling and forecasting of real estate markets. Our examples and case studies within the chapters have been specifically compiled for this book and explicitly designed to help the reader acquire a better understanding of the quantitative methods addressed in each chapter. Our objective is to equip readers with the skills needed to confidently carry out their own quantitative analysis and be able to interpret empirical results from academic work and practitioner studies in the field of real estate and in other asset classes. Both undergraduate and masters level students, as well as real estate analysts in the professions, will find this book to be essential reading.
Born of a belief that economic insights should not require much mathematical sophistication, this book proposes novel and parsimonious methods to incorporate ignorance and uncertainty into economic modeling, without complex mathematics. Economics has made great strides over the past several decades in modeling agents' decisions when they are incompletely informed, but many economists believe that there are aspects of these models that are less than satisfactory. Among the concerns are that ignorance is not captured well in most models, that agents' presumed cognitive ability is implausible, and that derived optimal behavior is sometimes driven by the fine details of the model rather than the underlying economics. Compte and Postlewaite lay out a tractable way to address these concerns, and to incorporate plausible limitations on agents' sophistication. A central aspect of the proposed methodology is to restrict the strategies assumed available to agents.
Over the past two decades, experimental economics has moved from a fringe activity to become a standard tool for empirical research. With experimental economics now regarded as part of the basic tool-kit for applied economics, this book demonstrates how controlled experiments can be useful in providing evidence relevant to economic research. Professors Jacquemet and L'Haridon take the standard model in applied econometrics as a basis for the methodology of controlled experiments. Methodological discussions are illustrated with standard experimental results. This book provides future experimental practitioners with the means to construct experiments that fit their research question, and newcomers with an understanding of the strengths and weaknesses of controlled experiments. Graduate students and academic researchers working in the field of experimental economics will be able to learn how to undertake, understand and criticise empirical research based on lab experiments, and to refer to specific experiments, results or designs completed with case study applications.
This is a thorough exploration of the models and methods of financial econometrics by one of the world's leading financial econometricians and is for students in economics, finance, statistics, mathematics, and engineering who are interested in financial applications. Based on courses taught around the world, the up-to-date content covers developments in econometrics and finance over the last twenty years while ensuring a solid grounding in the fundamental principles of the field. Care has been taken to link theory and application to provide real-world context for students. Worked exercises and empirical examples have also been included to make sure complicated concepts are solidly explained and understood.
This second edition retains the positive features of being clearly written, well organized, and incorporating calculus in the text, while adding expanded coverage of game theory, experimental economics, and behavioural economics. It remains more focused and manageable than similar textbooks, and provides a concise yet comprehensive treatment of the core topics of microeconomics, including theories of the consumer and of the firm, market structure, partial and general equilibrium, and market failures caused by public goods, externalities and asymmetric information. The book includes helpful solved problems in all the substantive chapters, as well as over seventy new mathematical exercises and enhanced versions of the ones in the first edition. The authors make full use of the book's colour format, with sharp and helpful graphs and illustrations. This mathematically rigorous textbook is meant for students at the intermediate level who have already had an introductory course in microeconomics and a calculus course.
Game theory has revolutionised our understanding of industrial organisation and the traditional theory of the firm. Despite these advances, industrial economists have tended to rely on a restricted set of tools from game theory, focusing on static and repeated games to analyse firm structure and behaviour. Luca Lambertini, a leading expert on the application of differential game theory to economics, argues that many dynamic phenomena in industrial organisation (such as monopoly, oligopoly, advertising, R&D races) can be better understood and analysed through the use of differential games. After illustrating the basic elements of the theory, Lambertini guides the reader through the main models, spanning from optimal control problems describing the behaviour of a monopolist through to oligopoly games in which firms' strategies include prices, quantities and investments. This approach will be of great value to students and researchers in economics and those interested in advanced applications of game theory.
In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, labour markets, and economic growth.
Structural vector autoregressive (VAR) models are important tools for empirical work in macroeconomics, finance, and related fields. This book not only reviews the many alternative structural VAR approaches discussed in the literature, but also highlights their pros and cons in practice. It provides guidance to empirical researchers on the most appropriate modeling choices and on methods of estimating and evaluating structural VAR models. The book traces the evolution of the structural VAR methodology and contrasts it with other common methodologies, including dynamic stochastic general equilibrium (DSGE) models. It is intended as a bridge between the often quite technical econometric literature on structural VAR modeling and the needs of empirical researchers. The focus is not on providing the most rigorous theoretical arguments, but on enhancing the reader's understanding of the methods in question and their assumptions. Empirical examples are provided for illustration.
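For readers wanting a concrete anchor, here is a minimal sketch (ours, not the book's) of the identification scheme the structural VAR literature most often starts from: estimate a reduced-form VAR(1) by OLS, then recover structural shocks via a recursive (Cholesky) factorization of the residual covariance matrix.

```python
import numpy as np

# Simulate a bivariate VAR(1): y_t = A y_{t-1} + u_t (arbitrary coefficients)
rng = np.random.default_rng(3)
T, A = 400, np.array([[0.5, 0.1], [0.2, 0.4]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = y[t - 1] @ A.T + rng.normal(size=2)

# Reduced-form OLS: regress y_t on a constant and y_{t-1}
X = np.column_stack([np.ones(T - 1), y[:-1]])
B = np.linalg.lstsq(X, y[1:], rcond=None)[0]
U = y[1:] - X @ B                                # reduced-form residuals
Sigma = U.T @ U / (T - 1 - X.shape[1])           # residual covariance

# Recursive identification: u_t = P eps_t with P lower triangular,
# so the structural shocks are eps_t = P^{-1} u_t
P = np.linalg.cholesky(Sigma)
eps = U @ np.linalg.inv(P).T                     # identified structural shocks
```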
This is the first of two volumes containing papers and commentaries presented at the Eleventh World Congress of the Econometric Society, held in Montreal, Canada in August 2015. These papers provide state-of-the-art guides to the most important recent research in economics. The book includes surveys and interpretations of key developments in economics and econometrics, and discussion of future directions for a wide variety of topics, covering both theory and application. These volumes provide a unique, accessible survey of progress in the discipline, written by leading specialists in their fields. The first volume includes theoretical and applied papers addressing topics such as dynamic mechanism design, agency problems, and networks.
You may like...
Handbook of Econometrics, Volume 6B by James J. Heckman, Edward Leamer (Hardcover), R3,274
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover), R4,258
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover), R2,970
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover), R3,286
The Oxford Handbook of the Economics of… by Yann Bramoulle, Andrea Galeotti, … (Hardcover), R5,455
Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover), R3,567