It is impossible to understand modern economics without knowledge of the basic tools of game theory and mechanism design. This book provides a graduate-level introduction to the economic modeling of strategic behavior. The goal is to teach economics doctoral students the tools of game theory and mechanism design that all economists should know.
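The best-response logic at the core of game theory can be sketched in a few lines. The following Python example is illustrative only (the payoff matrices are a textbook Prisoner's Dilemma, not taken from the book): it finds the pure-strategy Nash equilibria of a two-player game by checking that neither player can gain from a unilateral deviation.

```python
# Find pure-strategy Nash equilibria of a two-player bimatrix game by
# checking, at each strategy profile, that neither player can gain by
# deviating unilaterally.

def pure_nash(payoff_a, payoff_b):
    """Return all (row, col) profiles that are pure-strategy Nash equilibria."""
    n_rows, n_cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i in range(n_rows):
        for j in range(n_cols):
            best_row = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(n_rows))
            best_col = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(n_cols))
            if best_row and best_col:
                equilibria.append((i, j))
    return equilibria

# Prisoner's Dilemma: strategy 0 = cooperate, 1 = defect.
A = [[-1, -3], [0, -2]]   # row player's payoffs
B = [[-1, 0], [-3, -2]]   # column player's payoffs
print(pure_nash(A, B))    # → [(1, 1)], i.e. mutual defection
```

The same enumeration extends directly to larger finite games; mixed-strategy equilibria require linear-programming or support-enumeration methods instead.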
Including contributions spanning a variety of theoretical and applied topics in econometrics, this volume of Advances in Econometrics is published in honour of Cheng Hsiao. In the first few chapters of this book, new theoretical panel and time series results are presented, exploring JIVE estimators, HAC, HAR and various sandwich estimators, as well as asymptotic distributions for using information criteria to distinguish between the unit root model and explosive models. Other chapters address topics such as structural breaks or growth empirics; auction models; and semiparametric methods testing for common vs. individual trends. Three chapters provide novel empirical approaches to applied problems, such as estimating the impact of survey mode on responses, or investigating how cross-sectional and spatial dependence of mortgages varies by default rates and geography. In the final chapters, Cheng Hsiao offers a forward-focused discussion of the role of big data in economics. For any researcher of econometrics, this is an unmissable volume of the most current and engaging research in the field.
The Economics and Econometrics of the Energy-Growth Nexus recognizes that research in the energy-growth nexus field is heterogeneous and controversial. To make studies in the field as comparable as possible, chapters cover aggregate energy and disaggregate energy consumption and single country and multiple country analysis. As a foundational resource that helps researchers answer fundamental questions about their energy-growth projects, it combines theory and practice to classify and summarize the literature and explain the econometrics of the energy-growth nexus. The book provides order and guidance, enabling researchers to feel confident that they are adhering to widely accepted assumptions and procedures.
Structural vector autoregressive (VAR) models are important tools for empirical work in macroeconomics, finance, and related fields. This book not only reviews the many alternative structural VAR approaches discussed in the literature, but also highlights their pros and cons in practice. It provides guidance to empirical researchers as to the most appropriate modeling choices and methods of estimating and evaluating structural VAR models. The book traces the evolution of the structural VAR methodology and contrasts it with other common methodologies, including dynamic stochastic general equilibrium (DSGE) models. It is intended as a bridge between the often quite technical econometric literature on structural VAR modeling and the needs of empirical researchers. The focus is not on providing the most rigorous theoretical arguments, but on enhancing the reader's understanding of the methods in question and their assumptions. Empirical examples are provided for illustration.
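The basic structural VAR workflow the blurb alludes to — estimate a reduced-form VAR by least squares, then identify structural shocks — can be sketched in NumPy. This is a minimal illustration with made-up parameter values (not from the book), using the simplest identification scheme, a recursive (Cholesky) ordering:

```python
import numpy as np

# Simulate a bivariate VAR(1), estimate it by equation-by-equation OLS, and
# recover a structural impact matrix via a recursive (Cholesky) identification.
# All parameter values are illustrative.

rng = np.random.default_rng(0)
T = 500
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.4]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.standard_normal(2)

# OLS of y_t on y_{t-1} (no intercept, for simplicity)
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T   # reduced-form coefficients
U = Y - X @ A_hat.T                              # reduced-form residuals
Sigma = U.T @ U / (T - 1)                        # residual covariance

# Cholesky identification: B0 maps orthogonal structural shocks into u_t,
# imposing that variable 2's shock has no contemporaneous effect on variable 1.
B0 = np.linalg.cholesky(Sigma)
print(np.round(A_hat, 2))
```

Impulse responses then follow by iterating `A_hat` on the columns of `B0`; the book's point is precisely that the Cholesky ordering is only one of many identification choices, each with different economic content.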
For all its elaborate theories and models, economics always reduces to comparisons. Should we build A rather than B? Will I be better off if I eat D rather than C? How much will it cost me to produce F instead of E? At root, the ultimate goal of economics is simple: assessing the alternatives and finding the best possible outcome. This basic mathematical concept underlies all introductions to the field of economics, yet as advanced students progress through the discipline, they often lose track of this foundational idea when presented with real-world complications and uncertainty. In Competitive Agents in Certain and Uncertain Markets, Robert G. Chambers develops an integrated analytic framework for treating consumer, producer, and market equilibrium analyses as special cases of a generic optimization problem. He builds on lessons learned by all beginning students of economics to show how basic concepts can still be applied even in complex and highly uncertain conditions. Drawing from optimization theory, Chambers demonstrates how the same unified mathematical framework applies to both stochastic and non-stochastic decision settings. The book borrows from both convex and variational analysis and gives special emphasis to differentiability, conjugacy theory, and Fenchel's Duality Theorem. Throughout, Chambers includes practical examples, problems, and exercises to make abstract material accessible. Bringing together essential theoretical tools for understanding decision-making under uncertainty, Competitive Agents in Certain and Uncertain Markets provides a unified framework for analyzing a broad range of microeconomic decisions. This book will be an invaluable resource for advanced graduate students and scholars of microeconomic theory.
Probability, Statistics and Econometrics provides a concise, yet rigorous, treatment of the field that is suitable for graduate students studying econometrics, very advanced undergraduate students, and researchers seeking to extend their knowledge of the trinity of fields that use quantitative data in economic decision-making. The book covers much of the groundwork for probability and inference before proceeding to core topics in econometrics. Authored by one of the leading econometricians in the field, it is a unique and valuable addition to the current repertoire of econometrics textbooks and reference books.
The STEM fields -- science, technology, engineering and mathematics -- are the source of tangible innovations in products and processes that help to spur economic growth. Though many of these advances may occur in established organizations, radical innovation has long been associated with entrepreneurial ventures. Several previous studies have shown that high-growth, high-tech STEM-based businesses in the United States are disproportionately founded by foreign-born scientists and engineers. However, recent data also suggest that immigrants' rate of participation in U.S. entrepreneurship is slowing. Policies that support nascent immigrant STEM entrepreneurs may also help to improve U.S. employment rates, economic productivity, and career satisfaction among new Americans and legal permanent residents. This book investigates several explanations for differences in STEM entrepreneurship between college-educated native-born and foreign-born workers. It also explores reasons for differences in entrepreneurial participation among foreign-born workers.
This broadly based graduate-level textbook covers the major models and statistical tools currently used in the practice of econometrics. It examines the classical, the decision theory, and the Bayesian approaches, and contains material on single equation and simultaneous equation econometric models. It includes an extensive reference list for each topic.
This book presents a quarter of a century of empirical research on interest rates and a variety of asset prices. It will serve to deepen our understanding of asset price inflation. The book includes extensive analysis of the measurement of interest rates, with case studies from The Netherlands, Belgium and EMU, and emphasizes statistical measurement and the attempt to understand interest rate behaviour through statistical estimation. The book also includes an examination of historical interest rate development in the long run, both theoretically and empirically. In conclusion, Professor Fase also analyses the behaviour of bonds, stocks and investment in art and examines the factors indispensable for a monetary strategy designed to target inflation.
This book is concerned with recent developments in time series and panel data techniques for the analysis of macroeconomic and financial data. It provides a rigorous, nevertheless user-friendly, account of the time series techniques dealing with univariate and multivariate time series models, as well as panel data models. It is distinct from other time series texts in the sense that it also covers panel data models and attempts a more coherent integration of time series, multivariate analysis, and panel data models. It builds on the author's extensive research in the areas of time series and panel data analysis and covers a wide variety of topics in one volume. Different parts of the book can be used as teaching material for a variety of courses in econometrics. It can also be used as a reference manual. It begins with an overview of basic econometric and statistical techniques, and provides an account of stochastic processes, univariate and multivariate time series, tests for unit roots, cointegration, impulse response analysis, autoregressive conditional heteroskedasticity models, simultaneous equation models, vector autoregressions, causality, forecasting, multivariate volatility models, panel data models, aggregation and global vector autoregressive (GVAR) models. The techniques are illustrated using Microfit 5 (Pesaran and Pesaran, 2009, OUP) with applications to real output, inflation, interest rates, exchange rates, and stock prices.
This volume uses state of the art models from the frontier of macroeconomics to answer key questions about how the economy functions and how policy should be conducted. The contributions cover a wide range of issues in macroeconomics and macroeconomic policy. They combine high level mathematics with economic analysis, and highlight the need to update our mathematical toolbox in order to understand the increased complexity of the macroeconomic environment. The volume represents hard evidence of high research intensity in many fields of macroeconomics, and warns against interpreting the scope of macroeconomics too narrowly. The mainstream business cycle analysis, based on dynamic stochastic general equilibrium (DSGE) modelling of a particular type, has been criticised for its inability to predict or resolve the recent financial crisis. However, macroeconomic research on financial, information, and learning imperfections had not yet made its way into many of the pre-crisis DSGE models because practical econometric versions of those models were mainly designed to fit data periods that did not include financial crises. A major response to the limitations of those older DSGE models is an active research program to bring big financial shocks and various kinds of financial, learning, and labour market frictions into a new generation of DSGE models for guiding policy. The contributors to this book utilise models and modelling assumptions that go beyond particular modelling conventions. By using alternative yet plausible assumptions, they seek to enrich our knowledge and ability to explain macroeconomic phenomena. They contribute to expanding the frontier of macroeconomic knowledge in ways that will prove useful for macroeconomic policy.
Interest in nonparametric methodology has grown considerably over the past few decades, stemming in part from vast improvements in computer hardware and the availability of new software that allows practitioners to take full advantage of these numerically intensive methods. This book is written for advanced undergraduate students, intermediate graduate students, and faculty, and provides a complete teaching and learning course at a more accessible level of theoretical rigor than Racine's earlier book co-authored with Qi Li, Nonparametric Econometrics: Theory and Practice (2007). The open source R platform for statistical computing and graphics is used throughout in conjunction with the R package np. Recent developments in reproducible research are emphasized throughout, with appendices devoted to helping the reader get up to speed with R, R Markdown, TeX and Git.
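A flavour of the nonparametric methods the book covers with R's np package: the Nadaraya-Watson kernel regression estimator, a locally weighted mean of the response. The sketch below is in Python rather than R, with a hand-picked bandwidth and simulated data (in practice, and in the book's treatment, the bandwidth would be chosen by cross-validation):

```python
import numpy as np

# Minimal Nadaraya-Watson kernel regression with a Gaussian kernel.
# Data and bandwidth are illustrative only.

def nw_regress(x_train, y_train, x_eval, bandwidth):
    """Locally weighted mean of y_train at each evaluation point."""
    # w[i, j] is proportional to K((x_eval[i] - x_train[j]) / h)
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
x = rng.uniform(0, 2 * np.pi, 400)
y = np.sin(x) + 0.2 * rng.standard_normal(400)

grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)   # interior points, away from edges
fit = nw_regress(x, y, grid, bandwidth=0.3)
print(float(np.max(np.abs(fit - np.sin(grid)))))  # worst-case error on the grid
```

The estimator recovers the unknown sine curve without any parametric functional form; boundary bias and bandwidth selection are exactly the practical issues such a book spends its time on.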
An excellent starting point for graduate-level econometrics, this comprehensive, well-organized and well-written introductory text includes all of the major topic areas of the subject, clearly explained through concepts rather than relying on complex algebra, and carefully pitched at the right level for students who may not already have a strong background in the subject. The text also includes discussion of bootstrap inference in order to aid students in understanding inference based on exact and asymptotic distributions.
Computational Economics: A Concise Introduction is a comprehensive textbook designed to help students move from the traditional, comparative-static analysis of economic models to a modern, dynamic computational study. The ability to take an economic problem, formulate it as a mathematical model, and solve it computationally is becoming a crucial and distinctive competence for most economists. This vital textbook is organized around static and dynamic models, covering both macro- and microeconomic topics, and explores the numerical techniques required to solve those models. A key aim of the book is to enable students to develop the ability to modify the models themselves so that, using the MATLAB/Octave code provided in the book and on the website, students can demonstrate a complete understanding of computational methods. This textbook is innovative, easy to read and highly focused, providing students of economics with the skills needed to understand the essentials of using numerical methods to solve economic problems. It also provides more technical readers with an easy way to cope with economics through modelling and simulation. Later in the book, more elaborate economic models and advanced numerical methods are introduced which will prove valuable to those in more advanced study. This book is ideal for all students of economics, mathematics, computer science and engineering taking classes on Computational or Numerical Economics.
A Practitioner's Guide to Stochastic Frontier Analysis Using Stata provides practitioners in academia and industry with a step-by-step guide on how to conduct efficiency analysis using the stochastic frontier approach. The authors explain in detail how to estimate production, cost, and profit efficiency and introduce the basic theory of each model in an accessible way, using empirical examples that demonstrate the interpretation and application of models. This book also provides computer code, allowing users to apply the models in their own work, and incorporates the most recent stochastic frontier models developed in academic literature. Such recent developments include models of heteroscedasticity and exogenous determinants of inefficiency, scaling models, panel models with time-varying inefficiency, growth models, and panel models that separate firm effects and persistent and transient inefficiency. Immensely helpful to applied researchers, this book bridges the chasm between theory and practice, expanding the range of applications in which production frontier analysis may be implemented.
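The stochastic frontier idea behind the book — output falls short of the frontier by a one-sided inefficiency term — can be sketched outside Stata as well. The following Python example (simulated data, illustrative parameter values; the half-normal specification follows Aigner, Lovell and Schmidt's classic model, one of the basic models such a guide covers) estimates a production frontier by maximum likelihood:

```python
import numpy as np
from scipy import optimize, stats

# Simulate y = b0 + b1*x + v - u, with v ~ N(0, sv^2) noise and
# u ~ half-normal(su) one-sided inefficiency, then estimate by MLE.
rng = np.random.default_rng(3)
n = 2000
x = rng.uniform(0, 1, n)
v = 0.2 * rng.standard_normal(n)            # symmetric noise
u = np.abs(0.3 * rng.standard_normal(n))    # one-sided inefficiency
y = 1.0 + 0.8 * x + v - u                   # production frontier minus inefficiency

def negloglik(theta):
    b0, b1, log_sv, log_su = theta
    sv, su = np.exp(log_sv), np.exp(log_su)  # log-parameterised for positivity
    sigma = np.hypot(sv, su)                 # sqrt(sv^2 + su^2)
    lam = su / sv
    eps = y - b0 - b1 * x
    ll = (np.log(2.0) - np.log(sigma)
          + stats.norm.logpdf(eps / sigma)
          + stats.norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = optimize.minimize(negloglik, x0=[0.5, 0.5, np.log(0.1), np.log(0.1)],
                        method="Nelder-Mead", options={"maxiter": 5000})
b0_hat, b1_hat = res.x[:2]
print(round(b0_hat, 2), round(b1_hat, 2))
```

Unlike OLS (whose intercept absorbs the mean inefficiency), the MLE recovers the frontier itself; firm-level efficiency scores then follow from the conditional distribution of u given the composed residual.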
Economic Elites, Crises, and Democracy analyzes critical topics of contemporary capitalism. Andres Solimano, President of the International Center for Globalization and Development, focuses on economic elites and the super rich, the nature of entrepreneurship, the rise of the corporate technostructure, the internal fragmentation of the middle class, and the marginalization of the working poor. While examining historical episodes of economic and financial crises from the 19th century to the present, he reviews a variety of related economic theories and policies, including austerity, which have been enacted in attempts to overcome these crises. Solimano also examines patterns of international mobility of capital and knowledge elites along with the rise of global social movements and migration diasporas. The book ends with an analysis of the concept, modalities, and potential areas of the application of economic democracy to reform 21st century global capitalism.
"Structural Macroeconometrics" provides a thorough overview and in-depth exploration of methodologies, models, and techniques used to analyze forces shaping national economies. In this thoroughly revised second edition, David DeJong and Chetan Dave emphasize time series econometrics and unite theoretical and empirical research, while taking into account important new advances in the field. The authors detail strategies for solving dynamic structural models and present the full range of methods for characterizing and evaluating empirical implications, including calibration exercises, method-of-moment procedures, and likelihood-based procedures, both classical and Bayesian. The authors look at recent strides that have been made to enhance numerical efficiency, consider the expanded applicability of dynamic factor models, and examine the use of alternative assumptions involving learning and rational inattention on the part of decision makers. The treatment of methodologies for obtaining nonlinear model representations has been expanded, and linear and nonlinear model representations are integrated throughout the text. The book offers a rich array of implementation algorithms, sample empirical applications, and supporting computer code. "Structural Macroeconometrics" is the ideal textbook for graduate students seeking an introduction to macroeconomics and econometrics, and for advanced students pursuing applied research in macroeconomics. The book's historical perspective, along with its broad presentation of alternative methodologies, makes it an indispensable resource for academics and professionals.
In the memorable words of Ragnar Frisch, econometrics is 'a unification of the theoretical-quantitative and the empirical-quantitative approach to economic problems'. Beginning to take shape in the 1930s and 1940s, econometrics is now recognized as a vital subdiscipline supported by a vast and still rapidly growing body of literature. Following the positive reception of The Rise of Econometrics (2013) (978-0-415-61678-2), Routledge now announces a new collection in its Critical Concepts in Economics series. Edited by the author of the field's leading textbook, Panel Data Econometrics brings together in one 'mini library' the best and most influential scholarship. This four-volume set provides an authoritative, one-stop resource to enable users to understand the econometrics of panel data, from both theoretical and applied viewpoints. With a full index and comprehensive introductions to each volume, newly written by the editor, the collection also provides a synoptic view of many current key debates and issues.
Mathematical Statistics for Applied Econometrics covers the basics of statistical inference in support of a subsequent course on classical econometrics. The book shows students how mathematical statistics concepts form the basis of econometric formulations. It also helps them think about statistics as more than a toolbox of techniques. Using computer systems to simplify computation, the text explores the unifying themes involved in quantifying sample information to make inferences. After developing the necessary probability theory, it presents the concepts of estimation, such as convergence, point estimators, confidence intervals, and hypothesis tests. The text then shifts from a general development of mathematical statistics to focus on applications particularly popular in economics. It delves into matrix analysis, linear models, and nonlinear econometric techniques. Avoiding a cookbook approach to econometrics, this textbook develops students' theoretical understanding of statistical tools and econometric applications, providing them with the foundation for further econometric studies.
A common set of mathematical tools underlies dynamic optimization, dynamic estimation, and filtering. In "Recursive Models of Dynamic Linear Economies," Lars Peter Hansen and Thomas Sargent use these tools to create a class of econometrically tractable models of prices and quantities. They present examples from microeconomics, macroeconomics, and asset pricing. The models are cast in terms of a representative consumer. While Hansen and Sargent demonstrate the analytical benefits acquired when an analysis with a representative consumer is possible, they also characterize the restrictiveness of assumptions under which a representative household justifies a purely aggregative analysis. Based on the 2012 Gorman lectures, the authors unite economic theory with a workable econometrics while going beyond and beneath demand and supply curves for dynamic economies. They construct and apply competitive equilibria for a class of linear-quadratic-Gaussian dynamic economies with complete markets. Their book stresses heterogeneity, aggregation, and how a common structure unites what superficially appear to be diverse applications. An appendix describes MATLAB® programs that apply to the book's calculations.
The worlds of Wall Street and The City have always held a certain allure, but in recent years they have left an indelible mark on the wider public consciousness, and the need for financial literacy has grown accordingly. The quantitative nature of complex financial transactions makes them a fascinating subject area for mathematicians of all types, whether for general interest or because of the enormous monetary rewards on offer. An Introduction to Quantitative Finance concerns financial derivatives - a derivative being a contract between two entities whose value derives from the price of an underlying financial asset - and the probabilistic tools that were developed to analyse them. The theory in the text is motivated by a desire to provide a suitably rigorous yet accessible foundation to tackle problems the author encountered whilst trading derivatives on Wall Street. The book combines an unusual blend of real-world derivatives trading experience and rigorous academic background. Probability provides the key tools for analysing and valuing derivatives. The price of a derivative is closely linked to the expected value of its pay-out, and suitably scaled derivative prices are martingales, fundamentally important objects in probability theory. The prerequisite for mastering the material is an introductory undergraduate course in probability. The book is otherwise self-contained and in particular requires no additional preparation or exposure to finance. It is suitable for a one-semester course, quickly exposing readers to powerful theory and substantive problems. The book may also appeal to students who have enjoyed probability and have a desire to see how it can be applied. Signposts are given throughout the text to more advanced topics and to different approaches for those looking to take the subject further.
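The blurb's central claim — that a derivative's price is a suitably discounted expected pay-out — can be seen in miniature with Monte Carlo. The sketch below (parameter values are illustrative, not from the book) prices a European call under geometric Brownian motion by simulating the risk-neutral terminal price and averaging the discounted payoff, then checks the answer against the Black-Scholes closed form:

```python
import math
import numpy as np

def bs_call(s0, k, r, sigma, t):
    """Black-Scholes price of a European call (closed form)."""
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return s0 * Phi(d1) - k * math.exp(-r * t) * Phi(d2)

# Illustrative contract: at-the-money call, 1 year, 5% rate, 20% volatility.
s0, k, r, sigma, t = 100.0, 100.0, 0.05, 0.2, 1.0

# Monte Carlo under the risk-neutral measure: simulate S_T, average the
# discounted payoff max(S_T - K, 0).
rng = np.random.default_rng(42)
z = rng.standard_normal(200_000)
s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
mc_price = math.exp(-r * t) * np.maximum(s_t - k, 0.0).mean()
print(round(mc_price, 2), round(bs_call(s0, k, r, sigma, t), 2))
```

The two numbers agree up to Monte Carlo error, which shrinks like one over the square root of the number of simulated paths; the martingale property of discounted prices is exactly what licenses taking this expectation under the risk-neutral measure.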
Introduction to Econometrics has been written as a core textbook for a first course in econometrics taken by undergraduate or graduate students. It is intended for students taking a single course in econometrics with a view towards doing practical data work. It will also be highly useful for students interested in understanding the basics of econometric theory with a view towards future study of advanced econometrics. To achieve this end, it has a practical emphasis, showing how a wide variety of models can be used with the types of data sets commonly used by economists. However, it also has enough discussion of the underlying econometric theory to give the student a knowledge of the statistical tools used in advanced econometrics courses. Key features: an extensive collection of web-based supplementary materials is provided for this title, including data sets, problem sheets with worked-through answers, empirical projects, sample exercises with answers, and slides for lecturers.
This book addresses two interrelated problems in economic modelling: non-nested hypothesis testing in econometrics, and regression models with stochastic/random regressors. The primary motivation for this book stems from the nature of econometric models. As an abstraction from reality, each statistical model consists of mathematical relationships and stochastic, behavioural assumptions. In practice, the validity of these assumptions and the adequacy of the mathematical specifications are ascertained through a series of diagnostic and specification tests. Conventional test procedures, however, fail to recognise that economic theory generally provides more than one distinct model to explain any given economic phenomenon.
At the intersection between statistical physics and rigorous econometric analysis, this powerful new framework sheds light on how innovation and competition shape the growth and decline of companies and industries. Analyzing various sources of data, including a unique micro-level database which collects historic data on the sales of more than 3,000 firms and 50,000 products in 20 countries, the authors introduce and test a model of innovation and proportional growth which relies on minimal assumptions and accounts for the empirically observed regularities. Through a combination of extensive stochastic simulations and statistical tests, the authors investigate to what extent their simple assumptions are falsified by empirically observable facts. Physicists looking for application of their mathematical and modelling skills to relevant economic problems, as well as economists interested in the explorative analysis of extensive data sets and in a physics-orientated way of thinking, will find this book a key reference.