Understanding why so many people across the world are so poor is one of the central intellectual challenges of our time. This book provides the tools and data that will enable students, researchers and professionals to address that issue. Empirical Development Economics has been designed as a hands-on teaching tool to investigate the causes of poverty. The book begins by introducing the quantitative approach to development economics. Each section uses data to illustrate key policy issues. Part One focuses on the basics of understanding the role of education, technology and institutions in determining why incomes differ so much across individuals and countries. In Part Two, the focus is on techniques to address a number of topics in development, including how firms invest, how households decide how much to spend on their children's education, whether microcredit helps the poor, whether food aid works, who gets private schooling and whether property rights enhance investment. A distinctive feature of the book is its presentation of a range of approaches to studying development questions. Development economics has undergone a major change in focus over the last decade with the rise of experimental methods to address development issues; this book shows how these methods relate to more traditional ones. Please visit the book's website at www.empiricalde.com for online supplements including Stata files and solutions to the exercises.
This Open Access Brief presents the KAPSARC Global Energy Macroeconometric Model (KGEMM). KGEMM is a policy analysis tool for examining the impacts of domestic policy measures and global economic and energy shocks on the Kingdom of Saudi Arabia. The model has eight blocks (real sector, fiscal, monetary, external sector, price, labor and wages, energy, and population and age cohorts) that interact with each other to represent the Kingdom's macroeconomy and energy linkages. It captures New Keynesian demand-side features anchored to medium-run equilibrium and long-run aggregate supply. It applies a cointegration and equilibrium correction modeling (ECM) methodology to time series data to estimate the model's behavioral equations in the framework of Autometrics, a general-to-specific econometric modeling strategy. Hence, the model combines a 'theory-driven' approach with a 'data-driven' one. The Brief begins with an introduction to the theoretical framework of the model and the KGEMM methodology and then walks the reader through the structure of the model and its behavioral equations. The book closes with simulations showing the application of the model. Providing a detailed introduction to a cutting-edge, robust predictive model, this Brief will be of great use to researchers and policymakers interested in macroeconomics, energy economics, econometrics, and more specifically, the economy of Saudi Arabia.
This book is a volume in the Penn Press Anniversary Collection. To mark its 125th anniversary in 2015, the University of Pennsylvania Press rereleased more than 1,100 titles from Penn Press's distinguished backlist from 1899-1999 that had fallen out of print. Spanning an entire century, the Anniversary Collection offers peer-reviewed scholarship in a wide range of subject areas.
This Fourth Edition updates the "Solutions Manual for Econometrics" to match the Sixth Edition of the Econometrics textbook. It adds problems and solutions using the latest versions of Stata and EViews. Special features include empirical examples replicated using EViews, Stata, and SAS. The book offers rigorous proofs and treatment of difficult econometrics concepts in a simple and clear way, and provides the reader with both applied and theoretical econometrics problems along with their solutions. These should prove useful to students and instructors using this book.
The book, Sustainability and Resources: Theoretical Issues in Dynamic Economics, presents a collection of mathematical models dealing with sustainability and resource management. The focus in Part A is on harvesting renewable resources, while Part B explores the optimal extraction of exhaustible resources. Part C introduces models dealing with uncertainty. Some are descriptive models; others have deep roots in intertemporal welfare economics. The tools of dynamic optimization developed in the 1960s are used in a formal, rigorous presentation to address wide-ranging issues that have appeared in academic research as well as policy debates on the world stage. The book also provides a self-contained treatment that is accessible to advanced undergraduate and graduate students who are interested in dynamic models of resource allocation and social welfare, resource management, and applications of optimization theory and methods of probability theory to economics. For researchers in dynamic economics, it will be an invaluable source for formal treatment of substantive macroeconomic issues raised by policymakers. The part dealing with uncertainty and random dynamical systems (largely developed by the author and his collaborators) exposes the reader to contemporary frontiers of research on stochastic processes with novel applications to economic problems.
Whether you're comping a vocal track, restoring an old recording, working with dialogue or sound effects for film, or imposing your own vision with mash-ups or remixes, audio editing is a key skill to successful sound production. Digital Audio Editing gives you the techniques, from the simplest corrective editing like cutting, copying, and pasting to more complex creative editing, such as beat mapping and time-stretching. You'll be able to avoid unnatural-sounding pitch correction and understand the potential pitfalls you face when restoring classic tracks. Author Simon Langford invites you to see editing with his wide-angle view, putting this skill into a broad context that will inform your choices even as you more skillfully manipulate sound. Focusing on techniques applicable to any digital audio workstation, it includes break-outs giving specific keystrokes and instruction in Avid's Pro Tools, Apple's Logic Pro, Steinberg's Cubase, and PreSonus's Studio One. The companion website includes tutorials in all four software packages to help you immediately apply the broad skills from the book.
This brief addresses the estimation of quantile regression models from a practical perspective, which will support researchers who need to use conditional quantile regression to measure economic relationships among a set of variables. It will also benefit students using the methodology for the first time, and practitioners at private or public organizations who are interested in modeling different fragments of the conditional distribution of a given variable. The book pursues a practical approach with reference to energy markets, helping readers learn the main features of the technique more quickly. Emphasis is placed on the implementation details and the correct interpretation of the quantile regression coefficients rather than on the technicalities of the method, unlike the approach used in the majority of the literature. All applications are illustrated with R.
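As a taste of the technique that brief covers, here is a minimal sketch of quantile regression fitted by minimizing the pinball (check) loss. The book itself illustrates everything in R; this Python version with simulated data is an illustration only, not taken from the book:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, n)
# Heteroskedastic data: the spread grows with x, so different quantiles
# of y given x have different slopes -- exactly what quantile regression finds.
y = 1.0 + 0.5 * x + rng.normal(0, 0.1 + 0.2 * x, n)
X = np.column_stack([np.ones(n), x])

def pinball_loss(beta, q):
    """Asymmetric 'check' loss; its minimizer is the q-th conditional quantile."""
    resid = y - X @ beta
    return np.mean(np.where(resid >= 0, q * resid, (q - 1) * resid))

# Fit the conditional median (q=0.5) and an upper quantile (q=0.9).
b50 = minimize(pinball_loss, x0=[0.0, 0.0], args=(0.5,), method="Nelder-Mead").x
b90 = minimize(pinball_loss, x0=[0.0, 0.0], args=(0.9,), method="Nelder-Mead").x

# With variance increasing in x, the 0.9-quantile slope exceeds the median slope.
print(b50, b90)
```

The point of the exercise is the interpretation the book stresses: each quantile gets its own coefficient vector, describing a different fragment of the conditional distribution.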
This friendly guide is the companion you need to convert pure mathematics into understanding and facility with a host of probabilistic tools. The book provides a high-level view of probability and its most powerful applications. It begins with the basic rules of probability and quickly progresses to some of the most sophisticated modern techniques in use, including Kalman filters, Monte Carlo techniques, machine learning methods, Bayesian inference and stochastic processes. It draws on thirty years of experience in applying probabilistic methods to problems in computational science and engineering, and numerous practical examples illustrate where these techniques are used in the real world. Topics of discussion range from carbon dating to Wasserstein GANs, one of the most recent developments in Deep Learning. The underlying mathematics is presented in full, but clarity takes priority over complete rigour, making this text a starting reference source for researchers and a readable overview for students.
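The Monte Carlo techniques such a book surveys rest on one idea: estimate a quantity by averaging over random draws, with error shrinking like 1/sqrt(n). A minimal sketch (an invented example, not from the book) estimating pi:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
# Draw uniform points in the unit square; the fraction landing inside the
# quarter circle of radius 1 estimates pi/4.
pts = rng.random((n, 2))
inside = (pts ** 2).sum(axis=1) <= 1.0
pi_hat = 4.0 * inside.mean()

# Binomial standard error: shrinks like 1/sqrt(n), the Monte Carlo hallmark.
se = 4.0 * np.sqrt(inside.mean() * (1 - inside.mean()) / n)
print(pi_hat, se)
```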
Variations in the foreign exchange market influence all aspects of the world economy, and understanding these dynamics is one of the great challenges of international economics. This book provides a new, comprehensive, and in-depth examination of the standard theories and latest research in exchange-rate economics. Covering a vast swath of theoretical and empirical work, the book explores established theories of exchange-rate determination using macroeconomic fundamentals, and presents unique microbased approaches that combine the insights of microstructure models with the macroeconomic forces driving currency trading. Macroeconomic models have long assumed that agents--households, firms, financial institutions, and central banks--all have the same information about the structure of the economy and therefore hold the same expectations and uncertainties regarding foreign currency returns. Microbased models, however, look at how heterogeneous information influences the trading decisions of agents and becomes embedded in exchange rates. Replicating key features of actual currency markets, these microbased models generate a rich array of empirical predictions concerning trading patterns and exchange-rate dynamics that are strongly supported by data. The models also show how changing macroeconomic conditions exert an influence on short-term exchange-rate dynamics via their impact on currency trading. 
Designed for graduate courses in international macroeconomics, international finance, and finance, and as a go-to reference for researchers in international economics, "Exchange-Rate Dynamics" guides readers through a range of literature on exchange-rate determination, offering fresh insights for further reading and research. The book offers a comprehensive and in-depth examination of the latest research in exchange-rate economics; outlines theoretical and empirical research across the spectrum of modeling approaches; presents new results on the importance of currency trading in exchange-rate determination; provides new perspectives on long-standing puzzles in exchange-rate economics; and includes end-of-chapter questions that cement key ideas.
This book unifies and extends the definition and measurement of economic efficiency and its use as a real-life benchmarking technique for actual organizations. Analytically, the book relies on the economic theory of duality as its guiding framework. Empirically, it shows how the alternative models can be implemented by way of Data Envelopment Analysis. Accompanying software, programmed in the open-source Julia language, is used to solve the models. The package is a self-contained set of functions that can be used for individual learning and instruction. The source code, associated documentation, and replication notebooks are available online. The book discusses the concept of economic efficiency at the firm level, comparing observed to optimal economic performance, and its decomposition according to technical and allocative criteria. Depending on the underlying technical efficiency measure, economic efficiency can be decomposed multiplicatively or additively. Part I of the book deals with the classic multiplicative approach that decomposes cost and revenue efficiency based on radial distance functions. Subsequently, the book examines how these partial approaches can be expanded to the notion of profitability efficiency, considering both the input and output dimensions of the firm, and relying on the generalized distance function for the measurement of technical efficiency. Part II is devoted to the recent additive framework related to the decomposition of economic inefficiency defined in terms of cost, revenue, and profit. The book presents economic models for the Russell and enhanced graph Russell measures, the weighted additive distance function, the directional distance function, the modified directional distance function, and the Hölder distance function. Each model is presented in a separate chapter.
New approaches that qualify and generalize previous results are also introduced in the last chapters, including the reverse directional distance function and the general direct approach. The book concludes by highlighting the importance of benchmarking economic efficiency for all business stakeholders and recalling the main conclusions obtained from many years of research on this topic. The book offers different alternatives to measure economic efficiency based on a set of desirable properties and advises on the choice of specific economic efficiency models.
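The efficiency measures described above are computed as linear programs. As an illustration only (the book's accompanying software is in Julia; the data here are invented), a minimal Python sketch of the input-oriented radial (CCR) technical efficiency measure from Data Envelopment Analysis:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 firms, one input, one output.
x = np.array([2.0, 4.0, 6.0, 5.0])   # input quantities
y = np.array([2.0, 4.0, 3.0, 2.0])   # output quantities
n = len(x)

def technical_efficiency(o):
    """Min theta s.t. a nonnegative combination of peers produces at least
    firm o's output using at most theta times firm o's input."""
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam_1..lam_n]
    # Input constraint:  sum_j lam_j * x_j - theta * x_o <= 0
    A1 = np.r_[-x[o], x].reshape(1, -1)
    # Output constraint: -sum_j lam_j * y_j <= -y_o
    A2 = np.r_[0.0, -y].reshape(1, -1)
    res = linprog(c, A_ub=np.vstack([A1, A2]), b_ub=[0.0, -y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

scores = [technical_efficiency(o) for o in range(n)]
print(scores)  # firms on the frontier score 1.0
```

A score of 0.5, say, means the firm could radially contract its input to half its observed level and still produce its output, relative to best practice in the sample.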
In this Element and its accompanying second Element, A Practical Introduction to Regression Discontinuity Designs: Extensions, Matias Cattaneo, Nicolas Idrobo, and Rocío Titiunik provide an accessible and practical guide for the analysis and interpretation of regression discontinuity (RD) designs that encourages the use of a common set of practices and facilitates the accumulation of RD-based empirical evidence. In this Element, the authors discuss the foundations of the canonical Sharp RD design, which has the following features: (i) the score is continuously distributed and has only one dimension, (ii) there is only one cutoff, and (iii) compliance with the treatment assignment is perfect. In the second Element, the authors discuss practical and conceptual extensions to this basic RD setup.
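A Sharp RD estimate of the kind that Element discusses can be sketched as two local linear regressions, one on each side of the cutoff, with the effect read off as the jump in fitted values at the cutoff. The data, bandwidth, and effect size below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
score = rng.uniform(-1, 1, n)
treated = (score >= 0).astype(float)   # sharp design: perfect compliance at cutoff 0
tau = 2.0                              # true treatment effect (chosen for the demo)
y = 1.0 + 1.5 * score + tau * treated + rng.normal(0, 0.5, n)

def intercept_at_cutoff(xs, ys):
    """OLS intercept = fitted outcome at score = 0 (the cutoff)."""
    X = np.column_stack([np.ones(len(xs)), xs])
    beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
    return beta[0]

h = 0.3  # bandwidth: keep only observations near the cutoff
left = (score < 0) & (score > -h)
right = (score >= 0) & (score < h)
tau_hat = intercept_at_cutoff(score[right], y[right]) - \
          intercept_at_cutoff(score[left], y[left])
print(tau_hat)  # close to the true effect of 2.0
```

In practice the bandwidth is chosen by data-driven methods rather than fixed by hand, which is a central topic of the Element.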
In the last 20 years, econometric theory on panel data has developed rapidly, particularly for analyzing common behaviors among individuals over time. Meanwhile, the statistical methods employed by applied researchers have not kept pace. This book attempts to fill this gap by teaching researchers how to use the latest panel estimation methods correctly. Almost all applied economics articles use panel data or panel regressions. However, many typical panel data analyses are not correctly executed. This book aims to help applied researchers run panel regressions correctly and avoid common mistakes. It explains how to model cross-sectional dependence, how to estimate a few key common variables, and how to identify them. It also provides guidance on how to separate the long-run relationship, and the common dynamic and idiosyncratic dynamic relationships, from a set of panel data. Aimed at applied researchers who want to learn panel data econometrics by running statistical software, this book provides clear guidance and is supported by a full range of online teaching and learning materials. It includes practice sections on MATLAB, STATA, and GAUSS throughout, along with short and simple econometric theory on basic panel regressions for those who are unfamiliar with the theory of traditional panel regressions.
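The most basic panel technique such a book builds on is the within (fixed-effects) transformation, which demeans each individual's series to sweep out unobserved heterogeneity before estimation. A minimal sketch on simulated data (the book itself works in MATLAB, STATA, and GAUSS; this Python version is for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 100, 10                          # 100 individuals observed over 10 periods
alpha = rng.normal(0, 2, N)             # unobserved individual effects
# Regressor correlated with the individual effect -> pooled OLS is biased.
x = alpha[:, None] + rng.normal(0, 1, (N, T))
y = 3.0 * x + alpha[:, None] + rng.normal(0, 1, (N, T))

# Within transformation: demean each individual's series, removing alpha_i.
x_dm = x - x.mean(axis=1, keepdims=True)
y_dm = y - y.mean(axis=1, keepdims=True)
beta_fe = (x_dm * y_dm).sum() / (x_dm ** 2).sum()     # fixed-effects estimate

beta_pooled = (x * y).sum() / (x ** 2).sum()          # ignores alpha_i; biased here
print(beta_fe, beta_pooled)
```

The fixed-effects estimate recovers the true slope of 3 while the pooled estimate is pulled upward by the correlation between the regressor and the individual effect, the sort of mistake the book warns against.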
Applied Econometrics: A Practical Guide is an extremely user-friendly and application-focused book on econometrics. Unlike many econometrics textbooks which are heavily theoretical and abstract, this book is perfect for beginners and promises simplicity and practicality in the understanding of econometric models. Written in an easy-to-read manner, the book begins with hypothesis testing and moves on to simple and multiple regression models. It also covers advanced topics: endogeneity and two-stage least squares; simultaneous equations models; panel data models; qualitative and limited dependent variable models; vector autoregressive (VAR) models; autocorrelation and ARCH/GARCH models; and unit roots and cointegration. The book also illustrates the use of computer software (EViews, SAS, and R) for economic estimation and modeling. Its practical applications make the book an instrumental, go-to guide for a solid foundation in the fundamentals of econometrics. In addition, this book includes excerpts from relevant articles published in top-tier academic journals. This integration of published articles helps readers understand how econometric models are applied to real-world use cases.
Stata is one of the most popular statistical software packages in the world and is suited to all kinds of users, from absolute beginners to experienced veterans. This book offers a clear and concise introduction to the usage and workflow of Stata. Topics include importing and managing datasets, cleaning and preparing data, creating and manipulating variables, and producing descriptive statistics and meaningful graphs, as well as central quantitative methods like linear (OLS) and binary logistic regression and matching. Additional information about diagnostic tests ensures that these methods yield valid and correct results that live up to academic standards. Furthermore, users are instructed how to export results so that they can be used directly in popular software like Microsoft Word for seminar papers and publications. The book also offers a short yet focused introduction to scientific writing, which should guide readers through the process of writing a first quantitative seminar paper or research report. Throughout, it underlines correct usage of the software and a productive workflow, introducing aspects like replicability and general standards for academic writing. While absolute beginners will enjoy the easy-to-follow point-and-click interface, more experienced users will benefit from the information about do-files and syntax which makes Stata so popular. Lastly, a wide range of user-contributed software ("Ados") is introduced, which further improves the general workflow and guarantees the availability of state-of-the-art statistical methods.
This book is the first of its kind to systematically analyze and apply Lim Chong Yah's S-Curve Hypothesis to the various facets of economic growth and economic transition. By augmenting the mathematical and economic sophistication of the hypothesis, this book extends the S-Curve hypothesis to provide further insight into economic growth and transition. It also constructs a stochastic growth model to provide the microeconomic foundation for the S-Curve hypothesis. This model resolves the puzzle of why some developing countries experience economic take-off, while others do not. The book analyzes and extends discussion on the S-Curve, and also applies the S-Curve hypothesis to predict long-term growth in Japan and Singapore. It serves as an excellent resource for people interested in Lim's growth theory.
Dynamic stochastic general equilibrium (DSGE) models have become one of the workhorses of modern macroeconomics and are extensively used for academic research as well as forecasting and policy analysis at central banks. This book introduces readers to state-of-the-art computational techniques used in the Bayesian analysis of DSGE models. The book covers Markov chain Monte Carlo techniques for linearized DSGE models, novel sequential Monte Carlo methods that can be used for parameter inference, and the estimation of nonlinear DSGE models based on particle filter approximations of the likelihood function. The theoretical foundations of the algorithms are discussed in depth, and detailed empirical applications and numerical illustrations are provided. The book also gives invaluable advice on how to tailor these algorithms to specific applications and assess the accuracy and reliability of the computations. Bayesian Estimation of DSGE Models is essential reading for graduate students, academic researchers, and practitioners at policy institutions.
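The Markov chain Monte Carlo machinery at the heart of that book can be previewed on a toy posterior where the exact answer is known, so the sampler can be checked. A minimal random-walk Metropolis-Hastings sketch (vastly simpler than a DSGE application, and not the book's code):

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy model: y_i ~ N(mu, 1) with a N(0, 10^2) prior on mu.
data = rng.normal(2.0, 1.0, 50)

def log_post(mu):
    """Log posterior kernel: Gaussian likelihood plus Gaussian prior."""
    return -0.5 * np.sum((data - mu) ** 2) - 0.5 * mu ** 2 / 100.0

# Random-walk Metropolis-Hastings: propose, then accept with prob min(1, ratio).
draws, mu, lp = [], 0.0, log_post(0.0)
for _ in range(20000):
    prop = mu + rng.normal(0, 0.5)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        mu, lp = prop, lp_prop
    draws.append(mu)

posterior_mean = np.mean(draws[2000:])   # discard burn-in
print(posterior_mean)
```

For a linearized DSGE model the scalar log posterior above is replaced by a likelihood evaluated with the Kalman filter, and for nonlinear models with a particle filter, which is exactly the progression the book follows.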
This important new dictionary - the first of its kind, now available in paperback - presents an accessible source of reference on the main concepts and techniques in econometrics. Featuring entries on all the major areas in theoretical econometrics, the dictionary will be used by students, both undergraduate and postgraduate, to aid their understanding of the subject. Arranged in alphabetical order, each entry is a short essay designed to present the essential points of a particular concept or technique and offer a concise guide to other relevant literature. Written in an accessible and discursive style, the book adopts non-technical language to make the topics accessible to those who need to know more about applied econometrics and the underlying econometric theory. It will be widely welcomed as an indispensable supplement to the standard textbook literature and will be particularly well suited to students following modular courses. An essential source of reference for both undergraduate and postgraduate students, the dictionary will also be useful for professional economists seeking to keep abreast of the latest developments in econometrics.
Optimal Transport Methods in Economics is the first textbook on the subject written especially for students and researchers in economics. Optimal transport theory is used widely to solve problems in mathematics and some areas of the sciences, but it can also be used to understand a range of problems in applied economics, such as the matching between job seekers and jobs, the determinants of real estate prices, and the formation of matrimonial unions. This is the first text to develop clear applications of optimal transport to economic modeling, statistics, and econometrics. It covers the basic results of the theory as well as their relations to linear programming, network flow problems, convex analysis, and computational geometry. Emphasizing computational methods, it also includes programming examples that provide details on implementation. Applications include discrete choice models, models of differential demand, and quantile-based statistical estimation methods, as well as asset pricing models. Authoritative and accessible, Optimal Transport Methods in Economics also features numerous exercises throughout that help you develop your mathematical agility, deepen your computational skills, and strengthen your economic intuition. * The first introduction to the subject written especially for economists * Includes programming examples * Features numerous exercises throughout * Ideal for students and researchers alike
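Discrete optimal transport is itself a linear program, which is one of the relations the book develops. A minimal sketch on an invented matching problem (not an example from the book): move mass from two "workers" to three "jobs" at minimum total cost.

```python
import numpy as np
from scipy.optimize import linprog

supply = np.array([0.5, 0.5])          # mass at each source
demand = np.array([0.4, 0.3, 0.3])     # mass required at each sink
cost = np.array([[1.0, 2.0, 3.0],      # cost[i, j]: moving a unit from i to j
                 [3.0, 1.0, 1.0]])

m, n = cost.shape
# Equality constraints on the transport plan: row sums = supply, column sums = demand.
A_eq = np.zeros((m + n, m * n))
for i in range(m):
    A_eq[i, i * n:(i + 1) * n] = 1.0   # row-sum constraint for source i
for j in range(n):
    A_eq[m + j, j::n] = 1.0            # column-sum constraint for sink j
b_eq = np.r_[supply, demand]

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
plan = res.x.reshape(m, n)             # optimal transport plan
print(plan, res.fun)
```

The optimal value here is 1.1: the first worker serves the first job cheaply, the second worker the other two, and the leftover mass takes the cheapest remaining route.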
Leonid Hurwicz (1917-2008) was a major figure in modern theoretical economics whose contributions over sixty-five years spanned at least five areas: econometrics, nonlinear programming, decision theory, microeconomic theory, and mechanism design. In 2007, at age ninety, he received the Nobel Memorial Prize in Economics (shared with Eric Maskin and Roger Myerson) for pioneering the field of mechanism design and incentive compatibility. Hurwicz made seminal contributions in the other areas as well. In nonlinear programming, he contributed to the understanding of Lagrange-Kuhn-Tucker problems (along with co-authors Kenneth Arrow and Hirofumi Uzawa). In econometrics, the Hurwicz bias in the least-squares analysis of time series is a fundamental and commonly cited benchmark. In decision theory, the Hurwicz criterion for decision-making under ambiguity is routinely invoked, sometimes without a citation since his original paper was never published. In microeconomic theory, Hurwicz (along with Arrow and H.D. Block) initiated the study of the stability of the market mechanism, and (with Uzawa) solved the classic integrability of demand problem, a core result in neoclassical consumer theory. While some of Hurwicz's works were published in journals, many remain scattered as chapters in books which are difficult to access; yet others were never published at all. The Collected Papers of Leonid Hurwicz is the first volume in a series of four that will bring his oeuvre together in one place, to bring to light the totality of his intellectual output, to document his contribution to economics and the extent of his legacy, and to make it easily available for future generations of researchers to build upon.
This book makes indicators more accessible, in terms of what they are, who created them and how they are used. It examines the subjectivity and human frailty behind these quintessentially 'hard' and technical measures of the world. To achieve this goal, The Rise and Rise of Indicators presents the world in terms of a selected set of indicators. The emphasis is upon the origins of the indicators and the motivation behind their creation and evolution. The ideas and assumptions behind the indicators are made transparent to demonstrate how changes to them can dramatically alter the ranking of countries that emerge. They are, after all, human constructs and thus embody human biases. The book concludes by examining the future of indicators and the author sets out some possible trajectories, including the growing emphasis on indicators as important tools in the Sustainable Development Goals that have been set for the world up until 2030. This is a valuable resource for undergraduate and postgraduate students in the areas of economics, sociology, geography, environmental studies, development studies, area studies, business studies, politics and international relations.
Continuous-Time Models in Corporate Finance synthesizes four decades of research to show how stochastic calculus can be used in corporate finance. Combining mathematical rigor with economic intuition, Santiago Moreno-Bromberg and Jean-Charles Rochet analyze corporate decisions such as dividend distribution, the issuance of securities, and capital structure and default. They pay particular attention to financial intermediaries, including banks and insurance companies. The authors begin by recalling the ways that option-pricing techniques can be employed for the pricing of corporate debt and equity. They then present the dynamic model of the trade-off between taxes and bankruptcy costs and derive implications for optimal capital structure. The core chapter introduces the workhorse liquidity-management model--where liquidity and risk management decisions are made in order to minimize the costs of external finance. This model is used to study corporate finance decisions and specific features of banks and insurance companies. The book concludes by presenting the dynamic agency model, where financial frictions stem from the lack of interest alignment between a firm's manager and its financiers. The appendix contains an overview of the main mathematical tools used throughout the book. Requiring some familiarity with stochastic calculus methods, Continuous-Time Models in Corporate Finance will be useful for students, researchers, and professionals who want to develop dynamic models of firms' financial decisions.
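The option-pricing view of corporate debt and equity that opens that book goes back to Merton: a firm's equity is a European call option on its assets with strike equal to the face value of debt. A minimal Black-Scholes sketch with invented numbers (not the book's model or code):

```python
from math import log, sqrt, exp
from statistics import NormalDist

N = NormalDist().cdf  # standard normal CDF

def equity_value(V, D, r, sigma, T):
    """Black-Scholes call on firm assets: value V, debt face value D,
    risk-free rate r, asset volatility sigma, debt maturity T."""
    d1 = (log(V / D) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return V * N(d1) - D * exp(-r * T) * N(d2)

E = equity_value(V=100.0, D=80.0, r=0.05, sigma=0.2, T=1.0)
debt = 100.0 - E   # by the balance-sheet identity, debt value = assets - equity
print(E, debt)
```

Because debt is worth less than its discounted face value whenever default is possible, the gap prices the firm's credit risk, the starting point for the book's structural models.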
The ongoing Greek crisis has been the subject of immense scholarly interest and debate since it erupted in 2009. Vast amounts of research from a number of disciplines have attempted to explain the causes of the crisis, with a great variety of approaches adopted in doing so. Unfortunately, there has been little effort to develop a comprehensive cross-disciplinary framework for understanding how the crisis came about. This study has bridged the divide by developing such a cross-disciplinary conceptual model for the causes of the Greek crisis. The literature review process revealed that studies from the political science, public administration, economics, financial economics and monetary economics disciplines contained a range of explanations for the occurrence of the Greek crisis. Qualitative content analysis techniques were used to synthesise the findings from these five fields into a cross-disciplinary conceptual model. By integrating the findings from the five disciplines above, a number of new insights were generated. Firstly, it was found that the crisis manifested primarily as a collapse of confidence in the ability of the Greek state to pay its debts. Secondly, that high sovereign debt levels, internal political opposition to reform, a deterioration of competitiveness of the Greek economy, the existence of destructive political institutions and the possibility of an exit from the European Monetary Union acted as key causes (amongst others) for the collapse of confidence in Greek sovereign bonds. Finally, a number of implications for policy makers in Greece and elsewhere were found and elaborated upon.
This book presents the principles and methods for the practical analysis and prediction of economic and financial time series. It covers decomposition methods, autocorrelation methods for univariate time series, volatility and duration modeling for financial time series, and multivariate time series methods, such as cointegration and recursive state space modeling. It also includes numerous practical examples to demonstrate the theory using real-world data, as well as exercises at the end of each chapter to aid understanding. This book serves as a reference text for researchers, students and practitioners interested in time series, and can also be used for university courses on econometrics or computational finance.
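The autocorrelation methods such a book covers can be previewed with the sample autocorrelation function of a simulated AR(1) series, whose theoretical autocorrelation at lag k is phi**k. A minimal sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(5)
n, phi = 5000, 0.8
# Simulate an AR(1) process: y_t = phi * y_{t-1} + e_t.
e = rng.normal(0, 1, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t]

def acf(series, k):
    """Sample autocorrelation at lag k."""
    s = series - series.mean()
    return np.dot(s[:-k], s[k:]) / np.dot(s, s)

# The estimates decay geometrically, tracking phi**k = 0.8, 0.64, 0.512.
print([round(acf(y, k), 3) for k in (1, 2, 3)])
```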