This volume of Advances in Econometrics contains articles that examine key topics in the modeling and estimation of dynamic stochastic general equilibrium (DSGE) models. Because DSGE models combine micro- and macroeconomic theory with formal econometric modeling and inference, over the past decade they have become an established framework for analyzing a variety of issues in empirical macroeconomics. The research articles make contributions in several key areas in DSGE modeling and estimation. In particular, papers cover the modeling and role of expectations, the study of optimal monetary policy in two-country models, and the problem of non-invertibility. Other interesting areas of inquiry include the analysis of parameter identification in new open economy macroeconomic models and the modeling of trend inflation shocks. The second part of the volume is devoted to articles that offer innovations in econometric methodology. These papers advance new techniques for addressing major inferential problems and include discussion and applications of Laplace-type, frequency domain, empirical likelihood and method of moments estimators.
To fully function in today's global real estate industry, students and professionals increasingly need to understand how to implement essential and cutting-edge quantitative techniques. This book presents an easy-to-read guide to applying quantitative analysis in real estate aimed at non-cognate undergraduate and masters students, and meets the requirements of modern professional practice. Through case studies and examples illustrating applications using data sourced from dedicated real estate information providers and major firms in the industry, the book provides an introduction to the foundations underlying statistical data analysis, common data manipulations and understanding descriptive statistics, before gradually building up to more advanced quantitative analysis, modelling and forecasting of real estate markets. Our examples and case studies within the chapters have been specifically compiled for this book and explicitly designed to help the reader acquire a better understanding of the quantitative methods addressed in each chapter. Our objective is to equip readers with the skills needed to confidently carry out their own quantitative analysis and be able to interpret empirical results from academic work and practitioner studies in the field of real estate and in other asset classes. Both undergraduate and masters level students, as well as real estate analysts in the professions, will find this book to be essential reading.
Bayesian econometric methods have enjoyed an increase in popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. This handbook is a single source for researchers and policymakers wanting to learn about Bayesian methods in specialized fields, and for graduate students seeking to make the final step from textbook learning to the research frontier. It contains contributions by leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing. It reviews the state of the art in Bayesian econometric methodology, with chapters on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools used by Bayesian time series econometricians such as state space models and particle filtering. It also includes chapters on Bayesian principles and methodology.
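The posterior simulation and Markov chain Monte Carlo methods surveyed in such a handbook can be illustrated with a minimal random-walk Metropolis sampler. This is only a sketch: the normal likelihood, the prior settings, and the data below are invented for illustration and are not drawn from the book.

```python
import math
import random

def log_posterior(mu, data, prior_mean=0.0, prior_sd=10.0, sigma=1.0):
    """Log posterior (up to a constant) for the mean of a normal
    likelihood with known sigma and a normal prior on mu."""
    log_prior = -0.5 * ((mu - prior_mean) / prior_sd) ** 2
    log_lik = -0.5 * sum(((x - mu) / sigma) ** 2 for x in data)
    return log_prior + log_lik

def metropolis(data, n_draws=5000, step=0.5, seed=0):
    """Random-walk Metropolis sampler for the posterior of mu."""
    rng = random.Random(seed)
    mu = 0.0
    draws = []
    for _ in range(n_draws):
        proposal = mu + rng.gauss(0.0, step)
        log_ratio = log_posterior(proposal, data) - log_posterior(mu, data)
        # Accept with probability min(1, exp(log_ratio)).
        if log_ratio >= 0 or math.log(rng.random()) < log_ratio:
            mu = proposal
        draws.append(mu)
    return draws

data = [1.2, 0.8, 1.1, 0.9, 1.0, 1.3, 0.7, 1.05]
draws = metropolis(data)
# Discard burn-in draws; with this diffuse prior the posterior mean
# lands close to the sample mean (about 1.0).
posterior_mean = sum(draws[1000:]) / len(draws[1000:])
```

With a diffuse prior the sampler essentially recovers the sample mean; tightening `prior_sd` pulls the posterior toward `prior_mean`, which is the basic Bayesian shrinkage mechanism such chapters build on.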
The 'Advances in Econometrics' series aims to publish annual original scholarly econometrics papers on designated topics with the intention of expanding the use of developed and emerging econometric techniques by disseminating ideas on the theory and practice of econometrics throughout the empirical economic, business and social science literature.
The 30th volume of Advances in Econometrics is in honor of the two individuals whose hard work has helped ensure thirty successful years of the series, Thomas Fomby and R. Carter Hill. The volume begins with a history of the Advances series by Asli Ogunc and Randall Campbell summarizing the prior volumes. Tom Fomby and Carter Hill both provide discussions of the role of Advances over the years. The remaining articles include contributions by a number of authors who have played key roles in the series and in the careers of Fomby and Hill. Overall, this leads to a more diverse mix of papers than a typical volume of Advances in Econometrics.
Now in its fourth edition, this landmark text provides a fresh, accessible and well-written introduction to the subject. With a rigorous pedagogical framework, which sets it apart from comparable texts, the latest edition features an expanded website providing numerous real-life data sets and examples.
Volume 27 of "Advances in Econometrics", entitled "Missing Data Methods", contains 16 chapters authored by specialists in the field, covering topics such as: Missing-Data Imputation in Nonstationary Panel Data Models; Markov Switching Models in Empirical Finance; Bayesian Analysis of Multivariate Sample Selection Models Using Gaussian Copulas; Consistent Estimation and Orthogonality; and Likelihood-Based Estimators for Endogenous or Truncated Samples in Standard Stratified Sampling.
A number of clubs in professional sports leagues exhibit winning streaks over a number of consecutive seasons that do not conform to the standard economic model of a professional sports league developed by El Hodiri and Quirk (1994) and Fort and Quirk (1995). These clubs seem to display what we call "unsustainable runs," defined as a period of two to four seasons in which the club acquires expensive talent and attempts to win a league championship despite not having the market size to sustain such a competitive position in the long run. The standard model predicts that clubs located in large economic markets will tend to acquire more talent and achieve more success on the field and at the box office than clubs located in small markets. This book builds a model that can allow for unsustainable runs yet retain most of the features of the standard model, and then subjects it to empirical verification. The new model developed in the book has as its central feature the ability to generate two equilibria for a club under certain conditions. In the empirical sections of the book, we use time-series analysis to test for the presence of unsustainable runs using historical data from the National Football League (NFL), the National Basketball Association (NBA), the National Hockey League (NHL), and Major League Baseball (MLB). The multiple-equilibria model retains all of the features of the standard model of a professional sports league, which is almost universally accepted by economists, yet it offers a much richer approach by including an exploration of the effects of revenues earned at the league level (television, apparel, naming rights, etc.) that are then shared by all of the member clubs, making this book unique and of great interest to scholars in a variety of fields in economics.
The individual risks faced by banks, insurers, and marketers are less well understood than aggregate risks such as market-price changes. But the risks incurred or carried by individual people, companies, insurance policies, or credit agreements can be just as devastating as macroevents such as share-price fluctuations. A comprehensive introduction, The Econometrics of Individual Risk is the first book to provide a complete econometric methodology for quantifying and managing this underappreciated but important variety of risk. The book presents a course in the econometric theory of individual risk illustrated by empirical examples. And, unlike other texts, it is focused entirely on solving the actual individual risk problems businesses confront today. Christian Gourieroux and Joann Jasiak emphasize the microeconometric aspect of risk analysis by extensively discussing practical problems such as retail credit scoring, credit card transaction dynamics, and profit maximization in promotional mailing. They address regulatory issues in sections on computing the minimum capital reserve for coverage of potential losses, and on the credit-risk measure CreditVar. The book will interest graduate students in economics, business, finance, and actuarial studies, as well as actuaries and financial analysts.
This is a thorough exploration of the models and methods of financial econometrics by one of the world's leading financial econometricians and is for students in economics, finance, statistics, mathematics, and engineering who are interested in financial applications. Based on courses taught around the world, the up-to-date content covers developments in econometrics and finance over the last twenty years while ensuring a solid grounding in the fundamental principles of the field. Care has been taken to link theory and application to provide real-world context for students. Worked exercises and empirical examples have also been included to make sure complicated concepts are solidly explained and understood.
Factor models have become the most successful tool in the analysis and forecasting of high-dimensional time series. This monograph provides an extensive account of the so-called General Dynamic Factor Model methods. The topics covered include: asymptotic representation problems, estimation, forecasting, identification of the number of factors, identification of structural shocks, volatility analysis, and applications to macroeconomic and financial data.
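As a rough illustration of the idea behind factor extraction, the loadings on a single common factor can be estimated as the dominant eigenvector of the sample covariance matrix via power iteration. This is a simplified, static principal-component sketch, not the frequency-domain machinery of the General Dynamic Factor Model itself, and the toy series below are invented.

```python
def covariance_matrix(series):
    """Sample covariance matrix of a list of equal-length series."""
    n = len(series[0])
    demeaned = [[x - sum(s) / n for x in s] for s in series]
    k = len(series)
    return [[sum(demeaned[i][t] * demeaned[j][t] for t in range(n)) / (n - 1)
             for j in range(k)] for i in range(k)]

def leading_factor_loadings(cov, iters=200):
    """Power iteration: dominant eigenvector of the covariance matrix,
    i.e. the loadings on the first (static) principal-component factor."""
    k = len(cov)
    v = [1.0] * k
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(k)) for i in range(k)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Three toy series driven almost entirely by one common component,
# with loadings roughly 1.0, 0.8, and 1.2.
common = [1.0, -0.5, 0.3, 1.2, -0.8, 0.1, 0.9, -1.1]
noise = [0.02, -0.01, 0.0, 0.01, -0.02, 0.01, 0.0, -0.01]
series = [
    [c + e for c, e in zip(common, noise)],
    [0.8 * c for c in common],
    [1.2 * c + 0.05 for c in common],
]
loadings = leading_factor_loadings(covariance_matrix(series))
```

The recovered loadings are proportional to the true ones (eigenvectors are only identified up to scale and sign), which is why factor models require the identification conditions the monograph discusses.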
This book develops a machine-learning framework for predicting economic growth. It can also be considered a primer for using machine learning (also known as data mining or data analytics) to answer economic questions. While machine learning itself is not a new idea, advances in computing technology, combined with a dawning realization of its applicability to economic questions, make it a new tool for economists.
Principles of Econometrics, 4th Edition, is an introductory book on economics and finance designed to provide an understanding of why econometrics is necessary and a working knowledge of basic econometric tools. This latest edition is updated to reflect the current state of economic and financial markets and provides new content on Kernel Density Fitting and Analysis of Treatment Effects. It offers new end-of-chapter questions and problems in each chapter, an updated comprehensive Glossary of Terms, and a summary of Probability and Statistics. The text applies basic econometric tools to modeling, estimation, inference, and forecasting through real-world problems, and critically evaluates the results and conclusions of others who use basic econometric tools. Furthermore, it provides a foundation and understanding for further study of econometrics and more advanced techniques.
This book provides the tools and concepts necessary to study the behavior of econometric estimators and test statistics in large samples. An econometric estimator is a solution to an optimization problem; that is, a problem that requires a body of techniques to determine, within a defined set of possible alternatives, the specific solution that best satisfies a selected objective function or set of constraints. Thus, this highly mathematical book investigates large-sample situations in which the assumptions of the classical linear model fail. Economists, of course, face these situations often.
Davidson and MacKinnon have written an outstanding textbook for graduates in econometrics, covering both basic and advanced topics and using geometrical proofs throughout for clarity of exposition. The book offers a unified theoretical perspective, and emphasizes the practical applications of modern theory.
Environmental risk directly affects the financial stability of banks, since they bear the financial consequences of the loss of liquidity of the entities to which they lend, as well as the financial penalties imposed for failure to comply with regulations and for actions harmful to the natural environment. This book explores the impact of environmental risk on the banking sector and analyzes strategies to mitigate this risk, with special emphasis on the role of modelling. It argues that environmental risk modelling allows banks to estimate the patterns and consequences of environmental risk on their operations, and to take measures within the context of asset and liability management to minimize the likelihood of losses. An important role here is played by the environmental risk modelling methodology, as well as by the software and the mathematical and econometric models used. The book examines banks' responses to macroprudential risk, particularly from the point of view of their adaptation strategies; the mechanisms of its spread; risk management and modelling; and sustainable business models. It introduces the basic concepts, definitions, and regulations concerning this type of risk, within the context of its influence on the banking industry. Based primarily on a combination of quantitative and qualitative approaches, the book proposes a new methodology for environmental risk management and modelling in the banking sector. As such, it will appeal to researchers, scholars, and students of environmental economics, finance and banking, sociology, law, and political science.
Central Bank Balance Sheet and Real Business Cycles argues that a deeper comprehension of changes to the central bank balance sheet can lead to more effective policymaking. Any transaction engaged in by the central bank, whether issuing currency, conducting foreign exchange operations, investing its own funds, intervening to provide emergency liquidity assistance, or carrying out monetary policy operations, influences its balance sheet. Despite this, many central banks throughout the world have largely ignored balance sheet movements and have instead focused on setting interest rates. In this book, Mustapha Abiodun Akinkunmi highlights the challenges and controversies faced by central banks, past and present, when implementing policies, and analyzes the links between these policies, the central bank balance sheet, and their consequences for economies as a whole. He argues that the composition and evolution of the central bank balance sheet provide a valuable basis for understanding the needs of an economy and are an important tool in developing strategies that would most effectively achieve policy goals. This book is an important resource for anyone interested in monetary policy or whose work is affected by the actions of central banks.
"Bayesian Econometrics" illustrates the scope and diversity of modern applications, reviews some recent advances, and highlights many desirable aspects of inference and computation. It begins with a historical overview by Arnold Zellner, who describes key contributions to the field's development and makes predictions for future directions. In the second paper, Giordani and Kohn make suggestions for improving Markov chain Monte Carlo computational strategies. The remainder of the book is organized by microeconometric and time-series modeling. Models considered include an endogenous selection ordered probit model, a censored treatment-response model, equilibrium job search models, and various others. These are used to study a variety of applications, for example dental insurance and care, educational attainment, voter opinions, the market share of various brands, and an aggregate cross-section production function. Models and topics considered also include the potential problem of improper posterior densities in a variety of dynamic models, selection and averaging for forecasting with vector autoregressions, a consumption capital-asset pricing model, and various others. Applications involve U.S. macroeconomic variables, exchange rates, an investigation of purchasing power parity, data from the London Metal Exchange, international automobile production data, and data from Asian stock markets.
Measurement in Economics: A Handbook aims to serve as a source, reference, and teaching supplement for quantitative empirical economics, inside and outside the laboratory. Covering an extensive range of fields in economics (econometrics, actuarial science, experimental economics, index theory, national accounts, and economic forecasting), it is the first book that takes measurement in economics as its central focus. It shows how different and sometimes distinct fields share the same kinds of measurement problems, and how the treatment of these problems in one field can serve as guidance in other fields. This volume provides comprehensive and up-to-date surveys of recent developments in economic measurement, written at a level intended for professional use by economists, econometricians, statisticians and social scientists.
Inequality is a charged topic. Measures of income inequality rose in the USA in the 1990s to levels not seen since 1929 and gave rise to a suspicion, not for the first time, of a link between radical inequality and financial instability, with a resulting crisis under capitalism. Professional macroeconomists have generally taken little interest in inequality because, within the parameters of traditional economic theory, the economy will stabilize itself at full employment. In addition, enlightened economists could enact stabilizing measures to manage any imbalances. The dominant voices among academic economists were unable to interpret the causal forces at work during both the Great Depression and the recent global financial crisis. In Inequality and Instability, James K. Galbraith argues that since there has been no serious work done on the macroeconomic effects of inequality, new sources of evidence are required. Galbraith offers for the first time a vast expansion of the capacity to calculate measures of inequality at both lower and higher levels of aggregation. Instead of measuring inequality as traditionally done, by country, Galbraith insists that to understand real differences that have real effects, inequality must be examined through both smaller and larger administrative units, such as sub-national levels within and between states and provinces, multinational continental economies, and the world. He points out that measuring inequality across administrative boundaries captures data on the more specific groups to which people belong. For example, in China, economic inequality reflects the difference in average income levels between city and countryside, or between coastal regions and the interior, and a simple ratio of these averages would be an indicator of trends in inequality over the country as a whole.
In a comprehensive presentation of this new method of using data, Inequality and Instability offers a look at the US economy and various global economies that was not accessible before. The result is a more sophisticated and more accurate picture of inequality around the world, and of how inequality is one of the most basic sources of economic instability.
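Galbraith's emphasis on measuring inequality between administrative units can be sketched with the between-group component of the Theil index. The grouped figures below (populations and mean incomes) are hypothetical and only echo the spirit of the city/countryside example.

```python
import math

def between_group_theil(groups):
    """Between-group component of the Theil T index.

    `groups` maps a group name (e.g. a province) to a tuple
    (population, mean_income). The between-group component is
    T_between = sum over groups of p_g * (mu_g / mu) * ln(mu_g / mu),
    where p_g is the group's population share and mu the overall mean.
    It is zero when all group means are equal.
    """
    total_pop = sum(pop for pop, _ in groups.values())
    overall_mean = sum(pop * mean for pop, mean in groups.values()) / total_pop
    t = 0.0
    for pop, mean in groups.values():
        share = pop / total_pop
        ratio = mean / overall_mean
        t += share * ratio * math.log(ratio)
    return t

# Hypothetical two-group split in the spirit of the China example.
regions = {
    "coastal cities": (300, 40000.0),       # (population, mean income)
    "interior countryside": (700, 12000.0),
}
t_between = between_group_theil(regions)
```

Because the between-group component is computed from group totals rather than individual incomes, it can be calculated from administrative data alone, which is precisely what makes this kind of decomposition usable across states, provinces, or regions.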
Designed for a one-semester course, Applied Statistics for Business and Economics offers students in business and the social sciences an effective introduction to some of the most basic and powerful techniques available for understanding their world. Numerous interesting and important examples reflect real-life situations, stimulating students to think realistically in tackling these problems. Calculations can be performed using any standard spreadsheet package. To help with the examples, the author offers both actual and hypothetical databases on his website (http://iwu.edu/~bleekley). The text explores ways to describe data and the relationships found in data. It covers basic probability tools, Bayes' theorem, sampling, estimation, and confidence intervals. The text also discusses hypothesis testing for one and two samples, contingency tables, goodness-of-fit, analysis of variance, and population variances. In addition, the author develops the concepts behind the linear relationship between two numeric variables (simple regression) as well as the potentially nonlinear relationships among more than two variables (multiple regression). The final chapter introduces classical time-series analysis and how it applies to business and economics. This text provides a practical understanding of the value of statistics in the real world. After reading the book, students will be able to summarize data in insightful ways using charts, graphs, and summary statistics, as well as make inferences from samples, especially about relationships.
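The simple-regression concept mentioned above (the linear relationship between two numeric variables) reduces to two closed-form OLS formulas; here is a minimal sketch with invented data, where the variable names are purely illustrative.

```python
def simple_regression(x, y):
    """OLS intercept and slope for the line y = a + b * x.

    Slope b = sum((x_i - mean_x)(y_i - mean_y)) / sum((x_i - mean_x)^2),
    intercept a = mean_y - b * mean_x.
    """
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    b = sxy / sxx
    a = mean_y - b * mean_x
    return a, b

# Hypothetical data: advertising spend (x) versus sales (y).
ads = [1.0, 2.0, 3.0, 4.0, 5.0]
sales = [2.1, 4.0, 6.2, 7.9, 10.1]
intercept, slope = simple_regression(ads, sales)
```

These are the same estimates a spreadsheet's SLOPE and INTERCEPT functions return, which matches the book's suggestion that the calculations can be performed in any standard spreadsheet package.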
This book presents the latest advances in the theory and practice of Marshall-Olkin distributions. These distributions have been increasingly applied in statistical practice in recent years, as they make it possible to describe interesting features of stochastic models like non-exchangeability, tail dependencies and the presence of a singular component. The book presents cutting-edge contributions in this research area, with a particular emphasis on financial and economic applications. It is recommended for researchers working in applied probability and statistics, as well as for practitioners interested in the use of stochastic models in economics. This volume collects selected contributions from the conference “Marshall-Olkin Distributions: Advances in Theory and Applications,” held in Bologna on October 2-3, 2013.
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and the household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.