This is a book on deterministic and stochastic growth theory and the computational methods needed to produce numerical solutions. Exogenous and endogenous growth models are thoroughly reviewed. Special attention is paid to the use of these models for fiscal and monetary policy analysis. Modern Business Cycle Theory, the New Keynesian Macroeconomics, and the class of Dynamic Stochastic General Equilibrium models can all be considered special cases of models of economic growth, and they can be analyzed by the theoretical and numerical procedures provided in the textbook. Analytical discussions are presented in full detail. The book is self-contained and designed so that the student advances through the theoretical and the computational issues in parallel. EXCEL and Matlab files are provided on an accompanying website (see Preface to the Second Edition) to illustrate theoretical results as well as to simulate the effects of economic policy interventions. The structure of these program files is described in "Numerical exercise" sections, where the output of the programs is also interpreted. The second edition corrects a few typographical errors and improves some notation.
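By way of illustration only (the book's own companion files are EXCEL and Matlab), here is a minimal Python sketch of the kind of numerical solution involved: iterating the deterministic Solow growth model to its steady state. All parameter values are assumptions, not taken from the book.

```python
# Minimal sketch: deterministic Solow model, k' = s*k^alpha + (1-delta)*k.
# Parameter values are illustrative assumptions, not the book's.

def simulate_solow(k0=1.0, s=0.2, alpha=0.33, delta=0.05, periods=200):
    """Iterate the capital accumulation equation and return the path of k."""
    path = [k0]
    for _ in range(periods):
        k = path[-1]
        path.append(s * k**alpha + (1 - delta) * k)
    return path

path = simulate_solow()
# Analytical steady state: k* = (s/delta)^(1/(1-alpha))
k_star = (0.2 / 0.05) ** (1 / (1 - 0.33))
print(f"simulated k_200 = {path[-1]:.4f}, analytical k* = {k_star:.4f}")
```

The simulated path converges to the analytical steady state, which is the kind of cross-check the book's numerical exercises perform.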
The recent financial crisis has heightened the need for appropriate methodologies for managing and monitoring complex risks in financial markets. The measurement, management, and regulation of risks in portfolios composed of credits, credit derivatives, or life insurance contracts is difficult because of the nonlinearities of risk models, dependencies between individual risks, and the several thousand contracts in large portfolios. The granularity principle was introduced in the Basel regulations for credit risk to solve these difficulties in computing capital reserves. In this book, authors Patrick Gagliardini and Christian Gourieroux provide the first comprehensive overview of granularity theory and illustrate its usefulness for a variety of problems related to risk analysis, statistical estimation, and derivative pricing in finance and insurance. They show how the granularity principle leads to analytical formulas for risk analysis that are simple to implement and accurate even when the portfolio size is large.
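To give a flavor of the large-portfolio idea, here is a minimal sketch of a standard one-factor credit model in the spirit of the Basel treatment (not the authors' own formulas; all parameters are assumed). For a large homogeneous portfolio, the loss quantile has a simple closed form that Monte Carlo simulation of a finite portfolio approaches as the portfolio grows.

```python
# Sketch: in a standard one-factor credit model, the 99% loss quantile of an
# infinitely granular portfolio has a closed form; compare with Monte Carlo.
import numpy as np
from scipy.stats import norm

p, rho, alpha, n = 0.02, 0.15, 0.99, 5000   # assumed PD, correlation, level, size
rng = np.random.default_rng(0)

# Monte Carlo loss distribution for a finite portfolio of n identical credits.
factor = rng.standard_normal(100_000)                       # systematic factor
cond_pd = norm.cdf((norm.ppf(p) - np.sqrt(rho) * factor) / np.sqrt(1 - rho))
losses = rng.binomial(n, cond_pd) / n                       # loss fractions

# Asymptotic (infinitely granular) loss quantile in closed form.
asymptotic = norm.cdf((norm.ppf(p) + np.sqrt(rho) * norm.ppf(alpha))
                      / np.sqrt(1 - rho))
print(np.quantile(losses, alpha), asymptotic)   # close for large n
```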
To what extent should those who produce model-based forecasts from detailed data analysis adjust their forecasts based on their own intuition? In this book, Philip Hans Franses, one of Europe's leading econometricians, presents the notion that many publicly available forecasts have experienced an 'expert's touch', and asks whether this type of intervention is useful and whether a lighter adjustment would be more beneficial. Covering an extensive research area, this accessible book brings together current theoretical insights and new empirical results to examine expert adjustment from an econometric perspective. The author's analysis is based on a range of real forecasts and the datasets upon which the forecasters relied. The various motivations behind experts' modifications are considered, and guidelines for creating more useful and reliable adjusted forecasts are suggested. This book will appeal to academics and practitioners with an interest in forecasting methodology.
The purpose of this book is to establish a connection between the traditional field of empirical economic research and the emerging area of empirical financial research, and to build a bridge between theoretical developments in these areas and their application in practice. Accordingly, it covers broad topics in the theory and application of both empirical economic and financial research, including analysis of time series and the business cycle; different forecasting methods; new models for volatility, correlation, and high-frequency financial data; and new approaches to panel regression, as well as a number of case studies. Most of the contributions reflect the state of the art on the respective subject. The book offers a valuable reference work for researchers, university instructors, practitioners, government officials and graduate and post-graduate students, as well as an important resource for advanced seminars in empirical economic and financial research.
This textbook, now in its second edition, is an introduction to econometrics from the Bayesian viewpoint. It begins with an explanation of the basic ideas of subjective probability and shows how subjective probabilities must obey the usual rules of probability to ensure coherency. It then turns to the definitions of the likelihood function, prior distributions, and posterior distributions. It explains how posterior distributions are the basis for inference and explores their basic properties. The Bernoulli distribution is used as a simple example. Various methods of specifying prior distributions are considered, with special emphasis on subject-matter considerations and exchangeability. The regression model is examined to show how analytical methods may fail in the derivation of marginal posterior distributions, which leads to an explanation of classical and Markov chain Monte Carlo (MCMC) methods of simulation. The latter is preceded by a brief introduction to Markov chains. The remainder of the book is concerned with applications of the theory to important models that are used in economics, political science, biostatistics, and other applied fields. New to the second edition is a chapter on semiparametric regression and new sections on the ordinal probit, item response, factor analysis, ARCH-GARCH, and stochastic volatility models. The new edition also emphasizes the R programming language, which has become the most widely used environment for Bayesian statistics.
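As a minimal illustration of the Bernoulli example (the book itself emphasizes R; this sketch uses Python, and the prior and data are assumptions): with a conjugate Beta prior, the posterior is available in closed form before any MCMC is needed.

```python
# Sketch of the Bernoulli example: with a Beta(a, b) prior on the success
# probability, the posterior after s successes in n trials is
# Beta(a + s, b + n - s).  Numbers below are illustrative assumptions.
from scipy.stats import beta

a, b = 1.0, 1.0        # uniform prior
s, n = 7, 10           # observed data: 7 successes in 10 trials
posterior = beta(a + s, b + n - s)
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```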
This book presents a novel approach to time series econometrics, which studies the behavior of nonlinear stochastic processes. This approach allows for an arbitrary dependence structure in the increments and generalizes the standard assumption of linear models with independent increments made in classical time series analysis. The book's solution to the problem of a general semiparametric approach is a concept called C-convolution (convolution of dependent variables) and the corresponding theory of convolution-based copulas. Intended for econometrics and statistics scholars with a special interest in time series analysis and copula functions (or other nonparametric approaches), the book is also useful for doctoral students with a basic knowledge of copula functions wanting to learn about the latest research developments in the field.
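For readers wanting the central object in symbols, one common statement of the C-convolution of two distribution functions F_X and F_Y under a copula C linking the summands (the book's notation may differ) is:

```latex
\left(F_X \overset{C}{*} F_Y\right)(s) \;=\; \int_0^1 \partial_1 C\!\left(w,\, F_Y\!\big(s - F_X^{-1}(w)\big)\right)\, dw
```

When C is the independence copula, the partial derivative reduces to its second argument and the formula collapses to the classical convolution of independent summands.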
This book discusses the problem of model choice when the statistical models are separate, also called nonnested. Chapter 1 provides an introduction, motivating examples and a general overview of the problem. Chapter 2 presents the classical or frequentist approach to the problem as well as several alternative procedures and their properties. Chapter 3 explores the Bayesian approach, the limitations of the classical Bayes factors and the proposed alternative Bayes factors to overcome these limitations. It also discusses a significance Bayesian procedure. Lastly, Chapter 4 examines the pure likelihood approach. Various real-data examples and computer simulations are provided throughout the text.
This book presents the reader with new operators and matrices that arise in the area of matrix calculus. The properties of these mathematical concepts are investigated and linked with zero-one matrices such as the commutation matrix. Elimination and duplication matrices are revisited and partitioned into submatrices. Studying the properties of these submatrices facilitates achieving new results for the original matrices themselves. Different concepts of matrix derivatives are presented and transformation principles linking these concepts are obtained. One of these concepts is used to derive new matrix calculus results, some involving the new operators and others the derivatives of the operators themselves. The last chapter contains applications of matrix calculus, including optimization, differentiation of log-likelihood functions, iterative interpretations of maximum likelihood estimators, and a Lagrangian multiplier test for endogeneity.
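For concreteness, here is a small sketch of one such zero-one matrix, the commutation matrix K_mn, which reorders vec(A) into vec(A'). The construction below is the standard one, not code from the book.

```python
# Sketch: the commutation matrix K (mn x mn) satisfies K @ vec(A) = vec(A.T)
# for any m x n matrix A, where vec() stacks columns.
import numpy as np

def commutation_matrix(m, n):
    """Zero-one matrix K with K @ vec(A) = vec(A.T) for A of shape (m, n)."""
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            K[i * n + j, j * m + i] = 1.0
    return K

A = np.arange(6.0).reshape(2, 3)
vec = lambda M: M.reshape(-1, order="F")      # column-major stacking
K = commutation_matrix(2, 3)
assert np.allclose(K @ vec(A), vec(A.T))      # defining property holds
```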
This volume systematically details both the basic principles and new developments in Data Envelopment Analysis (DEA), offering a solid understanding of the methodology, its uses, and its potential. New material in this edition includes coverage of recent developments that have greatly extended the power and scope of DEA and have led to new directions for research and DEA uses. Each chapter accompanies its developments with simple numerical examples and discussions of actual applications. The first nine chapters cover the basic principles of DEA, while the final seven chapters provide a more advanced treatment.
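As a minimal sketch of the basic DEA idea, here is the input-oriented CCR (constant returns) envelopment model solved as a linear program; the data and implementation are illustrative, not the book's.

```python
# Sketch: input-oriented CCR DEA.  For each unit k, minimize theta subject to
# X @ lam <= theta * x_k (inputs) and Y @ lam >= y_k (outputs), lam >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 3.0, 5.0]])      # 1 input,  4 units (assumed data)
Y = np.array([[1.0, 2.0, 3.0, 2.0]])      # 1 output, 4 units
n = X.shape[1]

def ccr_efficiency(k):
    """CCR efficiency score of unit k (1.0 means on the frontier)."""
    c = np.r_[1.0, np.zeros(n)]                          # minimize theta
    A_in = np.hstack([-X[:, [k]], X])                    # X@lam - theta*x_k <= 0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # -Y@lam <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

print([round(ccr_efficiency(k), 3) for k in range(n)])   # [0.5, 0.5, 1.0, 0.4]
```

With one input and one output the scores simply equal each unit's output-input ratio relative to the best ratio, which is a useful sanity check on the linear program.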
With a new author team contributing decades of practical experience, this fully updated and thoroughly classroom-tested second edition textbook prepares students and practitioners to create effective forecasting models and master the techniques of time series analysis. Taking a practical and example-driven approach, this textbook summarises the most critical decisions, techniques and steps involved in creating forecasting models for business and economics. Students are led through the process with an entirely new set of carefully developed theoretical and practical exercises. Chapters examine the key features of economic time series, univariate time series analysis, trends, seasonality, aberrant observations, conditional heteroskedasticity and ARCH models, non-linearity and multivariate time series, making this a complete practical guide. A companion website with downloadable datasets, exercises and lecture slides rounds out the full learning package.
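As a small illustration of one topic covered, conditional heteroskedasticity, here is a hedged sketch (not the book's code or data) that simulates an ARCH(1) process and shows volatility clustering.

```python
# Sketch: simulate an ARCH(1) process, sigma_t^2 = omega + alpha*eps_{t-1}^2.
# Parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
omega, alpha_1, T = 0.1, 0.6, 1000
eps = np.zeros(T)
for t in range(1, T):
    sigma2 = omega + alpha_1 * eps[t - 1] ** 2
    eps[t] = np.sqrt(sigma2) * rng.standard_normal()

# Clustering shows up as autocorrelation in squared returns, not in levels.
acf1 = lambda x: np.corrcoef(x[:-1], x[1:])[0, 1]
print("lag-1 autocorr of eps:  ", round(acf1(eps), 3))
print("lag-1 autocorr of eps^2:", round(acf1(eps ** 2), 3))
```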
This book assesses how efficient primary and upper primary education is across different states of India considering both output oriented and input oriented measures of technical efficiency. It identifies the most important factors that could produce differential efficiency among the states, including the effects of central grants, school-specific infrastructures, social indicators and policy variables, as well as state-specific factors like per-capita net-state-domestic-product from the service sector, inequality in distribution of income (Gini coefficient), the percentage of people living below the poverty line and the density of population. The study covers the period 2005-06 to 2010-11 and all the states and union territories of India, which are categorized into two separate groups, namely: (i) General Category States (GCS); and (ii) Special Category States (SCS) and Union Territories (UT). It uses non-parametric Data Envelopment Analysis (DEA) and obtains the Technology Closeness Ratio (TCR), measuring whether the maximum output producible from an input bundle by a school within a given group is as high as what could be produced if the school could choose to join the other group. The major departure of this book is its approach to estimating technical efficiency (TE), which does not use a single frontier encompassing all the states and UT, as is done in the available literature. Rather, this method assumes that GCS, SCS and UT are not homogeneous and operate under different fiscal and economic conditions.
Focusing on what actuaries need in practice, this introductory account provides readers with essential tools for handling complex problems and explains how simulation models can be created, used and re-used (with modifications) in related situations. The book begins by outlining the basic tools of modelling and simulation, including a discussion of the Monte Carlo method and its use. Part II deals with general insurance and Part III with life insurance and financial risk. Algorithms that can be implemented on any programming platform are spread throughout and a program library written in R is included. Numerous figures and experiments with R-code illustrate the text. The author's non-technical approach is ideal for graduate students, the only prerequisites being introductory courses in calculus and linear algebra, probability and statistics. The book will also be of value to actuaries and other analysts in the industry looking to update their skills.
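To give a flavor of the Monte Carlo method in a general-insurance setting, here is a minimal sketch under assumed claim-count and severity distributions (not the book's R library): aggregate claims are simulated and a solvency-style reserve is read off as a high quantile.

```python
# Sketch: aggregate claims S = sum of N Gamma severities, N ~ Poisson.
# All distributional choices and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_sim, lam = 100_000, 50                    # simulations; mean claim count
N = rng.poisson(lam, size=n_sim)            # claim counts per year
S = np.array([rng.gamma(shape=2.0, scale=5.0, size=k).sum() for k in N])

print("mean aggregate claims:", round(S.mean(), 1))   # approx lam * 10
print("99% reserve quantile: ", round(np.quantile(S, 0.99), 1))
```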
This book is a comprehensive introduction to simulation and modelling techniques and their application in the management of organisations. The book is rooted in a thorough understanding of systems theory applied to organisations and focuses on how this theory can apply to econometric models used in the management of organisations. The econometric models in this book employ linear and dynamic programming, graph theory, queuing theory, game theory, and related tools, and are presented and analysed in various fields of application, such as investment management, stock management, strategic decision making, management of production costs and the lifecycle costs of quality and non-quality products, and production quality management.
This edited book contains several state-of-the-art papers devoted to the econometrics of risk. Some papers provide theoretical analysis of the corresponding mathematical, statistical, computational, and economic models. Other papers describe applications of novel risk-related econometric techniques to real-life economic situations. The book presents recently developed methods, in particular methods using non-Gaussian heavy-tailed distributions, methods using non-Gaussian copulas to properly take into account dependence between different quantities, methods taking into account imprecise ("fuzzy") expert knowledge, and many other innovative techniques. This versatile volume helps practitioners learn how to apply new techniques of the econometrics of risk, and researchers to further improve the existing models and to come up with new ideas on how best to take into account economic risks.
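As one concrete example of the copula techniques mentioned, here is a hedged sketch (parameters and marginals are assumptions) that simulates two dependent risks through a Student-t copula, whose joint tails are heavier than the Gaussian copula's.

```python
# Sketch: sample a bivariate Student-t copula and attach assumed marginals.
import numpy as np
from scipy.stats import t as student_t, expon

rng = np.random.default_rng(7)
nu, corr, n = 4.0, 0.7, 50_000
L = np.linalg.cholesky(np.array([[1.0, corr], [corr, 1.0]]))

z = rng.standard_normal((n, 2)) @ L.T           # correlated normals
w = rng.chisquare(nu, size=(n, 1)) / nu         # chi-square mixing
u = student_t.cdf(z / np.sqrt(w), df=nu)        # t-copula samples in [0,1]^2

x = expon.ppf(u)                                # exponential loss marginals
# Joint tail dependence: both risks exceed their 99% level together far more
# often than under independence (where the probability would be 1e-4).
print(np.mean((u[:, 0] > 0.99) & (u[:, 1] > 0.99)))
```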
Economic Modeling Using Artificial Intelligence Methods examines the application of artificial intelligence methods to model economic data. Traditionally, economic data have been modeled in the linear domain, where the principles of superposition are valid. The application of artificial intelligence to economic modeling allows for flexible multi-order non-linear modeling. In addition, game theory has largely been applied in economic modeling; however, the inherent limitation of game theory when dealing with many-player games encourages the use of multi-agent systems for modeling economic phenomena. The artificial intelligence techniques used to model economic data include multi-layer perceptron neural networks, radial basis functions, support vector machines, rough sets, genetic algorithms, particle swarm optimization, simulated annealing, multi-agent systems, incremental learning, and fuzzy networks. Signal processing techniques are explored to analyze economic data; these include time domain methods, time-frequency domain methods, and fractal dimension approaches. Interesting economic problems such as causality versus correlation, simulating the stock market, modeling and controlling inflation, option pricing, modeling economic growth, and portfolio optimization are examined. The relationship between economic dependency and interstate conflict is explored, and knowledge of how economics is useful to foster peace, and vice versa, is investigated. The book deals with the issue of causality in the non-linear domain and applies automatic relevance determination, the evidence framework, the Bayesian approach, and Granger causality to understand causality and correlation. Economic Modeling Using Artificial Intelligence Methods makes an important contribution to the area of econometrics and is a valuable source of reference for graduate students, researchers, and financial practitioners.
Though globalisation of the world economy is currently a powerful force, people’s international mobility appears to still be very limited. The goal of this book is to improve our knowledge of the true effects of migration flows. It includes contributions by prominent academic researchers analysing the socio-economic impact of migration in a variety of contexts: interconnection of people and trade flows, causes and consequences of capital remittances, understanding the macroeconomic impact of migration and the labour market effects of people’s flows. The latest analytical methodologies are employed in all chapters, while interesting policy guidelines emerge from the investigations. The style of the volume makes it accessible for both non-experts and advanced readers interested in this hot topic of today’s world.
Mathematical Statistics for Economics and Business, Second Edition, provides a comprehensive introduction to the principles of mathematical statistics which underpin statistical analyses in the fields of economics, business, and econometrics. The selection of topics in this textbook is designed to provide students with a conceptual foundation that will facilitate a substantial understanding of statistical applications in these subjects. This new edition has been updated throughout and now also includes a downloadable Student Answer Manual containing detailed solutions to half of the over 300 end-of-chapter problems. After introducing the concepts of probability, random variables, and probability density functions, the author develops the key concepts of mathematical statistics, most notably: expectation, sampling, asymptotics, and the main families of distributions. The latter half of the book is then devoted to the theories of estimation and hypothesis testing with associated examples and problems that indicate their wide applicability in economics and business. Features of the new edition include: a reorganization of topic flow and presentation to facilitate reading and understanding; inclusion of additional topics of relevance to statistics and econometric applications; a more streamlined and simple-to-understand notation for multiple integration and multiple summation over general sets or vector arguments; updated examples; new end-of-chapter problems; a solution manual for students; a comprehensive answer manual for instructors; and a theorem and definition map. This book has evolved from numerous graduate courses in mathematical statistics and econometrics taught by the author, and will be ideal for students beginning graduate study as well as for advanced undergraduates.
This book investigates the relationship between environmental degradation and income, focusing on carbon dioxide (CO2) emissions from around the world, to explore the possibility of sustainable development under global warming. Although many researchers have tackled this problem by estimating the Environmental Kuznets Curve (EKC), unlike the case of sulfur dioxide emissions, there seems to be little consensus about whether an EKC is formed with regard to CO2 emissions. Thus, the EKC is one of the most controversial issues in the field of environmental economics. This book makes three rigorous contributions. First, an unbalanced panel dataset containing over 150 countries with the latest CO2 emission data between 1960 and 2010 is constructed. Second, based on this dataset, the CO2 emission-income relationship is analyzed using strict econometric methods such as the dynamic panel model. Third, as it is often pointed out that factors other than income affect CO2 emissions, several variables were added to the estimation model to examine the effects of changes in industrial structure, energy composition, and overseas trade on CO2 emissions.
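For orientation, a common reduced-form panel specification of the EKC (a generic textbook form, not necessarily the exact model estimated in this book) is:

```latex
\ln(\mathrm{CO2}_{it}) = \alpha_i + \beta_1 \ln(y_{it}) + \beta_2 \left[\ln(y_{it})\right]^2 + \gamma' z_{it} + \varepsilon_{it}
```

An inverted-U (EKC) pattern requires \(\beta_1 > 0\) and \(\beta_2 < 0\), with turning-point income \(y^* = \exp(-\beta_1 / (2\beta_2))\); the vector \(z_{it}\) collects added controls such as industrial structure, energy composition, and trade.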
Analyze key indicators more accurately to make smarter market moves. The Economic Indicator Handbook helps investors more easily evaluate economic trends, to better inform investment decision making and other key strategic financial planning. Written by a Bloomberg Senior Economist, this book presents a visual distillation of the indicators every investor should follow, with clear explanations of how they're measured, what they mean, and how they should inform investment thinking. The focus on graphics, professional application, Bloomberg terminal functionality, and practicality makes this guide a quick, actionable read that could immediately start improving investment outcomes. Coverage includes gross domestic product, employment data, industrial production, new residential construction, consumer confidence, retail and food service sales, and commodities, plus guidance on the secret indicators few economists know or care about. Past performance can predict future results if you know how to read the indicators. Modern investing requires a careful understanding of the macroeconomic forces that lift and topple markets on a regular basis, and how they shift to move entire economies. This book is a visual guide to recognizing these forces and tracking their behavior, helping investors identify entry and exit points that maximize profit and minimize loss. The book shows readers how to:
* Quickly evaluate economic trends
* Make more informed investment decisions
* Understand the most essential indicators
* Translate predictions into profitable actions
Savvy market participants know how critical certain indicators are to the formulation of a profitable, effective market strategy. A daily indicator check can inform day-to-day investing, and long-term tracking can result in a stronger, more robust portfolio. For the investor who knows that better information leads to better outcomes, The Economic Indicator Handbook is an exceptionally useful resource.
The Analytic Hierarchy Process (AHP) has been one of the foremost mathematical methods for decision making with multiple criteria; it has been widely studied in the operations research literature and applied to solve countless real-world problems. This book is meant to introduce and strengthen readers' knowledge of the AHP, no matter how familiar they may be with the topic. It provides a concise, yet self-contained, introduction to the AHP that uses a novel and more pedagogical approach. It begins with an introduction to the principles of the AHP, covering the critical points of the method as well as some of its applications. Next, the book explores further aspects of the method, including the derivation of the priority vector, the estimation of inconsistency, and the use of AHP for group decisions. Each of these is introduced by relaxing initial assumptions. Furthermore, the book covers extensions of the AHP, which are typically neglected in elementary expositions of the method. Such extensions concern different numerical representations of preferences, and interval and fuzzy representations of preferences to account for uncertainty. Throughout the exposition, an eye is kept on the most recent developments of the method.
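For concreteness, here is a minimal sketch of the classical eigenvector derivation of the priority vector and Saaty's consistency ratio; the comparison matrix below is illustrative, not an example from the book.

```python
# Sketch: the AHP priority vector is the normalized principal eigenvector of
# the pairwise-comparison matrix; inconsistency is CI = (lambda_max - n)/(n - 1).
import numpy as np

A = np.array([[1.0, 3.0, 5.0],            # illustrative reciprocal matrix
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)               # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # priority vector

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)      # consistency index
RI = 0.58                                 # Saaty's random index for n = 3
print("priorities:", np.round(w, 3), " CR:", round(CI / RI, 3))  # CR < 0.1 is acceptable
```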
Written for a broad audience, this book offers a comprehensive account of early warning systems for hydro-meteorological disasters such as floods and storms, and for geological disasters such as earthquakes. One major theme is the increasingly important role in early warning systems played by the rapidly evolving fields of space and information technology. The authors, all experts in their respective fields, offer a comprehensive and in-depth insight into the current and future perspectives for early warning systems. The text is aimed at decision-makers in the political arena, scientists, engineers, and those responsible for public communication and dissemination of warnings.
This book investigates the existence of stochastic and deterministic convergence of real output per worker and the sources of output (physical capital per worker, human capital per worker, total factor productivity (TFP), and average annual hours worked) in 21 OECD countries over the period 1970-2011. Towards this end, the authors apply a large battery of panel unit root and stationarity tests, all of which are robust to the presence of cross-sectional dependence. The analysis fails to provide clear-cut evidence of convergence dynamics either in real GDP per worker or in the series of the sources of output. Due to some limitations associated with second-generation panel unit root and stationarity tests, the authors further use the more flexible PANIC approach, which provides evidence that real GDP per worker, real physical capital per worker, human capital, and average annual hours exhibit some degree of deterministic convergence, whereas TFP series display a high degree of stochastic convergence.
This volume addresses advanced DEA methodology and techniques developed for modeling unique new performance evaluation issues. Many numerical examples, real management cases and verbal descriptions make it very valuable for researchers and practitioners.
This book presents modern developments in time series econometrics that are applied to macroeconomic and financial time series, bridging the gap between methods and realistic applications. It presents the most important approaches to the analysis of time series, which may be stationary or nonstationary. Modelling and forecasting univariate time series is the starting point. For multiple stationary time series, Granger causality tests and vector autoregressive models are presented. As the modelling of nonstationary uni- or multivariate time series is most important for real applied work, unit root and cointegration analysis as well as vector error correction models are a central topic. Tools for analysing nonstationary data are then transferred to the panel framework. Modelling the (multivariate) volatility of financial time series with autoregressive conditional heteroskedastic models is also treated.
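As a minimal illustration of one of these tools, here is a hedged sketch of a bivariate Granger causality test: a generic textbook procedure run on synthetic data, not the book's code. The test asks whether adding lags of x improves a regression of y on its own lags, via an F-test on the restriction.

```python
# Sketch: Granger causality F-test with OLS on synthetic data where x leads y.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
T, p = 500, 2                                    # sample size, lag order
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):                            # x genuinely leads y here
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def lags(z, p):
    """Columns of z lagged 1..p, aligned with z[p:]."""
    return np.column_stack([z[p - j: len(z) - j] for j in range(1, p + 1)])

Y = y[p:]
X_r = np.column_stack([np.ones(T - p), lags(y, p)])     # restricted: y lags only
X_u = np.column_stack([X_r, lags(x, p)])                # unrestricted: plus x lags
rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)

df2 = (T - p) - X_u.shape[1]
F = ((rss(X_r) - rss(X_u)) / p) / (rss(X_u) / df2)
print("F =", round(F, 2), " p-value =", stats.f.sf(F, p, df2))
```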
Students in both social and natural sciences often seek regression methods to explain the frequency of events, such as visits to a doctor, auto accidents, or new patents awarded. This book provides the most comprehensive and up-to-date account of models and methods to interpret such data. The authors have conducted research in the field for more than twenty-five years. In this book, they combine theory and practice to make sophisticated methods of analysis accessible to researchers and practitioners working with widely different types of data and software in areas such as applied statistics, econometrics, marketing, operations research, actuarial studies, demography, biostatistics, and quantitative social sciences. The book may be used as a reference work on count models or by students seeking an authoritative overview. Complementary material in the form of data sets, template programs, and bibliographic resources can be accessed on the Internet through the authors' homepages. This second edition is an expanded and updated version of the first, with new empirical examples and more than one hundred new references added. The new material includes new theoretical topics, an updated and expanded treatment of cross-section models, coverage of bootstrap-based and simulation-based inference, expanded treatment of time series, multivariate and panel data, expanded treatment of endogenous regressors, coverage of quantile count regression, and a new chapter on Bayesian methods.
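As a minimal illustration of the basic count-data model, here is a generic Poisson regression fit by Newton-Raphson on synthetic data; it is a sketch of the standard method, not the authors' examples.

```python
# Sketch: Poisson regression with log link, fit by Newton-Raphson.
# Gradient of the log-likelihood is X'(y - mu); Fisher information is X'WX.
import numpy as np

rng = np.random.default_rng(5)
n = 2000
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))            # synthetic counts

beta = np.zeros(2)
for _ in range(25):                               # Newton-Raphson iterations
    mu = np.exp(X @ beta)
    score = X.T @ (y - mu)                        # gradient
    hess = X.T @ (X * mu[:, None])                # Fisher information
    beta = beta + np.linalg.solve(hess, score)

print("estimated beta:", np.round(beta, 3))       # close to (0.5, 0.8)
```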
You may like...
* Ranked Set Sampling - 65 Years Improving… by Carlos N. Bouza-Herrera and Amer Ibrahim Falah Al-Omari (Paperback)
* Spatial Analysis Using Big Data… by Yoshiki Yamagata and Hajime Seya (Paperback, R3,021)
* Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
* The Russian Far East: An Economic… by Gregory L. Freeze, Viktor Ishaev, … (Hardcover, R3,629)
* Linear and Non-Linear Financial… by Mehmet Kenan Terzioglu and Gordana Djurovic (Hardcover, R3,581)
* Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,160)
* Fiscal Policies in High Debt Euro-Area… by Antonella Cavallo, Pietro Dallari, … (Hardcover, R3,537)