In recent years nonlinearities have gained increasing importance in economic and econometric research, particularly after the financial crisis and the economic downturn after 2007. This book contains theoretical, computational and empirical papers that incorporate nonlinearities in econometric models and apply them to real economic problems. It intends to serve as an inspiration for researchers to take potential nonlinearities into account. Researchers should be wary of spuriously applying linear model types to problems that include non-linear features. It is indispensable to use the correct model type in order to avoid biased recommendations for economic policy.
This contribution applies the cointegrated vector autoregressive (CVAR) model to analyze the long-run behavior and short-run dynamics of stock markets across five developed and three emerging economies. The main objective is to check whether liquidity conditions play an important role in stock market developments. As an innovation, liquidity conditions enter the analysis from three angles: in the form of a broad monetary aggregate, the interbank overnight rate and net capital flows, which represent the share of global liquidity that arrives in the respective country. A second aim is to understand whether central banks are able to influence the stock market.
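The following is a minimal sketch, not the authors' actual specification, of how a cointegrated VAR (VECM) and the Johansen rank test can be run in Python with statsmodels; the variable names and the simulated series are placeholders standing in for the stock-market, broad-money, interbank-rate and capital-flow data described above.

```python
# Hypothetical CVAR/VECM sketch with simulated data (not the study's data).
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(0)
T = 400
trend = np.cumsum(rng.normal(size=T))  # one common stochastic trend
data = pd.DataFrame({
    "stock_index": trend + rng.normal(scale=0.5, size=T),
    "broad_money": 0.8 * trend + rng.normal(scale=0.5, size=T),
    "overnight_rate": rng.normal(scale=0.5, size=T).cumsum(),
})

# Johansen trace test for the cointegration rank.
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1)
print("95% critical values:", jres.cvt[:, 1])

# Estimate the VECM with the rank suggested by the test (assumed 1 here).
res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print("loading matrix alpha:\n", res.alpha)       # short-run adjustment
print("cointegrating vectors beta:\n", res.beta)  # long-run relations
```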
The analysis of what might be called "dynamic nonlinearity" in time series has its roots in the pioneering work of Brillinger (1965) - who first pointed out how the bispectrum and higher order polyspectra could, in principle, be used to test for nonlinear serial dependence - and in Subba Rao and Gabr (1980) and Hinich (1982), who each showed how Brillinger's insight could be translated into a statistical test. Hinich's test, because it takes advantage of the large sample statistical properties of the bispectral estimates, became the first usable statistical test for nonlinear serial dependence. We are forever grateful to Mel Hinich for getting us involved at that time in this fascinating and fruitful endeavor. With help from Mel (sometimes as a mentor, sometimes as a collaborator) we developed and applied this bispectral test in the ensuing period. The first application of the test was to daily stock returns (Hinich and Patterson (1982, 1985)), yielding the important discovery of substantial nonlinear serial dependence in returns, over and above the weak linear serial dependence that had been previously observed. The original manuscript met with resistance from finance journals, no doubt because finance academics were reluctant to recognize the importance of distinguishing between serial correlation and nonlinear serial dependence. In Ashley, Patterson and Hinich (1986) we examined the power and size of the test in finite samples.
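As a rough illustration of the quantity these tests are built on, the sketch below estimates a sample bispectrum by averaging DFT triple products over blocks; it is a simplified estimator on simulated data, not Hinich's full test statistic.

```python
# Simplified bispectrum estimate: a (near-)zero bispectrum at all frequency
# pairs is consistent with a linear, zero-skewness process.
import numpy as np

def bispectrum(x, block_len=64):
    """Average DFT triple products over non-overlapping blocks."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n_blocks = len(x) // block_len
    acc = np.zeros((block_len, block_len), dtype=complex)
    for b in range(n_blocks):
        X = np.fft.fft(x[b * block_len:(b + 1) * block_len])
        # B(f1, f2) ~ E[X(f1) X(f2) conj(X(f1 + f2))]
        for f1 in range(block_len // 2):
            for f2 in range(f1 + 1):
                acc[f1, f2] += X[f1] * X[f2] * np.conj(X[(f1 + f2) % block_len])
    return acc / (n_blocks * block_len)

# An i.i.d. Gaussian series should show a bispectrum close to zero, whereas
# a nonlinear transformation of it generally will not.
rng = np.random.default_rng(1)
e = rng.normal(size=4096)
print(np.abs(bispectrum(e)).max())
print(np.abs(bispectrum(e + 0.5 * e**2)).max())
```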
This book provides a synthesis of some recent issues and an up-to-date treatment of some of the major important issues in distributional analysis that I have covered in my previous book Ethical Social Index Numbers, which was widely accepted by students, teachers, researchers and practitioners in the area. Wide coverage of on-going and advanced topics and their analytical, articulate and authoritative presentation make the book theoretically and methodologically quite contemporary and inclusive, and highly responsive to the practical problems of recent concern. Since many countries of the world are still characterized by high levels of income inequality, Chap. 1 analyzes the problems of income inequality measurement in detail. Poverty alleviation is an overriding goal of development and social policy. To formulate antipoverty policies, research on poverty has mostly focused on income-based indices. In view of this, a substantive analysis of income-based poverty has been presented in Chap. 2. The subject of Chap. 3 is people's perception about income inequality in terms of deprivation. Since polarization is of current concern to analysts and social decision-makers, a discussion on polarization is presented in Chap. 4.
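A short illustration, not drawn from the book, of two standard income-based measures of the kind discussed above: the Gini inequality coefficient and the Foster-Greer-Thorbecke (FGT) poverty indices. The income figures are hypothetical.

```python
# Gini coefficient and FGT poverty indices on hypothetical incomes.
import numpy as np

def gini(incomes):
    """Gini coefficient via the sorted-rank formula."""
    y = np.sort(np.asarray(incomes, dtype=float))
    n = y.size
    # G = (2 * sum_i i*y_i) / (n * sum_i y_i) - (n + 1) / n, with i = 1..n
    return 2 * np.sum(np.arange(1, n + 1) * y) / (n * y.sum()) - (n + 1) / n

def fgt(incomes, poverty_line, alpha=0):
    """FGT index: alpha=0 headcount ratio, alpha=1 poverty gap, alpha=2 severity."""
    y = np.asarray(incomes, dtype=float)
    poor = y < poverty_line
    gaps = np.where(poor, (poverty_line - y) / poverty_line, 0.0)
    if alpha == 0:
        return poor.mean()
    return np.mean(gaps ** alpha)

incomes = [12_000, 18_000, 25_000, 40_000, 90_000, 250_000]  # hypothetical data
print(gini(incomes))                  # inequality
print(fgt(incomes, 20_000, alpha=0))  # headcount ratio
print(fgt(incomes, 20_000, alpha=1))  # poverty gap
```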
Increasing concerns regarding the world's natural resources and sustainability continue to be a major issue for global development. As a result, several political initiatives and strategies for green or resource-efficient growth, both on national and international levels, have been proposed. A core element of these initiatives is the promotion of an increase in resource or material productivity. This dissertation examines material productivity developments in the OECD and BRICS countries between 1980 and 2008. By applying the concept of convergence from economic growth theory to material productivity, the analysis provides insights into both material productivity developments in general and potentials for accelerated improvements in material productivity, which may consequently allow a reduction of material use globally. The results of the convergence analysis underline the importance of policy-making with regard to technology and innovation policy enabling the production of resource-efficient products and services as well as technology transfer and diffusion.
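For concreteness, the sketch below runs the standard beta-convergence regression from growth econometrics on simulated material-productivity data; it illustrates the general approach only and is not the dissertation's actual specification.

```python
# Beta-convergence sketch: regress average growth in material productivity on
# its initial (log) level; a negative slope indicates cross-country catch-up.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_countries = 40
log_mp_1980 = rng.normal(loc=1.0, scale=0.5, size=n_countries)  # hypothetical
growth = 0.03 - 0.02 * log_mp_1980 + rng.normal(scale=0.005, size=n_countries)

X = sm.add_constant(log_mp_1980)
res = sm.OLS(growth, X).fit()
print(res.params)  # negative coefficient on the initial level -> convergence
```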
Economic theory defines and constrains admissible functional form and functional structure throughout the economy. Constraints on behavioral functions of individual economic agents and on the recursive nesting of those behavioral functions often are derived directly from economic theory. Theoretically implied constraints on the properties of equilibrium stochastic solution paths also are common, although they are less directly derived. In both cases, the restrictions on relevant function spaces have implications for econometric modeling and for the choice of hypotheses to be tested and potentially imposed. This book contains state-of-the-art cumulative research and results on functional structure, approximation, and estimation: for (1) individual economic agents, (2) aggregation over those agents, and (3) equilibrium solution stochastic processes.
B: Statistical Theory.
The interaction between mathematicians, statisticians and econometricians working in actuarial sciences and finance is producing numerous meaningful scientific results. This volume introduces new ideas, in the form of four-page papers, presented at the international conference Mathematical and Statistical Methods for Actuarial Sciences and Finance (MAF), held at Universidad Carlos III de Madrid (Spain), 4th-6th April 2018. The book covers a wide variety of subjects in actuarial science and financial fields, all discussed in the context of the cooperation between the three quantitative approaches. The topics include: actuarial models; analysis of high frequency financial data; behavioural finance; carbon and green finance; credit risk methods and models; dynamic optimization in finance; financial econometrics; forecasting of dynamical actuarial and financial phenomena; fund performance evaluation; insurance portfolio risk analysis; interest rate models; longevity risk; machine learning and soft-computing in finance; management in insurance business; models and methods for financial time series analysis, models for financial derivatives; multivariate techniques for financial markets analysis; optimization in insurance; pricing; probability in actuarial sciences, insurance and finance; real world finance; risk management; solvency analysis; sovereign risk; static and dynamic portfolio selection and management; trading systems. This book is a valuable resource for academics, PhD students, practitioners, professionals and researchers, and is also of interest to other readers with quantitative background knowledge.
In March 1998 professional colleagues and students of T.N. Srinivasan joined together at the Festschrift Conference at Yale to honor his work. The book contains nineteen of the contributions which were presented, reflecting the four closely related dimensions of trade and development.
The field of Computational Economics is a fast growing area. Due to the limitations in analytical modeling, more and more researchers apply numerical methods as a means of problem solving. In turn these quantitative results can be used to make qualitative statements. This volume of the Advanced Series in Theoretical and Applied Econometrics comprises a selected number of papers in the field of computational economics presented at the Annual Meeting of the Society for Economic Dynamics and Control held in Minneapolis, June 1990. The volume covers ten papers dealing with computational issues in Econometrics, Economics and Optimization. The first five papers in these proceedings are dedicated to numerical issues in econometric estimation. The following three papers are concerned with computational issues in model solving and optimization. The last two papers highlight some numerical techniques for solving micro models. We are sure that Computational Economics will become an important new trend in Economics in the coming decade. Hopefully this volume can be one of the first contributions highlighting this new trend. The Editors. The volume (H.M. Amman et al., eds., Computational Economics and Econometrics, (c) 1992 Kluwer Academic Publishers) opens with Part One, Econometrics, and the chapter "Likelihood Evaluation for Dynamic Latent Variables Models" by David F. Hendry (Nuffield College, Oxford, U.K.) and Jean-François Richard (ISDS, University of Pittsburgh, Pittsburgh, PA, U.S.A.).
This book explores new topics in modern research on empirical corporate finance and applied accounting, especially the econometric analysis of microdata. Dubbed "financial microeconometrics" by the author, this concept unites both methodological and applied approaches. The book examines how quantitative methods can be applied in corporate finance and accounting research in order to predict companies getting into financial distress. Presented in a clear and straightforward manner, it also suggests methods for linking corporate governance to financial performance, and discusses what the determinants of accounting disclosures are. Exploring these questions by way of numerous practical examples, this book is intended for researchers, practitioners and students who are not yet familiar with the variety of approaches available for data analysis and microeconometrics. "This book on financial microeconometrics is an excellent starting point for research in corporate finance and accounting. In my view, the text is positioned between a narrative and a scientific treatise. It is based on a vast amount of literature but is not overloaded with formulae. My appreciation of financial microeconometrics has very much increased. The book is well organized and properly written. I enjoyed reading it." Wolfgang Marty, Senior Investment Strategist, AgaNola AG
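As one concrete example of the kind of microeconometric exercise described above, the sketch below fits a simple logit model of financial distress on hypothetical balance-sheet ratios; the specification and data are illustrative assumptions, not taken from the book.

```python
# Hypothetical logit model of financial distress on simulated firm-level data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
leverage = rng.uniform(0.0, 1.0, size=n)       # hypothetical debt/assets ratio
profitability = rng.normal(0.05, 0.1, size=n)  # hypothetical return on assets
logit_index = -2.0 + 4.0 * leverage - 8.0 * profitability
distress = rng.binomial(1, 1 / (1 + np.exp(-logit_index)))

X = sm.add_constant(np.column_stack([leverage, profitability]))
res = sm.Logit(distress, X).fit(disp=False)
print(res.params)  # higher leverage and lower profitability raise distress risk
```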
This conference brought together an international group of fisheries economists from academia, business, government, and inter-governmental agencies, to consider a coordinated project to build an econometric model of the world trade in groundfish. A number of the conference participants had just spent up to six weeks at Memorial University of Newfoundland working and preparing papers on the project. This volume presents the papers that these scholars produced, plus additional papers prepared by other conference participants. In addition, various lectures and discussions which were transcribed from tapes made of the proceedings are included. The introductory essay explains the genesis of the conference, describes the approach taken to modelling the groundfish trade, very briefly summarizes the technical papers, and describes future plans. The project is continuing as planned, and a second conference was held in St. John's in August 1990. The conference was a NATO Advanced Research Workshop and we wish to thank the Scientific Affairs Division of NATO for their financial support. Additional financial support was received from the Canadian Centre for Fisheries Innovation in St. John's, the Department of Fisheries and Oceans of the Government of Canada, the Department of Fisheries of the Government of Newfoundland and Labrador, Memorial University of Newfoundland and Air Nova; we acknowledge with appreciation their help.
This book provides an introductory treatment of time series econometrics, a subject that is of key importance to both students and practitioners of economics. It contains material that any serious student of economics and finance should be acquainted with if they are seeking to gain an understanding of a real functioning economy.
Herbert Scarf is a highly esteemed and distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis on the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. All in all, however, the name of Scarf is always remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Operations Research and Management.
Palgrave Handbook of Econometrics comprises 'landmark' essays by the world's leading scholars and provides authoritative and definitive guidance in key areas of econometrics. With definitive contributions on the subject, the Handbook is an essential source of reference for professional econometricians, economists, researchers and students. Volume I covers developments in theoretical econometrics, including essays on the methodology and history of econometrics, developments in time-series and cross-section econometrics, modelling with integrated variables, Bayesian econometrics, simulation methods and a selection of special topics.
This book employs a computable general equilibrium (CGE) model - a widely used economic model which uses actual data to provide economic analysis and policy assessment - and applies it to economic data on Singapore's tourism industry. The authors set out to demonstrate how a novice modeller can acquire the necessary skills and knowledge to successfully apply general equilibrium models to tourism studies. The chapters explain how to build a computable general equilibrium model for tourism, how to conduct simulation and, most importantly, how to analyse modelling results. This applied study acts as a modelling book at both introductory and intermediate levels, specifically targeting students and researchers who are interested in and wish to learn computable general equilibrium modelling. The authors offer insightful analysis of Singapore's tourism industry and provide both students and researchers with a guide on how to apply general equilibrium models to actual economic data and draw accurate conclusions.
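To give a flavour of what a CGE model computes, the sketch below solves a deliberately tiny two-good, two-household exchange economy for its market-clearing price; it is orders of magnitude simpler than the Singapore tourism model and all numbers are hypothetical.

```python
# Toy general-equilibrium computation: find the price that clears a two-good
# Cobb-Douglas exchange economy (hypothetical shares and endowments).
import numpy as np
from scipy.optimize import brentq

shares = np.array([[0.3, 0.7],   # expenditure shares, rows = households
                   [0.6, 0.4]])
endow = np.array([[1.0, 2.0],    # endowments of goods 1 and 2
                  [3.0, 1.0]])

def excess_demand_good1(p1):
    """Excess demand for good 1, taking good 2 as numeraire (p2 = 1)."""
    p = np.array([p1, 1.0])
    income = endow @ p                      # value of each household's endowment
    demand = shares * income[:, None] / p   # Cobb-Douglas demand functions
    return demand[:, 0].sum() - endow[:, 0].sum()

# By Walras' law, clearing the market for good 1 clears good 2 as well.
p1_star = brentq(excess_demand_good1, 1e-6, 100.0)
print("equilibrium relative price of good 1:", p1_star)
```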
The book develops the capabilities arising from the cooperation between mathematicians and statisticians working in insurance and finance fields. It gathers some of the papers presented at the conference MAF2010, held in Ravello (Amalfi coast), subsequently revised and reworked for this volume after a reviewing process.
This guide is for practicing statisticians and data scientists who use IBM SPSS for statistical analysis of big data in business and finance. This is the first of a two-part guide to SPSS for Windows, introducing data entry into SPSS, along with elementary statistical and graphical methods for summarizing and presenting data. Part I also covers the rudiments of hypothesis testing and business forecasting, while Part II will present multivariate statistical methods and more advanced forecasting methods. IBM SPSS Statistics offers a powerful set of statistical and information analysis systems that run on a wide variety of personal computers. The software is built around routines that have been developed, tested, and widely used for more than 20 years. As such, IBM SPSS Statistics is extensively used in industry, commerce, banking, local and national governments, and education. Just a small subset of users of the package includes the major clearing banks, the BBC, British Gas, British Airways, British Telecom, the Consumer Association, Eurotunnel, GSK, TfL, the NHS, Shell, Unilever, and W.H.S. Although the emphasis in this guide is on applications of IBM SPSS Statistics, there is a need for users to be aware of the statistical assumptions and rationales underpinning correct and meaningful application of the techniques available in the package; therefore, such assumptions are discussed, and methods of assessing their validity are described. Also presented is the logic underlying the computation of the more commonly used test statistics in the area of hypothesis testing. Mathematical background is kept to a minimum.
Are foreign exchange markets efficient? Are fundamentals important for predicting exchange rate movements? What is the signal-to-noise ratio of high frequency exchange rate changes? Is it possible to define a measure of the equilibrium exchange rate that is useful from an assessment perspective? The book is a selective survey of current thinking on key topics in exchange rate economics, supplemented throughout by new empirical evidence. The focus is on the use of advanced econometric tools to find answers to these and other questions which are important to practitioners, policy-makers and academic economists. In addition, the book addresses more technical econometric considerations such as the importance of the choice between single-equation and system-wide approaches to modelling the exchange rate, and the reduced form versus structural equation problems. Readers will gain both a comprehensive overview of the way macroeconomists approach exchange rate modelling, and an understanding of how advanced techniques can help them explain and predict the behavior of this crucial economic variable.
This volume is centered around the issue of market design and resulting market dynamics. The economic crisis of 2007-2009 has once again highlighted the importance of a proper design of market protocols and institutional details for economic dynamics and macroeconomics. Papers in this volume capture institutional details of particular markets, behavioral details of agents' decision making, as well as spillovers between markets and effects on the macroeconomy. Computational methods are used to replicate and understand market dynamics emerging from interaction of heterogeneous agents, and to develop models that have predictive power for complex market dynamics. Finally, treatments of overlapping generations models and differential games with heterogeneous actors are provided.
This book provides a comprehensive and concrete illustration of time series analysis focusing on the state-space model, which has recently attracted increasing attention in a broad range of fields. The major feature of the book lies in its consistent Bayesian treatment regarding whole combinations of batch and sequential solutions for linear Gaussian and general state-space models: MCMC and Kalman/particle filter. The reader is given insight into flexible modeling in modern time series analysis. The main topics of the book deal with the state-space model, covering it extensively, from introductory and exploratory methods to the latest advanced topics such as real-time structural change detection. Additionally, a practical exercise using R/Stan based on real data promotes understanding and enhances the reader's analytical capability.
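The sketch below implements the Kalman filter for the simplest linear Gaussian state-space model, the local level model, on simulated data; it is a generic illustration and is independent of the book's R/Stan exercises.

```python
# Kalman filter for the local level model: x_t = x_{t-1} + w_t, y_t = x_t + v_t.
import numpy as np

def kalman_filter_local_level(y, q, r, x0=0.0, p0=1e6):
    """Return filtered state means; q, r are the state and observation noise variances."""
    x, p = x0, p0
    filtered = []
    for obs in y:
        p = p + q                    # predict step (state transition is identity)
        k = p / (p + r)              # Kalman gain
        x = x + k * (obs - x)        # update with the new observation
        p = (1 - k) * p
        filtered.append(x)
    return np.array(filtered)

# Example on simulated data with hypothetical noise variances.
rng = np.random.default_rng(3)
level = np.cumsum(rng.normal(scale=0.1, size=200))
y = level + rng.normal(scale=0.5, size=200)
xf = kalman_filter_local_level(y, q=0.01, r=0.25)
print(np.mean((xf - level) ** 2) < np.mean((y - level) ** 2))  # filtering helps
```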
Numerical analysis is the study of computation and its accuracy, stability and often its implementation on a computer. This book focuses on the principles of numerical analysis and is intended to equip those readers who use statistics to craft their own software and to understand the advantages and disadvantages of different numerical methods.
Financial econometrics is one of the greatest on-going success stories of recent decades, as it has become one of the most active areas of research in econometrics. In this book, Michael Clements presents a clear and logical explanation of the key concepts and ideas of forecasts of economic and financial variables. He shows that forecasts of the single most likely outcome of an economic and financial variable are of limited value. Forecasts that provide more information on the expected likely ranges of outcomes are more relevant. This book provides a comprehensive treatment of the evaluation of different types of forecasts and draws out the parallels between the different approaches. It describes the methods of evaluating these more complex forecasts which provide a fuller description of the range of possible future outcomes.
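As a small illustration of why density forecasts can be ranked even when point forecasts cannot, the sketch below compares the logarithmic score of two Gaussian forecast densities that share the same mean; the numbers are simulated and are not taken from the book.

```python
# Point-forecast squared error vs density-forecast log score (simulated data).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
outcomes = rng.normal(loc=0.0, scale=2.0, size=500)  # hypothetical realizations

# Two forecasters share the same point forecast (mean 0) but report
# different uncertainty around it.
mse = np.mean((outcomes - 0.0) ** 2)                 # identical for both
log_score_narrow = np.mean(norm.logpdf(outcomes, loc=0.0, scale=0.5))
log_score_correct = np.mean(norm.logpdf(outcomes, loc=0.0, scale=2.0))

print(mse)
print(log_score_narrow, log_score_correct)  # the well-calibrated density scores higher
```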
Many economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.
The book discusses the mechanisms by which securities are traded, as well as examining economic models of asymmetric information, inventory control, and cost-minimizing trading strategies.
The "Contributions to Economic Analysis" series consists of a number of previously unpublished studies. The term economic analysis is used because it covers the activities of the theoretical economist and the research worker. |
You may like...
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Introduction to Econometrics, Global… by James Stock, Mark Watson (Paperback, R2,447)
Tax Policy and Uncertainty - Modelling… by Christopher Ball, John Creedy, … (Hardcover, R2,656)
Handbook of Research Methods and… by Nigar Hashimzade, Michael A. Thornton (Hardcover, R7,916)
Linear and Non-Linear Financial… by Mehmet Kenan Terzioglu, Gordana Djurovic (Hardcover)
Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,612)