The papers collected in this volume demonstrate how different analytical approaches can be used to anticipate the economic repercussions of a systematic reduction of military spending. This volume will be of interest to economists; scholars in peace studies, international relations and related fields; and officials of national governments and international bodies dealing with disarmament issues and with economic restructuring.
This contribution applies the cointegrated vector autoregressive (CVAR) model to analyze the long-run behavior and short-run dynamics of stock markets across five developed and three emerging economies. The main objective is to check whether liquidity conditions play an important role in stock market developments. As an innovation, liquidity conditions enter the analysis from three angles: in the form of a broad monetary aggregate, the interbank overnight rate and net capital flows, which represent the share of global liquidity that arrives in the respective country. A second aim is to understand whether central banks are able to influence the stock market.
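The long-run/short-run split the description refers to can be conveyed with a minimal two-step error-correction sketch, a simpler cousin of the CVAR rather than the study's actual model; the data and the "liquidity" series below are entirely synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two cointegrated series: a random-walk "liquidity" proxy and a
# stock index that tracks it with stationary deviations (invented setup).
n = 500
liq = np.cumsum(rng.normal(size=n))      # I(1) liquidity proxy
stock = 2.0 * liq + rng.normal(size=n)   # cointegrated with liq

# Step 1 (long run): OLS of stock on liquidity estimates the cointegrating
# relation; the residual is the deviation from equilibrium.
X = np.column_stack([np.ones(n), liq])
b = np.linalg.lstsq(X, stock, rcond=None)[0]
ecm_resid = stock - X @ b

# Step 2 (short run): regress d(stock) on the lagged equilibrium error and
# d(liquidity); the error coefficient is the speed of adjustment.
d_stock = np.diff(stock)
d_liq = np.diff(liq)
Z = np.column_stack([np.ones(n - 1), ecm_resid[:-1], d_liq])
g = np.linalg.lstsq(Z, d_stock, rcond=None)[0]

print("long-run slope:", round(b[1], 2))    # close to the true value 2.0
print("adjustment speed:", round(g[1], 2))  # negative: pulls back to equilibrium
```

A full CVAR estimates the cointegrating rank and all adjustment coefficients jointly; this two-step version only illustrates the idea of separating the long-run relation from short-run dynamics.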
The analysis of what might be called "dynamic nonlinearity" in time series has its roots in the pioneering work of Brillinger (1965), who first pointed out how the bispectrum and higher-order polyspectra could, in principle, be used to test for nonlinear serial dependence, and in Subba Rao and Gabr (1980) and Hinich (1982), who each showed how Brillinger's insight could be translated into a statistical test. Hinich's test, because it takes advantage of the large-sample statistical properties of the bispectral estimates, became the first usable statistical test for nonlinear serial dependence. We are forever grateful to Mel Hinich for getting us involved at that time in this fascinating and fruitful endeavor. With help from Mel (sometimes as a mentor, sometimes as a collaborator) we developed and applied this bispectral test in the ensuing period. The first application of the test was to daily stock returns (Hinich and Patterson (1982, 1985)), yielding the important discovery of substantial nonlinear serial dependence in returns, over and above the weak linear serial dependence that had been previously observed. The original manuscript met with resistance from finance journals, no doubt because finance academics were reluctant to recognize the importance of distinguishing between serial correlation and nonlinear serial dependence. In Ashley, Patterson and Hinich (1986) we examined the power and size of the test in finite samples.
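The bispectral idea can be sketched numerically: for a linear Gaussian series the normalized bispectrum (bicoherence) is near zero, while a nonlinear series shows elevated bicoherence. This is an illustrative estimator only, not Hinich's actual test statistic; the segment count, frequency pairs, and the bilinear toy process are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_bicoherence(x, nseg=32):
    """Average squared bicoherence over a few low-frequency pairs,
    estimated by segment-averaged FFT triple products."""
    seglen = len(x) // nseg
    segs = x[:nseg * seglen].reshape(nseg, seglen)
    segs = segs - segs.mean(axis=1, keepdims=True)
    F = np.fft.rfft(segs, axis=1)
    acc = []
    for j in range(1, 6):
        for k in range(1, j + 1):
            # bispectrum estimate at (f_j, f_k), normalized Cauchy-Schwarz style
            num = np.mean(F[:, j] * F[:, k] * np.conj(F[:, j + k]))
            den = np.sqrt(np.mean(np.abs(F[:, j] * F[:, k]) ** 2)
                          * np.mean(np.abs(F[:, j + k]) ** 2))
            acc.append(np.abs(num / den) ** 2)
    return float(np.mean(acc))

n = 32 * 128
gauss = rng.normal(size=n)           # linear Gaussian: low bicoherence
e = rng.normal(size=n)
nonlin = e[1:] * e[:-1] + e[1:]      # bilinear-type nonlinear dependence

print(mean_bicoherence(gauss), mean_bicoherence(nonlin))
```

For the Gaussian series the estimate hovers near its small-sample bias, while the bilinear series, whose third-order cumulants are nonzero, produces a clearly larger value; a formal test compares such estimates against their asymptotic null distribution.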
This book provides a synthesis of some recent issues and an up-to-date treatment of some of the major issues in distributional analysis that I covered in my previous book Ethical Social Index Numbers, which was widely accepted by students, teachers, researchers and practitioners in the area. Wide coverage of ongoing and advanced topics and their analytical, articulate and authoritative presentation make the book theoretically and methodologically quite contemporary and inclusive, and highly responsive to the practical problems of recent concern. Since many countries of the world are still characterized by high levels of income inequality, Chap. 1 analyzes the problems of income inequality measurement in detail. Poverty alleviation is an overriding goal of development and social policy. To formulate antipoverty policies, research on poverty has mostly focused on income-based indices. In view of this, a substantive analysis of income-based poverty is presented in Chap. 2. The subject of Chap. 3 is people's perception of income inequality in terms of deprivation. Since polarization is of current concern to analysts and social decision-makers, a discussion on polarization is presented in Chap. 4.
Increasing concerns regarding the world's natural resources and sustainability continue to be a major issue for global development. As a result, several political initiatives and strategies for green or resource-efficient growth, on both national and international levels, have been proposed. A core element of these initiatives is the promotion of an increase in resource or material productivity. This dissertation examines material productivity developments in the OECD and BRICS countries between 1980 and 2008. By applying the concept of convergence, stemming from economic growth theory, to material productivity, the analysis provides insights into both aspects: material productivity developments in general, as well as potentials for accelerated improvements in material productivity, which consequently may allow a reduction of material use globally. The results of the convergence analysis underline the importance of policy-making with regard to technology and innovation policy enabling the production of resource-efficient products and services as well as technology transfer and diffusion.
Economic theory defines and constrains admissible functional form and functional structure throughout the economy. Constraints on behavioral functions of individual economic agents and on the recursive nesting of those behavioral functions often are derived directly from economic theory. Theoretically implied constraints on the properties of equilibrium stochastic solution paths also are common, although less directly derived. In both cases, the restrictions on relevant function spaces have implications for econometric modeling and for the choice of hypotheses to be tested and potentially imposed. This book contains state-of-the-art cumulative research and results on functional structure, approximation, and estimation: for (1) individual economic agents, (2) aggregation over those agents, and (3) equilibrium solution stochastic processes.
The interaction between mathematicians, statisticians and econometricians working in actuarial sciences and finance is producing numerous meaningful scientific results. This volume introduces new ideas, in the form of four-page papers, presented at the international conference Mathematical and Statistical Methods for Actuarial Sciences and Finance (MAF), held at Universidad Carlos III de Madrid (Spain), 4th-6th April 2018. The book covers a wide variety of subjects in actuarial science and financial fields, all discussed in the context of the cooperation between the three quantitative approaches. The topics include: actuarial models; analysis of high frequency financial data; behavioural finance; carbon and green finance; credit risk methods and models; dynamic optimization in finance; financial econometrics; forecasting of dynamical actuarial and financial phenomena; fund performance evaluation; insurance portfolio risk analysis; interest rate models; longevity risk; machine learning and soft-computing in finance; management in insurance business; models and methods for financial time series analysis; models for financial derivatives; multivariate techniques for financial markets analysis; optimization in insurance; pricing; probability in actuarial sciences, insurance and finance; real world finance; risk management; solvency analysis; sovereign risk; static and dynamic portfolio selection and management; trading systems. This book is a valuable resource for academics, PhD students, practitioners, professionals and researchers, and is also of interest to other readers with quantitative background knowledge.
In March 1998 professional colleagues and students of T.N. Srinivasan joined together at the Festschrift Conference at Yale to honor his work. The book contains nineteen of the contributions which were presented, reflecting the four closely related dimensions of trade and development.
This book provides a comprehensive and concrete illustration of time series analysis focusing on the state-space model, which has recently attracted increasing attention in a broad range of fields. The major feature of the book lies in its consistent Bayesian treatment of whole combinations of batch and sequential solutions for linear Gaussian and general state-space models: MCMC and the Kalman/particle filter. The reader is given insight into flexible modeling in modern time series analysis. The main topics of the book deal with the state-space model, covering it extensively, from introductory and exploratory methods to the latest advanced topics such as real-time structural change detection. Additionally, a practical exercise using R/Stan based on real data promotes understanding and enhances the reader's analytical capability.
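As a flavor of the sequential (Kalman filter) side of the toolkit described above, here is a minimal local-level model filter; the book itself works in R/Stan, and all parameter values below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a local-level (random-walk-plus-noise) state-space model:
#   state:       mu_t = mu_{t-1} + w_t,  w_t ~ N(0, q)
#   observation: y_t  = mu_t + v_t,      v_t ~ N(0, r)
q, r, n = 0.1, 1.0, 200
mu = np.cumsum(rng.normal(scale=np.sqrt(q), size=n))
y = mu + rng.normal(scale=np.sqrt(r), size=n)

# Kalman filter: a sequential Bayesian update of the state mean and variance.
m, p = 0.0, 10.0              # diffuse-ish prior mean and variance
filtered = np.empty(n)
for t in range(n):
    p = p + q                 # predict: variance grows by the state noise
    k = p / (p + r)           # Kalman gain
    m = m + k * (y[t] - m)    # update with the new observation
    p = (1 - k) * p
    filtered[t] = m

# The filtered estimate tracks the latent state more closely than raw data.
err_raw = np.mean((y - mu) ** 2)
err_filt = np.mean((filtered - mu) ** 2)
print(err_filt < err_raw)
```

The same model can be handled in batch form by MCMC over the full state path, which is the complementary solution strategy the book pairs with this filter.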
The field of Computational Economics is a fast-growing area. Due to the limitations of analytical modeling, more and more researchers apply numerical methods as a means of problem solving. In turn, these quantitative results can be used to make qualitative statements. This volume of the Advanced Series in Theoretical and Applied Econometrics comprises a selected number of papers in the field of computational economics presented at the Annual Meeting of the Society for Economic Dynamics and Control held in Minneapolis, June 1990. The volume covers ten papers dealing with computational issues in Econometrics, Economics and Optimization. The first five papers in these proceedings are dedicated to numerical issues in econometric estimation. The following three papers are concerned with computational issues in model solving and optimization. The last two papers highlight some numerical techniques for solving micro models. We are sure that Computational Economics will become an important new trend in Economics in the coming decade. Hopefully this volume can be one of the first contributions highlighting this new trend. The Editors. H.M. Amman et al. (eds), Computational Economics and Econometrics, vii. (c) 1992 Kluwer Academic Publishers. Part One, Econometrics: "Likelihood Evaluation for Dynamic Latent Variables Models," David F. Hendry (Nuffield College, Oxford, U.K.) and Jean-François Richard (ISDS, University of Pittsburgh, Pittsburgh, PA, U.S.A.)
Handbook of Alternative Data in Finance, Volume I motivates and challenges the reader to explore and apply Alternative Data in finance. The book provides a robust and in-depth overview of Alternative Data, including its definition, characteristics, difference from conventional data, categories of Alternative Data, Alternative Data providers, and more. The book also offers a rigorous and detailed exploration of process, application and delivery that should be practically useful to researchers and practitioners alike. Features: includes cutting-edge applications in machine learning, fintech, and more; suitable for professional quantitative analysts, and as a resource for postgraduates and researchers in financial mathematics; features chapters from many leading researchers and practitioners.
This book explores new topics in modern research on empirical corporate finance and applied accounting, especially the econometric analysis of microdata. Dubbed "financial microeconometrics" by the author, this concept unites both methodological and applied approaches. The book examines how quantitative methods can be applied in corporate finance and accounting research in order to predict whether companies are heading into financial distress. Presented in a clear and straightforward manner, it also suggests methods for linking corporate governance to financial performance, and discusses what the determinants of accounting disclosures are. Exploring these questions by way of numerous practical examples, this book is intended for researchers, practitioners and students who are not yet familiar with the variety of approaches available for data analysis and microeconometrics. "This book on financial microeconometrics is an excellent starting point for research in corporate finance and accounting. In my view, the text is positioned between a narrative and a scientific treatise. It is based on a vast amount of literature but is not overloaded with formulae. My appreciation of financial microeconometrics has very much increased. The book is well organized and properly written. I enjoyed reading it." Wolfgang Marty, Senior Investment Strategist, AgaNola AG
Do economics and statistics succeed in explaining human social behaviour? To answer this question, Leland Gerson Neuberg studies some pioneering controlled social experiments. Starting in the late 1960s, economists and statisticians sought to improve social policy formation with random assignment experiments such as those that provided income guarantees in the form of a negative income tax. This book explores anomalies in the conceptual basis of such experiments and in the foundations of statistics and economics more generally. Scientific inquiry always faces certain philosophical problems. Controlled experiments of human social behaviour, however, cannot avoid some methodological difficulties not evident in physical science experiments. Drawing upon several examples, the author argues that methodological anomalies prevent microeconomics and statistics from explaining human social behaviour as coherently as the physical sciences explain nature. He concludes that controlled social experiments are a frequently overrated tool for social policy improvement.
This conference brought together an international group of fisheries economists from academia, business, government, and inter-governmental agencies to consider a coordinated project to build an econometric model of the world trade in groundfish. A number of the conference participants had just spent up to six weeks at Memorial University of Newfoundland working and preparing papers on the project. This volume presents the papers that these scholars produced, plus additional papers prepared by other conference participants. In addition, various lectures and discussions which were transcribed from tapes made of the proceedings are included. The introductory essay explains the genesis of the conference, describes the approach taken to modelling the groundfish trade, very briefly summarizes the technical papers, and describes future plans. The project is continuing as planned, and a second conference was held in St. John's in August 1990. The conference was a NATO Advanced Research Workshop and we wish to thank the Scientific Affairs Division of NATO for their financial support. Additional financial support was received from the Canadian Centre for Fisheries Innovation in St. John's, the Department of Fisheries and Oceans of the Government of Canada, the Department of Fisheries of the Government of Newfoundland and Labrador, Memorial University of Newfoundland and Air Nova; we acknowledge with appreciation their help.
This book provides an introductory treatment of time series econometrics, a subject that is of key importance to both students and practitioners of economics. It contains material that any serious student of economics and finance should be acquainted with if they are seeking to gain an understanding of a real functioning economy.
Herbert Scarf is a highly esteemed and distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. All in all, however, the name of Scarf is always remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Operations Research and Management.
An introduction to how the mathematical tools of quantum field theory can be applied to economics and finance, providing a wide range of quantum mathematical techniques for designing financial instruments. The ideas of Lagrangians, Hamiltonians, state spaces, operators and Feynman path integrals are demonstrated to be the mathematical underpinning of quantum field theory, and are employed to formulate a comprehensive mathematical theory of asset pricing as well as of interest rates, validated by empirical evidence. Numerical algorithms and simulations are applied to the study of asset pricing models as well as of nonlinear interest rates. A range of economic and financial topics are shown to have quantum mechanical formulations, including options, coupon bonds, nonlinear interest rates, risky bonds and the microeconomic action functional. This is an invaluable resource for experts in quantitative finance and in mathematics who have no specialist knowledge of quantum field theory.
Palgrave Handbook of Econometrics comprises 'landmark' essays by the world's leading scholars and provides authoritative and definitive guidance in key areas of econometrics. With definitive contributions on the subject, the Handbook is an essential source of reference for professional econometricians, economists, researchers and students. Volume I covers developments in theoretical econometrics, including essays on the methodology and history of econometrics, developments in time-series and cross-section econometrics, modelling with integrated variables, Bayesian econometrics, simulation methods and a selection of special topics.
This book employs a computable general equilibrium (CGE) model - a widely used economic model which uses actual data to provide economic analysis and policy assessment - and applies it to economic data on Singapore's tourism industry. The authors set out to demonstrate how a novice modeller can acquire the necessary skills and knowledge to successfully apply general equilibrium models to tourism studies. The chapters explain how to build a computable general equilibrium model for tourism, how to conduct simulation and, most importantly, how to analyse modelling results. This applied study acts as a modelling book at both introductory and intermediate levels, specifically targeting students and researchers who are interested in and wish to learn computable general equilibrium modelling. The authors offer insightful analysis of Singapore's tourism industry and provide both students and researchers with a guide on how to apply general equilibrium models to actual economic data and draw accurate conclusions.
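The core mechanics of equilibrium computation can be conveyed with a toy exchange economy far smaller than any real CGE model; all preferences and endowments below are hypothetical, and the solver is plain bisection on excess demand:

```python
# Toy two-good, two-consumer exchange economy with Cobb-Douglas demands.
# Good 1 is the numeraire (p1 = 1); we search for the price p2 of good 2
# at which the market for good 1 clears (by Walras' law, so does good 2's).

def excess_demand_good1(p2):
    # Consumer A: Cobb-Douglas weight 0.6 on good 1, endowment (1, 0).
    # Consumer B: Cobb-Douglas weight 0.3 on good 1, endowment (0, 1).
    income_a = 1.0            # value of A's endowment at p1 = 1
    income_b = p2             # value of B's endowment
    demand1 = 0.6 * income_a + 0.3 * income_b
    return demand1 - 1.0      # total endowment of good 1 is 1

lo, hi = 0.01, 100.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if excess_demand_good1(mid) > 0:
        hi = mid              # good 1 over-demanded: B too rich, lower p2
    else:
        lo = mid

print(round(mid, 4))          # equilibrium relative price p2/p1 = 4/3
```

Real CGE models solve thousands of such market-clearing conditions simultaneously, calibrated to actual national accounts data, but the logic, finding prices that zero out excess demands, is the same.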
The book develops the capabilities arising from the cooperation between mathematicians and statisticians working in insurance and finance fields. It gathers some of the papers presented at the conference MAF2010, held in Ravello (Amalfi coast), which were subsequently revised, after a reviewing process, for this volume.
This guide is for practicing statisticians and data scientists who use IBM SPSS for statistical analysis of big data in business and finance. It is the first of a two-part guide to SPSS for Windows, introducing data entry into SPSS, along with elementary statistical and graphical methods for summarizing and presenting data. Part I also covers the rudiments of hypothesis testing and business forecasting, while Part II will present multivariate statistical methods and more advanced forecasting methods. IBM SPSS Statistics offers a powerful set of statistical and information analysis systems that run on a wide variety of personal computers. The software is built around routines that have been developed, tested, and widely used for more than 20 years. As such, IBM SPSS Statistics is extensively used in industry, commerce, banking, local and national governments, and education. A small subset of users of the package includes the major clearing banks, the BBC, British Gas, British Airways, British Telecom, the Consumer Association, Eurotunnel, GSK, TfL, the NHS, Shell, Unilever, and W.H.S. Although the emphasis in this guide is on applications of IBM SPSS Statistics, there is a need for users to be aware of the statistical assumptions and rationales underpinning correct and meaningful application of the techniques available in the package; therefore, such assumptions are discussed, and methods of assessing their validity are described. Also presented is the logic underlying the computation of the more commonly used test statistics in the area of hypothesis testing. Mathematical background is kept to a minimum.
Are foreign exchange markets efficient? Are fundamentals important for predicting exchange rate movements? What is the signal-to-noise ratio of high frequency exchange rate changes? Is it possible to define a measure of the equilibrium exchange rate that is useful from an assessment perspective? The book is a selective survey of current thinking on key topics in exchange rate economics, supplemented throughout by new empirical evidence. The focus is on the use of advanced econometric tools to find answers to these and other questions which are important to practitioners, policy-makers and academic economists. In addition, the book addresses more technical econometric considerations such as the importance of the choice between single-equation and system-wide approaches to modelling the exchange rate, and the reduced form versus structural equation problems. Readers will gain both a comprehensive overview of the way macroeconomists approach exchange rate modelling, and an understanding of how advanced techniques can help them explain and predict the behavior of this crucial economic variable.
The growth rate of national income has fluctuated widely in the United States since 1929. In this volume, Edward F. Denison uses the growth accounting methodology he pioneered and refined in earlier studies to track changes in the trend of output and its determinants. At every step he systematically distinguishes changes in the economy's ability to produce, as measured by his series on potential national income, from changes in the ratio of actual output to potential output. Using data for earlier years as a backdrop, Denison focuses on the dramatic decline in the growth of potential national income that started in 1974 and was further accentuated beginning in 1980, and on the pronounced decline from business cycle to business cycle in the average ratio of actual to potential output, a slide under way since 1969. The decline in growth rates has been especially pronounced in national income per person employed and other productivity measures, as growth of total output has slowed despite a sharp acceleration in growth of employment and total hours at work. Denison organizes his discussion around eight tables that divide 1929-82 into three long periods (the last, 1973-82) and seven shorter periods (the most recent, 1973-79 and 1979-82). These tables provide estimates of the sources of growth for eight output measures in each period. Denison stresses that the 1973-82 period of slow growth is unfinished. He observes no improvement in the productivity trend, only a weak cyclical recovery from a 1982 low. Sources-of-growth tables divide the contributions made to growth between "input" and "output per unit of input." Even so, it is not possible to quantify separately the contribution of all determinants, and Denison evaluates qualitatively the effects of other developments on the productivity slowdown.
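The sources-of-growth bookkeeping described above boils down to an accounting identity: output growth is split into input contributions weighted by income shares, with the residual attributed to output per unit of input. A sketch with invented numbers (the shares and growth rates below are illustrative, not Denison's estimates):

```python
# Growth-accounting identity (Solow-residual form), illustrative numbers only:
#   output growth = capital share * capital growth
#                 + labor share * labor growth
#                 + growth of output per unit of input (residual)
alpha = 0.3          # capital's share of income (assumed)
g_output = 0.025     # 2.5% output growth (hypothetical)
g_capital = 0.030    # 3.0% capital growth (hypothetical)
g_labor = 0.010      # 1.0% growth in hours worked (hypothetical)

g_inputs = alpha * g_capital + (1 - alpha) * g_labor
g_residual = g_output - g_inputs   # "output per unit of input"

print(round(g_inputs, 4))    # 0.016
print(round(g_residual, 4))  # 0.009
```

Denison's tables refine this skeleton considerably, adjusting labor input for education and composition and decomposing the residual itself, but the additive structure is the same.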
This volume is centered around the issue of market design and the resulting market dynamics. The economic crisis of 2007-2009 has once again highlighted the importance of a proper design of market protocols and institutional details for economic dynamics and macroeconomics. Papers in this volume capture institutional details of particular markets, behavioral details of agents' decision making, as well as spillovers between markets and effects on the macroeconomy. Computational methods are used to replicate and understand market dynamics emerging from the interaction of heterogeneous agents, and to develop models that have predictive power for complex market dynamics. Finally, treatments of overlapping generations models and differential games with heterogeneous actors are provided.
Numerical analysis is the study of computation and its accuracy, stability and often its implementation on a computer. This book focuses on the principles of numerical analysis and is intended to equip those readers who use statistics to craft their own software and to understand the advantages and disadvantages of different numerical methods.
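A classic instance of the accuracy and stability issues such a book treats is catastrophic cancellation, shown here with the standard quadratic-root reformulation:

```python
import math

# For x^2 - 2bx + c with b >> c, the smaller root b - sqrt(b^2 - c) suffers
# catastrophic cancellation: two nearly equal numbers are subtracted and most
# significant digits are lost. The rearranged form c / (b + sqrt(b^2 - c)) is
# algebraically identical but numerically stable.
b, c = 1e8, 1.0
s = math.sqrt(b * b - c)
naive = b - s           # cancellation: collapses to 0.0 in double precision
stable = c / (b + s)    # stays near the true root, about 5e-9

print(naive, stable)
```

The same root computed two algebraically equivalent ways differs by the entire answer, which is exactly the kind of distinction between a formula and an algorithm that numerical analysis is about.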