This book provides a synthesis of some recent issues and an up-to-date treatment of some of the major issues in distributional analysis that I covered in my previous book Ethical Social Index Numbers, which was widely accepted by students, teachers, researchers and practitioners in the area. Wide coverage of on-going and advanced topics and their analytical, articulate and authoritative presentation make the book theoretically and methodologically quite contemporary and inclusive, and highly responsive to the practical problems of recent concern. Since many countries of the world are still characterized by high levels of income inequality, Chap. 1 analyzes the problems of income inequality measurement in detail. Poverty alleviation is an overriding goal of development and social policy. To formulate antipoverty policies, research on poverty has mostly focused on income-based indices. In view of this, a substantive analysis of income-based poverty is presented in Chap. 2. The subject of Chap. 3 is people's perception of income inequality in terms of deprivation. Since polarization is of current concern to analysts and social decision-makers, a discussion of polarization is presented in Chap. 4.
Economic theory defines and constrains admissible functional form and functional structure throughout the economy. Constraints on behavioral functions of individual economic agents and on the recursive nesting of those behavioral functions often are derived directly from economic theory. Theoretically implied constraints on the properties of equilibrium stochastic solution paths also are common, although they are less directly derived. In both cases, the restrictions on relevant function spaces have implications for econometric modeling and for the choice of hypotheses to be tested and potentially imposed. This book contains state-of-the-art cumulative research and results on functional structure, approximation, and estimation: for (1) individual economic agents, (2) aggregation over those agents, and (3) equilibrium solution stochastic processes.
In March 1998, professional colleagues and students of T.N. Srinivasan joined together at the Festschrift Conference at Yale to honor his work. The book contains nineteen of the contributions which were presented, reflecting the four closely related dimensions of trade and development.
Increasing concerns regarding the world's natural resources and sustainability continue to be a major issue for global development. As a result, several political initiatives and strategies for green or resource-efficient growth, both at national and international levels, have been proposed. A core element of these initiatives is the promotion of an increase in resource or material productivity. This dissertation examines material productivity developments in the OECD and BRICS countries between 1980 and 2008. By applying the concept of convergence, stemming from economic growth theory, to material productivity, the analysis provides insights into both aspects: material productivity developments in general, as well as potentials for accelerated improvements in material productivity, which consequently may allow a reduction of material use globally. The results of the convergence analysis underline the importance of policy-making with regard to technology and innovation policy enabling the production of resource-efficient products and services, as well as technology transfer and diffusion.
The field of Computational Economics is a fast-growing area. Due to the limitations of analytical modeling, more and more researchers apply numerical methods as a means of problem solving. In turn, these quantitative results can be used to make qualitative statements. This volume of the Advanced Series in Theoretical and Applied Econometrics comprises a selected number of papers in the field of computational economics presented at the Annual Meeting of the Society for Economic Dynamics and Control held in Minneapolis, June 1990. The volume covers ten papers dealing with computational issues in Econometrics, Economics and Optimization. The first five papers in these proceedings are dedicated to numerical issues in econometric estimation. The following three papers are concerned with computational issues in model solving and optimization. The last two papers highlight some numerical techniques for solving micro models. We are sure that Computational Economics will become an important new trend in Economics in the coming decade. Hopefully this volume can be one of the first contributions highlighting this new trend. The Editors
This conference brought together an international group of fisheries economists from academia, business, government, and inter-governmental agencies to consider a coordinated project to build an econometric model of the world trade in groundfish. A number of the conference participants had just spent up to six weeks at Memorial University of Newfoundland working and preparing papers on the project. This volume presents the papers that these scholars produced, plus additional papers prepared by other conference participants. In addition, various lectures and discussions which were transcribed from tapes made of the proceedings are included. The introductory essay explains the genesis of the conference, describes the approach taken to modelling the groundfish trade, very briefly summarizes the technical papers, and describes future plans. The project is continuing as planned, and a second conference was held in St. John's in August 1990. The conference was a NATO Advanced Research Workshop and we wish to thank the Scientific Affairs Division of NATO for their financial support. Additional financial support was received from the Canadian Centre for Fisheries Innovation in St. John's, the Department of Fisheries and Oceans of the Government of Canada, the Department of Fisheries of the Government of Newfoundland and Labrador, Memorial University of Newfoundland and Air Nova; we acknowledge with appreciation their help.
The interaction between mathematicians, statisticians and econometricians working in actuarial sciences and finance is producing numerous meaningful scientific results. This volume introduces new ideas, in the form of four-page papers, presented at the international conference Mathematical and Statistical Methods for Actuarial Sciences and Finance (MAF), held at Universidad Carlos III de Madrid (Spain), 4th-6th April 2018. The book covers a wide variety of subjects in actuarial science and financial fields, all discussed in the context of the cooperation between the three quantitative approaches. The topics include: actuarial models; analysis of high frequency financial data; behavioural finance; carbon and green finance; credit risk methods and models; dynamic optimization in finance; financial econometrics; forecasting of dynamical actuarial and financial phenomena; fund performance evaluation; insurance portfolio risk analysis; interest rate models; longevity risk; machine learning and soft-computing in finance; management in insurance business; models and methods for financial time series analysis; models for financial derivatives; multivariate techniques for financial markets analysis; optimization in insurance; pricing; probability in actuarial sciences, insurance and finance; real world finance; risk management; solvency analysis; sovereign risk; static and dynamic portfolio selection and management; trading systems. This book is a valuable resource for academics, PhD students, practitioners, professionals and researchers, and is also of interest to other readers with quantitative background knowledge.
Nonlinear modelling has become increasingly important and widely used in economics. This valuable book brings together recent advances in the area including contributions covering cross-sectional studies of income distribution and discrete choice models, time series models of exchange rate dynamics and jump processes, and artificial neural network and genetic algorithm models of financial markets. Attention is given to the development of theoretical models as well as estimation and testing methods with a wide range of applications in micro and macroeconomics, labour and finance. The book provides valuable introductory material that is accessible to students and scholars interested in this exciting research area, as well as presenting the results of new and original research. Nonlinear Economic Models provides a sequel to Chaos and Nonlinear Models in Economics by the same editors.
New Perspectives in Econometric Theory comprises specially selected papers by Halbert White which reflect his research in a variety of related areas in econometrics: heteroskedasticity of unknown form; nonlinear and nonparametric regression; instrumental variables and generalized method of moments estimation; and measurability and limit theory. In many instances, results from one paper provide the foundation for, or suggest new directions for, research taken up by others in the collection. The intent of collecting these papers together in the present volume, with new commentaries by the author, is to provide access both to a modern unified perspective for econometric theory and to a set of concepts and tools that will be useful to practitioners in the field. As a companion to the first volume entitled Advances in Econometric Theory, this latest selection of Halbert White's work will appeal to academics and researchers in econometrics and economic theory.
This book provides an introductory treatment of time series econometrics, a subject that is of key importance to both students and practitioners of economics. It contains material that any serious student of economics and finance should be acquainted with if they are seeking to gain an understanding of a real functioning economy.
Palgrave Handbook of Econometrics comprises 'landmark' essays by the world's leading scholars and provides authoritative and definitive guidance in key areas of econometrics. With definitive contributions on the subject, the Handbook is an essential source of reference for professional econometricians, economists, researchers and students. Volume I covers developments in theoretical econometrics, including essays on the methodology and history of econometrics, developments in time-series and cross-section econometrics, modelling with integrated variables, Bayesian econometrics, simulation methods and a selection of special topics.
This book explores new topics in modern research on empirical corporate finance and applied accounting, especially the econometric analysis of microdata. Dubbed "financial microeconometrics" by the author, this concept unites both methodological and applied approaches. The book examines how quantitative methods can be applied in corporate finance and accounting research in order to predict whether companies will get into financial distress. Presented in a clear and straightforward manner, it also suggests methods for linking corporate governance to financial performance, and discusses the determinants of accounting disclosures. Exploring these questions by way of numerous practical examples, this book is intended for researchers, practitioners and students who are not yet familiar with the variety of approaches available for data analysis and microeconometrics. "This book on financial microeconometrics is an excellent starting point for research in corporate finance and accounting. In my view, the text is positioned between a narrative and a scientific treatise. It is based on a vast amount of literature but is not overloaded with formulae. My appreciation of financial microeconometrics has very much increased. The book is well organized and properly written. I enjoyed reading it." Wolfgang Marty, Senior Investment Strategist, AgaNola AG
The book develops the capabilities arising from the cooperation between mathematicians and statisticians working in insurance and finance. It gathers some of the papers presented at the conference MAF2010, held in Ravello (Amalfi Coast), which were subsequently revised for this volume after a reviewing process.
Are foreign exchange markets efficient? Are fundamentals important for predicting exchange rate movements? What is the signal-to-noise ratio of high frequency exchange rate changes? Is it possible to define a measure of the equilibrium exchange rate that is useful from an assessment perspective? The book is a selective survey of current thinking on key topics in exchange rate economics, supplemented throughout by new empirical evidence. The focus is on the use of advanced econometric tools to find answers to these and other questions which are important to practitioners, policy-makers and academic economists. In addition, the book addresses more technical econometric considerations such as the importance of the choice between single-equation and system-wide approaches to modelling the exchange rate, and the reduced form versus structural equation problems. Readers will gain both a comprehensive overview of the way macroeconomists approach exchange rate modelling, and an understanding of how advanced techniques can help them explain and predict the behavior of this crucial economic variable.
This book employs a computable general equilibrium (CGE) model - a widely used economic model which uses actual data to provide economic analysis and policy assessment - and applies it to economic data on Singapore's tourism industry. The authors set out to demonstrate how a novice modeller can acquire the necessary skills and knowledge to successfully apply general equilibrium models to tourism studies. The chapters explain how to build a computable general equilibrium model for tourism, how to conduct simulation and, most importantly, how to analyse modelling results. This applied study acts as a modelling book at both introductory and intermediate levels, specifically targeting students and researchers who are interested in and wish to learn computable general equilibrium modelling. The authors offer insightful analysis of Singapore's tourism industry and provide both students and researchers with a guide on how to apply general equilibrium models to actual economic data and draw accurate conclusions.
The "Contributions to Economic Analysis" series consists of a number of previously unpublished studies. The term economic analysis is used because it covers the activities of the theoretical economist and the research worker.
Financial econometrics is one of the greatest on-going success stories of recent decades, as it has become one of the most active areas of research in econometrics. In this book, Michael Clements presents a clear and logical explanation of the key concepts and ideas of forecasts of economic and financial variables. He shows that forecasts of the single most likely outcome of an economic or financial variable are of limited value. Forecasts that provide more information on the expected likely ranges of outcomes are more relevant. This book provides a comprehensive treatment of the evaluation of different types of forecasts and draws out the parallels between the different approaches. It describes the methods of evaluating these more complex forecasts which provide a fuller description of the range of possible future outcomes.
This volume is centered around the issue of market design and resulting market dynamics. The economic crisis of 2007-2009 has once again highlighted the importance of a proper design of market protocols and institutional details for economic dynamics and macroeconomics. Papers in this volume capture institutional details of particular markets, behavioral details of agents' decision making as well as spillovers between markets and effects to the macroeconomy. Computational methods are used to replicate and understand market dynamics emerging from interaction of heterogeneous agents, and to develop models that have predictive power for complex market dynamics. Finally treatments of overlapping generations models and differential games with heterogeneous actors are provided.
This guide is for practicing statisticians and data scientists who use IBM SPSS for statistical analysis of big data in business and finance. This is the first of a two-part guide to SPSS for Windows, introducing data entry into SPSS, along with elementary statistical and graphical methods for summarizing and presenting data. Part I also covers the rudiments of hypothesis testing and business forecasting, while Part II will present multivariate statistical methods and more advanced forecasting methods. IBM SPSS Statistics offers a powerful set of statistical and information analysis systems that run on a wide variety of personal computers. The software is built around routines that have been developed, tested, and widely used for more than 20 years. As such, IBM SPSS Statistics is extensively used in industry, commerce, banking, local and national governments, and education. Just a small subset of users of the package includes the major clearing banks, the BBC, British Gas, British Airways, British Telecom, the Consumer Association, Eurotunnel, GSK, TfL, the NHS, Shell, Unilever, and W.H.S. Although the emphasis in this guide is on applications of IBM SPSS Statistics, there is a need for users to be aware of the statistical assumptions and rationales underpinning correct and meaningful application of the techniques available in the package; therefore, such assumptions are discussed, and methods of assessing their validity are described. Also presented is the logic underlying the computation of the more commonly used test statistics in the area of hypothesis testing. Mathematical background is kept to a minimum.
Numerical analysis is the study of computation and its accuracy, stability and often its implementation on a computer. This book focuses on the principles of numerical analysis and is intended to equip those readers who use statistics to craft their own software and to understand the advantages and disadvantages of different numerical methods.
The book discusses the mechanisms by which securities are traded, as well as examining economic models of asymmetric information, inventory control, and cost-minimizing trading strategies.
The book provides an up-to-date survey of statistical and econometric techniques for the analysis of count data, with a focus on conditional distribution models. The book starts with a presentation of the benchmark Poisson regression model. Alternative models address unobserved heterogeneity, state dependence, selectivity, endogeneity, underreporting, and clustered sampling. Testing and estimation are discussed. Finally, applications are reviewed in various fields.
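To make the benchmark concrete, here is a minimal, illustrative sketch of a Poisson regression fitted to simulated count data. It is not taken from the book; the use of Python with statsmodels, the variable names, and the simulated coefficient values are assumptions chosen purely for illustration.

```python
# Illustrative sketch: benchmark Poisson regression on simulated count data.
# Variable names and coefficient values are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                 # one explanatory variable
mu = np.exp(0.5 + 0.8 * x)             # conditional mean: E[y|x] = exp(x'beta)
y = rng.poisson(mu)                    # count outcome

X = sm.add_constant(x)                 # add intercept column
poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(poisson_fit.summary())           # estimates should be near (0.5, 0.8)
```

With 500 simulated observations the estimated coefficients should lie close to the true values, illustrating the exponential conditional-mean specification that the alternative models described in the blurb relax.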
Many economic problems can be formulated as constrained optimization problems and as the equilibration of their solutions. Various mathematical theories have supplied economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.
[Figures from the book's introduction: Figure 1.1, a map of Great Britain at two different scale levels, (a) counties and (b) regions; Figure 1.2, two alternative aggregations of the Italian provincie into 32 larger areas; Figure 1.3, the percentage of votes for the Communist Party in the 1987 Italian political elections (a) and the percentage of population over 75 years in the 1981 Italian Census (b), in 32 polling districts, with districts above the average shaded; Figure 1.4, the first-order neighbours (a) and second-order neighbours (b) of a reference area.] While there are several other problems relating to the analysis of areal data, the problem of estimating a spatial correlogram merits special attention. The concept of the correlogram has been borrowed in the spatial literature from time series analysis. Figure 1.4a shows the first-order neighbours of a reference area, while Figure 1.4b displays the second-order neighbours of the same area. Higher-order neighbours can be defined in a similar fashion. While it is clear that the dependence is strongest between immediate neighbouring areas, a certain degree of dependence may be present among higher-order neighbours. This has been shown to be an alternative way of looking at the scale problem (Cliff and Ord, 1981, p. 123). However, unlike the case of a time series, where each observation depends only on past observations, here dependence extends in all directions.
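As an illustration of the neighbour orders described above, the following sketch builds a rook-contiguity matrix for a small regular grid and derives the first- and second-order neighbours of a reference area. This is not the book's code; the grid size, the rook-contiguity criterion, and all variable names are assumptions chosen for illustration.

```python
# Illustrative sketch: first- and second-order neighbours on a small grid
# via a rook-contiguity matrix. Grid size and names are hypothetical.
import numpy as np

nrows, ncols = 4, 4
n = nrows * ncols
W1 = np.zeros((n, n), dtype=int)       # first-order (rook) contiguity matrix

for r in range(nrows):
    for c in range(ncols):
        i = r * ncols + c
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < nrows and 0 <= cc < ncols:
                W1[i, rr * ncols + cc] = 1

# Second-order neighbours: areas reachable in two contiguity steps,
# excluding the area itself and its first-order neighbours.
W2 = (np.linalg.matrix_power(W1, 2) > 0).astype(int)
np.fill_diagonal(W2, 0)
W2[W1 == 1] = 0

ref = 5                                 # an interior reference area
print("first-order neighbours: ", np.flatnonzero(W1[ref]))
print("second-order neighbours:", np.flatnonzero(W2[ref]))
```

A spatial correlogram would then plot a spatial autocorrelation statistic (such as Moran's I) computed with W1, W2 and higher-order weight matrices against neighbour order, mirroring the time-series correlogram while allowing dependence to extend in all directions.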